Maxence Ernoult, Fabrice Normandin, et al.
ICML 2022
Patents are a valuable source of knowledge, but drafting them is a time-consuming and expensive task. Methods that assist patent generation offer a two-fold benefit: they speed up the drafting process and suggest ideas and claims to the inventor. Herein, inspired by recent advances in language modeling via multitask learning and prompt engineering, we present the Patent Generative Transformer (PGT), a transformer-based language model trained to facilitate patent drafting. Specifically, the model supports three tasks: part-of-patent generation, text infilling, and patent coherence evaluation. Taking advantage of its multitasking nature, PGT complements inventors and ensures a fast, reliable transition from their input to a coherent patent disclosure. We show that the model outperforms a collection of task-specific baselines on relevant metrics. We further assess the quality of the generated text via blind testing by subject-matter experts. Finally, we explore a zero-shot extension of the model, showing how PGT can generate domain-specific abstracts.
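The abstract describes a single causal language model driven by task-specific prompts. As a minimal sketch of how such prompt-based multitask inference could be invoked, the snippet below loads a fine-tuned checkpoint and issues one prompt per task; the checkpoint name `pgt-checkpoint` and the exact prompt formats are hypothetical illustrations, not the paper's released artifacts.

```python
# Minimal sketch of prompt-based multitask inference, assuming a GPT-2-style
# causal LM fine-tuned for the three PGT tasks. The checkpoint name and the
# prompt formats are assumptions for illustration only.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_NAME = "pgt-checkpoint"  # hypothetical; substitute the actual fine-tuned model

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

def run_task(prompt: str, max_new_tokens: int = 256) -> str:
    """Generate a continuation for a task-specific prompt."""
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        top_p=0.95,
    )
    # Decode only the newly generated tokens, not the prompt itself.
    return tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:],
        skip_special_tokens=True,
    )

# Illustrative prompts for the three tasks (formats assumed, not from the paper):
generated_claim = run_task("abstract: <patent abstract here> claim:")
infilled_text   = run_task("claim with gap: <claim text with a [MASK] span> infill:")
coherence_label = run_task("abstract: <abstract> claim: <claim> coherent:",
                           max_new_tokens=4)
```

Routing all three tasks through one model and distinguishing them only by the prompt prefix is what lets a single set of weights serve generation, infilling, and coherence evaluation.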
Gosia Lazuka, Andreea Simona Anghel, et al.
SC 2024
Natalia Martinez Gil, Dhaval Patel, et al.
UAI 2024
Shubhi Asthana, Pawan Chowdhary, et al.
KDD 2021