Workshop paper

STAR-VAE: Latent Variable Transformers for Scalable and Controllable Molecular Generation

Abstract

The chemical space of drug‑like molecules is vast, motivating generative models that learn broad chemical distributions, enable conditional generation by capturing structure–property relationships, and generate molecules rapidly. Meeting these objectives depends critically on several design choices: the probabilistic modeling approach, the conditional generative formulation, the architecture, and the molecular input representation. To address these challenges, we present STAR-VAE (SELFIES-encoded, Transformer-based, AutoRegressive Variational Autoencoder), a scalable latent‑variable framework with a Transformer encoder and an autoregressive Transformer decoder. The model is trained on 79 million drug‑like molecules from PubChem, using SELFIES to guarantee syntactic validity. The latent‑variable formulation enables theory‑consistent conditional generation: a property predictor supplies a single conditioning signal that is applied uniformly to the latent prior, the inference network, and the decoder. Our contributions are: (i) a Transformer‑based latent‑variable encoder–decoder model trained on SELFIES representations; (ii) a principled conditional latent‑variable formulation for property‑guided generation; and (iii) efficient finetuning with low‑rank adapters (LoRA) in both the encoder and decoder, enabling fast adaptation with limited property and activity data. On the GuacaMol and MOSES benchmarks, our approach matches or exceeds strong baselines, and latent‑space analyses reveal smooth, semantically structured representations that support both unconditional exploration and property‑aware generation. On the Tartarus protein–ligand benchmarks, the conditional model shifts docking‑score distributions toward stronger predicted binding and outperforms the baseline VAE on two of three targets. These results suggest that a modernized, scale‑appropriate VAE remains competitive for molecular generation when paired with principled conditioning and parameter‑efficient finetuning.
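
As a sketch of the conditional formulation summarized above, the single conditioning signal c (the property-predictor output) can enter the prior, the inference network, and the decoder of a standard conditional-VAE objective; the notation below is illustrative and not taken from the paper body:

```latex
% Illustrative conditional ELBO (assumed notation, not the paper's own):
% x = SELFIES token sequence, z = latent code, c = property conditioning signal.
\begin{aligned}
\mathcal{L}(\theta, \phi;\, x, c)
  &= \mathbb{E}_{q_\phi(z \mid x, c)}\!\left[ \log p_\theta(x \mid z, c) \right]
     - \mathrm{KL}\!\left( q_\phi(z \mid x, c) \,\middle\|\, p_\theta(z \mid c) \right),
\\[4pt]
p_\theta(x \mid z, c)
  &= \prod_{t} p_\theta\!\left( x_t \mid x_{<t},\, z,\, c \right).
\end{aligned}
```

In this sketch the reconstruction term is decoded autoregressively by the Transformer decoder, and the same conditioning signal c appears in the prior, the approximate posterior, and the decoder, which is what makes the conditioning consistent across all three components.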