Symmetry Teleportation for Accelerated Optimization
Bo Zhao, Nima Dehmamy, et al.
NeurIPS 2022
Molecule optimization is an important problem in chemical discovery and has been approached with many techniques, including generative modeling, reinforcement learning, and genetic algorithms. Recent work has also applied zeroth-order (ZO) optimization, a family of gradient-free methods that mimic gradient-based updates by estimating gradients from function queries, to optimize latent vector representations from an autoencoder. In this paper, we study the effectiveness of various ZO optimization methods for optimizing molecular objectives, which are characterized by variable smoothness, infrequent optima, and other challenges. We provide insights on the robustness of various ZO optimizers in this setting, show the advantages of ZO sign-based gradient descent (ZO-signGD), discuss how ZO optimization can be used practically in realistic discovery tasks, and demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the GuacaMol suite. Code is available at: https://github.com/IBM/QMO-bench.
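For context, below is a minimal sketch of the general ZO-signGD technique mentioned in the abstract: a gradient is estimated from black-box function queries via random-direction finite differences, and the update follows only the sign of that estimate. This is an illustrative assumption-laden example, not the QMO-bench implementation; the objective `f`, latent dimension, and hyperparameters are placeholders.

```python
# Sketch of zeroth-order sign-based gradient descent (ZO-signGD).
# Not the QMO-bench code: `f` stands in for a black-box molecular score
# evaluated on an autoencoder latent vector.
import numpy as np

def zo_grad_estimate(f, z, mu=0.01, q=10, rng=None):
    """Two-point random-direction gradient estimator averaged over q directions."""
    rng = rng or np.random.default_rng()
    d = z.shape[0]
    grad = np.zeros(d)
    for _ in range(q):
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)
        # Finite-difference estimate along direction u (two queries of f).
        grad += (f(z + mu * u) - f(z - mu * u)) / (2.0 * mu) * u
    return d * grad / q

def zo_sign_gd(f, z0, lr=0.05, steps=100, mu=0.01, q=10, seed=0):
    """Maximize a black-box objective f by following the sign of the ZO gradient estimate."""
    rng = np.random.default_rng(seed)
    z = z0.copy()
    for _ in range(steps):
        g = zo_grad_estimate(f, z, mu=mu, q=q, rng=rng)
        z += lr * np.sign(g)  # sign update: insensitive to noisy gradient magnitudes
    return z

if __name__ == "__main__":
    # Toy usage: maximize a placeholder score over a 32-dimensional latent vector.
    f = lambda z: -np.sum(z ** 2)
    z_opt = zo_sign_gd(f, np.ones(32))
    print(f(z_opt))
```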
Jihun Yun, Aurelie Lozano, et al.
NeurIPS 2021
Ben Huh, Avinash Baidya
NeurIPS 2022
Hongyu Tu, Shantam Shorewala, et al.
NeurIPS 2022