Transformers over Directed Acyclic Graphs
Yuankai Luo, Veronika Thost, et al.
NeurIPS 2023
Bilevel optimization has recently regained interest owing to its applications in emerging machine learning fields such as hyperparameter optimization, meta-learning, and reinforcement learning. Recent results have shown that simple alternating (implicit) gradient-based algorithms can achieve the same convergence rate as single-level gradient descent (GD) for bilevel problems with a strongly convex lower-level objective. However, it remains unclear whether this result can be generalized to bilevel problems beyond this basic setting. In this paper, we propose a Generalized ALternating mEthod for bilevel opTimization (GALET) with a nonconvex lower-level objective that satisfies the Polyak-Łojasiewicz (PL) condition. We first introduce a stationary metric for the considered bilevel problems, which generalizes the existing metric. We then establish that GALET achieves an ε-stationary metric for the considered problem within Õ(ε⁻¹) iterations, which matches the iteration complexity of GD for single-level smooth nonconvex problems.
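The alternating (implicit) gradient scheme that the abstract refers to can be illustrated on a toy bilevel problem. This is a minimal sketch under assumed objectives, step sizes, and iteration counts chosen for illustration; it is not the paper's GALET algorithm. Here the lower-level problem is min_y (y - x)^2 (so y*(x) = x), the upper-level objective is f(x, y) = (x - 1)^2 + (y - 1)^2, and the hypergradient uses the implicit-function-theorem term dy*/dx = -g_xy / g_yy = 1.

```python
def alternating_bilevel(outer_iters=500, inner_iters=10,
                        inner_lr=0.1, outer_lr=0.05):
    """Alternating gradient descent on a toy bilevel problem.

    Upper level: f(x, y) = (x - 1)^2 + (y - 1)^2
    Lower level: g(x, y) = (y - x)^2, whose minimizer is y*(x) = x.
    All objectives and hyperparameters are illustrative assumptions.
    """
    x, y = 0.0, 0.0
    for _ in range(outer_iters):
        # Inner loop: a few GD steps on the lower-level objective g,
        # driving y toward the lower-level minimizer y*(x) = x.
        for _ in range(inner_iters):
            y -= inner_lr * 2.0 * (y - x)
        # Outer step: implicit hypergradient of F(x) = f(x, y*(x)).
        # dF/dx = df/dx + (dy*/dx) * df/dy, with dy*/dx = 1 here.
        hypergrad = 2.0 * (x - 1.0) + 1.0 * 2.0 * (y - 1.0)
        x -= outer_lr * hypergrad
    return x, y
```

Since F(x) = f(x, y*(x)) = 2(x - 1)^2, the iterates should approach the bilevel solution x = y = 1.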
Stephen Obonyo, Isaiah Onando Mulang’, et al.
NeurIPS 2023
Daniel J. Costello Jr., Pierre R. Chevillat, et al.
ISIT 1997
Naga Ayachitula, Melissa Buco, et al.
SCC 2007