Gosia Lazuka, Andreea Simona Anghel, et al.
SC 2024
In this talk, we introduce an asynchronous decentralized accelerated stochastic gradient descent algorithm for decentralized stochastic optimization. Since communication and synchronization costs are the major bottlenecks, we attempt to reduce these costs via randomization techniques. Our major contribution is to develop a class of accelerated randomized decentralized algorithms for solving general convex composite problems. We establish O(1/ε) (resp., O(1/√ε)) communication complexity and O(1/ε²) (resp., O(1/ε)) sampling complexity for solving general convex (resp., strongly convex) problems. It is worth mentioning that our proposed algorithm depends only sublinearly on the Lipschitz constant when a smooth component is present in the objective function.
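To illustrate the decentralized stochastic optimization setting described above, the minimal sketch below pairs local stochastic gradient steps with randomized pairwise (gossip) averaging on a ring, so that only one edge communicates per round. It is not the accelerated composite algorithm from the talk and does not reproduce its complexity guarantees; the least-squares objective, step size, topology, and all names are illustrative assumptions.

```python
# Illustrative sketch only: decentralized SGD with randomized gossip averaging.
# Not the authors' accelerated composite method; problem data, step size, and
# ring topology are assumptions made for this example.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim, n_steps = 8, 5, 200
A = [rng.normal(size=(20, dim)) for _ in range(n_nodes)]            # local data per node
b = [a @ np.ones(dim) + 0.1 * rng.normal(size=20) for a in A]       # local targets
x = [np.zeros(dim) for _ in range(n_nodes)]                         # local iterates

def local_stoch_grad(i, xi):
    # Stochastic gradient of node i's local least-squares loss at xi.
    j = rng.integers(len(b[i]))
    return (A[i][j] @ xi - b[i][j]) * A[i][j]

for t in range(n_steps):
    step = 0.05 / np.sqrt(t + 1)
    # Each node takes a local stochastic gradient step in parallel.
    x = [xi - step * local_stoch_grad(i, xi) for i, xi in enumerate(x)]
    # Randomized communication: activate a single ring edge per round and
    # average only its two endpoints, reducing communication/sync cost.
    i = rng.integers(n_nodes)
    j = (i + 1) % n_nodes
    avg = 0.5 * (x[i] + x[j])
    x[i], x[j] = avg.copy(), avg.copy()

# Disagreement across nodes shrinks as gossip rounds accumulate.
print("consensus spread:", np.linalg.norm(np.std(np.stack(x), axis=0)))
```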
Natalia Martinez Gil, Dhaval Patel, et al.
UAI 2024
Shubhi Asthana, Pawan Chowdhary, et al.
KDD 2021
Pavithra Harsha, Ali Koc, et al.
INFORMS 2021