Parallel AND/OR search for marginal MAP
Radu Marinescu, Akihiro Kishimoto, et al.
AAAI 2020
We introduce a novel approach for optimizing communication efficiency in Federated Learning (FL). The approach leverages sketching techniques in two complementary strategies that exploit similarities in the data transmitted during FL training, both to identify opportunities for skipping the expensive communication of updated models in training iterations and to dynamically select subsets of clients hosting diverse models. Our extensive experimental investigation across different models, datasets, and label distributions shows that these strategies can reduce downlink and uplink communication volumes by factors of 100× or more, with minor degradation of, or even an increase in, the accuracy of the trained model. Moreover, in contrast to the baselines, these strategies can escape suboptimal descent paths and yield smooth, non-oscillatory accuracy profiles for non-IID data distributions.
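The abstract describes the skip strategy only at a high level. Below is a minimal illustrative sketch of the general idea, assuming a SimHash-style random-projection sketch of the model update; the function names, 64-bit sketch size, and bit-difference threshold are our own illustrative choices, not the paper's.

```python
# Hedged sketch (not the paper's implementation): a client compares a compact
# SimHash-style sketch of its current model update against the sketch of the
# last update it actually transmitted, and skips the uplink when they match.
import numpy as np

def simhash_sketch(vector: np.ndarray, planes: np.ndarray) -> np.ndarray:
    # Sign pattern of random projections: a compact locality-sensitive sketch.
    return (planes @ vector) >= 0.0

def should_skip_upload(new_update, last_sent_sketch, planes, max_differing_bits=4):
    # Skip the expensive upload when the new sketch is near-identical to the
    # sketch of the last transmitted update; threshold is illustrative.
    new_sketch = simhash_sketch(new_update, planes)
    if last_sent_sketch is not None and \
            np.count_nonzero(new_sketch != last_sent_sketch) <= max_differing_bits:
        return True, last_sent_sketch   # models similar enough: skip uplink
    return False, new_sketch            # transmit, and remember this sketch

# Usage: a 64-bit sketch of a 10,000-parameter update.
rng = np.random.default_rng(0)
planes = rng.standard_normal((64, 10_000))
update = rng.standard_normal(10_000)
skip, sketch = should_skip_upload(update, None, planes)
print(skip)   # False: first round, must transmit
skip2, _ = should_skip_upload(update + 1e-4, sketch, planes)
print(skip2)  # likely True: near-identical update, uplink can be skipped
```

The same comparison could in principle run server-side on downlink sketches to pick clients whose sketches differ most, i.e., clients hosting diverse models.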
Helgi I. Ingolfsson, Chris Neale, et al.
PNAS
Divyansh Jhunjhunwala, Shiqiang Wang, et al.
ICLR 2023
Romeo Kienzler, Johannes Schmude, et al.
Big Data 2023