Topological Data Analysis on Noisy Quantum Computers
Ismail Akhalwaya, Shashanka Ubaru, et al.
ICLR 2024
Ensuring group fairness in federated learning (FL) presents unique challenges due to data heterogeneity and communication constraints. We propose Kernel Fair Federated Learning (KFFL), a novel algorithmic framework that incorporates group fairness into FL models using the Kernel Hilbert-Schmidt Independence Criterion (KHSIC) as a fairness regularizer. To address scalability, KFFL approximates the KHSIC with random feature maps, significantly reducing computational and communication overhead while still achieving group fairness. To solve the resulting non-convex composite optimization problem, we propose FedProxGrad, a federated proximal gradient algorithm with convergence guarantees. Through experiments on standard benchmark datasets across both IID and non-IID settings, for regression and classification tasks, KFFL balances accuracy and fairness effectively, outperforming existing methods across the accuracy–fairness trade-off. Furthermore, we introduce KFFL-TD, a time-delayed variant that further reduces communication rounds, enhancing efficiency in decentralized environments. Code is available at github.com/Huzaifa-Arif/KFFL.
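The regularizer described above can be illustrated with a small sketch. The abstract states that KFFL approximates a kernel-based Hilbert-Schmidt Independence Criterion between model outputs and the sensitive attribute using random feature maps; the snippet below shows one common way to do this with random Fourier features for an RBF kernel. The function names (`rff`, `hsic_rff`) and all hyperparameters are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def rff(x, n_features=50, gamma=1.0, seed=0):
    # Random Fourier features approximating an RBF kernel (illustrative choice;
    # the paper may use a different kernel or feature construction).
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    w = rng.normal(scale=np.sqrt(2 * gamma), size=(x.shape[1], n_features))
    b = rng.uniform(0, 2 * np.pi, n_features)
    return np.sqrt(2.0 / n_features) * np.cos(x @ w + b)

def hsic_rff(pred, sens, n_features=50):
    # Random-feature HSIC estimate: with centered feature maps Phi_c, Psi_c,
    # HSIC ~= || (1/n) Phi_c^T Psi_c ||_F^2, avoiding the O(n^2) Gram matrices.
    phi = rff(pred, n_features, seed=0)
    psi = rff(sens, n_features, seed=1)
    phi_c = phi - phi.mean(axis=0)   # center features (empirical mean removal)
    psi_c = psi - psi.mean(axis=0)
    c = phi_c.T @ psi_c / len(phi)   # cross-covariance in feature space
    return float(np.sum(c ** 2))

# Predictions statistically dependent on the sensitive attribute should score
# higher than independent ones; the penalty would be added to the local loss.
rng = np.random.default_rng(42)
s = rng.normal(size=200)                       # sensitive attribute
dep = s + 0.1 * rng.normal(size=200)           # predictions correlated with s
ind = rng.normal(size=200)                     # predictions independent of s
print(hsic_rff(dep, s) > hsic_rff(ind, s))     # dependence yields larger HSIC
```

In a fairness-regularized objective, such an estimate would be scaled by a trade-off coefficient and added to the task loss, which is consistent with the accuracy–fairness balance the abstract describes.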