Federated Learning with L0 Constraint for Sparsity

Research Paper · Tags: Federated Learning, Sparsity, L0 Constraint, Probabilistic Gates · Analyzed: Jan 3, 2026
Published: Dec 28, 2025 20:33
1 min read
ArXiv

Analysis

This paper addresses the problem of model density and poor generalizability in Federated Learning (FL), arising from the inherent sparsity of data and models, especially under heterogeneous conditions. It proposes a novel approach that uses probabilistic gates and their continuous relaxation to enforce an L0 constraint on the number of non-zero model parameters. The method aims to achieve a target parameter density (rho), improving both communication efficiency and statistical performance in FL.
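The gating mechanism described above can be sketched as follows. This is a minimal illustration, assuming a relaxation in the spirit of hard-concrete (stretched binary concrete) gates commonly used for L0 regularization; the constants, function names, and penalty form below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Illustrative hard-concrete gate constants: stretch interval and temperature.
GAMMA, ZETA, BETA = -0.1, 1.1, 2.0 / 3.0

def sample_gates(log_alpha, rng):
    """Sample relaxed binary gates z in [0, 1], one per model parameter."""
    u = rng.uniform(1e-6, 1.0 - 1e-6, size=log_alpha.shape)
    s = 1.0 / (1.0 + np.exp(-(np.log(u) - np.log(1.0 - u) + log_alpha) / BETA))
    # Stretch to (GAMMA, ZETA) and clip, so gates can be exactly 0 or 1.
    return np.clip(s * (ZETA - GAMMA) + GAMMA, 0.0, 1.0)

def expected_density(log_alpha):
    """Differentiable estimate of the fraction of non-zero gates, E[||z||_0] / d."""
    p_open = 1.0 / (1.0 + np.exp(-(log_alpha - BETA * np.log(-GAMMA / ZETA))))
    return float(p_open.mean())

def density_penalty(log_alpha, rho):
    """Hypothetical penalty steering the expected density toward the target rho."""
    return (expected_density(log_alpha) - rho) ** 2

rng = np.random.default_rng(0)
log_alpha = np.zeros(1000)          # one learnable gate parameter per weight
z = sample_gates(log_alpha, rng)    # a sparse model would use weights w * z
print(expected_density(log_alpha), density_penalty(log_alpha, rho=0.1))
```

In an FL setting, each client would optimize its loss plus such a density term locally, and the server would aggregate both weights and gate parameters, which is how a global target density could be maintained under client heterogeneity.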
Reference / Citation
"The paper demonstrates that the target density (rho) of parameters can be achieved in FL, under data and client participation heterogeneity, with minimal loss in statistical performance."