Analysis

This paper addresses the problems of excessive model density and poor generalizability in Federated Learning (FL) that stem from the inherent sparsity of data and models, especially under heterogeneous conditions. It proposes an approach based on probabilistic gates and their continuous relaxation, which enforce an L0 constraint on the number of non-zero model parameters. The method aims to achieve a target parameter density rho, improving both communication efficiency and statistical performance in FL.
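The gating construction is not spelled out above, but the standard way to realize a continuous L0 relaxation with probabilistic gates is the hard-concrete distribution of Louizos et al. Below is a minimal PyTorch sketch assuming that formulation: each weight gets a stochastic gate z in [0, 1] that can be exactly zero, and a penalty pushes the expected fraction of non-zero gates toward the target density rho. All names here (L0Gate, GatedLinear, density_penalty) and the quadratic penalty are illustrative assumptions, not the paper's actual implementation.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hard-concrete hyperparameters: stretch interval (GAMMA, ZETA) and temperature BETA.
GAMMA, ZETA, BETA = -0.1, 1.1, 2.0 / 3.0


class L0Gate(nn.Module):
    """Per-parameter stochastic gate z in [0, 1] that can be exactly zero."""

    def __init__(self, shape):
        super().__init__()
        self.log_alpha = nn.Parameter(torch.randn(shape) * 0.01)

    def forward(self):
        if self.training:
            # Sample from the hard-concrete distribution via the reparameterization trick.
            u = torch.rand_like(self.log_alpha).clamp(1e-6, 1 - 1e-6)
            s = torch.sigmoid((u.log() - (1 - u).log() + self.log_alpha) / BETA)
        else:
            # Deterministic gate at evaluation time.
            s = torch.sigmoid(self.log_alpha)
        # Stretch to (GAMMA, ZETA) and clip to [0, 1] so gates can be exactly 0 or 1.
        return (s * (ZETA - GAMMA) + GAMMA).clamp(0.0, 1.0)

    def expected_density(self):
        # E[fraction of non-zero gates] = mean of P(z != 0) under the hard-concrete.
        return torch.sigmoid(self.log_alpha - BETA * math.log(-GAMMA / ZETA)).mean()


class GatedLinear(nn.Module):
    """Linear layer whose weights are masked by per-weight L0 gates."""

    def __init__(self, d_in, d_out):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(d_out, d_in) * 0.02)
        self.bias = nn.Parameter(torch.zeros(d_out))
        self.gate = L0Gate((d_out, d_in))

    def forward(self, x):
        return F.linear(x, self.weight * self.gate(), self.bias)


def density_penalty(layer, rho, lam=10.0):
    # Steer the expected non-zero fraction toward the target density rho.
    # A quadratic penalty is one simple choice; the paper's constraint-handling
    # mechanism (e.g. a Lagrangian formulation) may differ.
    return lam * (layer.gate.expected_density() - rho) ** 2
```

In the FL setting described above, each client would presumably add this penalty to its local training loss, and only the parameters whose gates are non-zero need to be communicated, which is where the communication savings would come from.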
Reference

The paper demonstrates that the target parameter density rho can be achieved in FL, under both data heterogeneity and client-participation heterogeneity, with minimal loss in statistical performance.