Improved Balanced Classification with Novel Loss Functions
Analysis
Key Takeaways
- Introduces two new loss function families: Generalized Logit-Adjusted (GLA) and Generalized Class-Aware weighted (GCA) losses for balanced classification; a schematic code sketch appears at the end of this section.
- Provides a comprehensive theoretical analysis of consistency for both loss families.
- Demonstrates that GCA losses offer stronger theoretical guarantees in imbalanced settings due to more favorable scaling of H-consistency bounds.
- Empirical results show that both GCA and GLA losses outperform existing methods, with GLA performing slightly better overall and GCA excelling in highly imbalanced scenarios.
“GCA losses are $H$-consistent for any hypothesis set that is bounded or complete, with $H$-consistency bounds that scale more favorably as $1/\sqrt{\mathsf p_{\min}}$, offering significantly stronger theoretical guarantees in imbalanced settings.”
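To make the two loss families more concrete, the sketch below shows minimal PyTorch implementations of a logit-adjusted and a class-aware weighted cross-entropy loss, the general ideas that GLA and GCA build on. It is illustrative only: the adjustment strength `tau`, the inverse-prior weighting exponent `power`, and the normalization are assumptions made for this sketch, not the paper's exact GLA/GCA parameterizations.

```python
# Illustrative sketch of logit-adjusted and class-aware weighted losses.
# Parameter choices (tau, power, normalization) are assumptions for the
# example, not the definitions used in the paper.
import torch
import torch.nn.functional as F


def logit_adjusted_loss(logits, targets, class_priors, tau=1.0):
    """Shift each logit by tau * log(prior) before the softmax, so that
    rare classes must be predicted with a larger margin."""
    adjusted = logits + tau * torch.log(class_priors).unsqueeze(0)
    return F.cross_entropy(adjusted, targets)


def class_weighted_loss(logits, targets, class_priors, power=1.0):
    """Weight each example's cross-entropy by an inverse-prior factor
    (prior ** -power), normalized to have mean 1 over classes."""
    weights = class_priors.pow(-power)
    weights = weights / weights.mean()
    per_example = F.cross_entropy(logits, targets, reduction="none")
    return (weights[targets] * per_example).mean()


if __name__ == "__main__":
    torch.manual_seed(0)
    num_classes, batch = 5, 8
    priors = torch.tensor([0.50, 0.25, 0.15, 0.07, 0.03])  # imbalanced class priors
    logits = torch.randn(batch, num_classes)
    targets = torch.randint(num_classes, (batch,))
    print("logit-adjusted loss:", logit_adjusted_loss(logits, targets, priors).item())
    print("class-weighted loss:", class_weighted_loss(logits, targets, priors).item())
```

In this toy setup, the first loss modifies the logits (as GLA-type losses do) while the second reweights the per-example loss by a class-dependent factor (as GCA-type losses do); the paper's generalized families subsume specific choices of these adjustments and weights.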