
Analysis

This paper addresses two core challenges of class-incremental learning: overfitting and catastrophic forgetting. It proposes SCL-PNC, a method that exploits parametric neural collapse to enable efficient model expansion and to mitigate feature drift. Its key components are a dynamic equiangular tight frame (ETF) classifier and knowledge distillation for feature consistency, which together aim to improve accuracy and efficiency in real-world settings where the class distribution evolves.
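
As a concrete illustration, the sketch below builds the simplex ETF geometry that neural-collapse-based classifiers typically fix their class prototypes to. This is the generic textbook construction, not SCL-PNC's own code; the function name simplex_etf and the PyTorch framing are assumptions.

```python
import torch

def simplex_etf(num_classes: int, feat_dim: int) -> torch.Tensor:
    """Build a (feat_dim x num_classes) simplex equiangular tight frame.

    Columns are unit-norm class prototypes whose pairwise cosine
    similarity is -1/(K-1), i.e. the neural-collapse geometry.
    Requires feat_dim >= num_classes - 1.
    """
    assert feat_dim >= num_classes - 1
    # Random partial orthogonal matrix U (feat_dim x K), U^T U = I_K.
    u, _ = torch.linalg.qr(torch.randn(feat_dim, num_classes))
    k = num_classes
    # Center the identity so prototypes sum to zero, then rescale to unit norm.
    center = torch.eye(k) - torch.ones(k, k) / k
    return (k / (k - 1)) ** 0.5 * u @ center

With L2-normalized features, logits are simply `feats @ simplex_etf(...)`, so only the feature extractor needs training while the classifier geometry stays fixed.
```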
Reference

SCL-PNC drives the incrementally expanded model toward convergence through a structured combination of the expandable backbone, the adapt-layer, and the parametric ETF classifier.
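
To make the quoted structure concrete, here is a minimal, hypothetical sketch of how the three parts might compose at inference. The paper's actual adapt-layer and expansion mechanics are not reproduced here: the adapt-layer is reduced to a single linear layer, the class name SCLPNCHead is invented, and simplex_etf refers to the sketch above.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SCLPNCHead(nn.Module):
    """Hypothetical sketch: expandable backbone -> adapt-layer -> fixed ETF."""

    def __init__(self, backbone: nn.Module, feat_dim: int, num_classes: int):
        super().__init__()
        self.backbone = backbone                     # expandable feature extractor
        self.adapt = nn.Linear(feat_dim, feat_dim)   # stand-in for the adapt-layer
        # Fixed simplex ETF prototypes, stored as a frozen buffer.
        self.register_buffer("etf", simplex_etf(num_classes, feat_dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.adapt(self.backbone(x))   # adapted features
        z = F.normalize(z, dim=-1)         # match the unit-norm prototypes
        return z @ self.etf                # cosine logits against ETF columns
```

Because the ETF prototypes are a non-trainable buffer, only the backbone and adapt-layer receive gradients, which mirrors the idea of converging the expanded model onto a fixed classifier geometry.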