Scalable Class-Incremental Learning with Parametric Neural Collapse

Research Paper · Topics: Class-Incremental Learning, Neural Collapse, Knowledge Distillation · Analyzed: Jan 4, 2026
Published: Dec 26, 2025 03:34
1 min read
ArXiv

Analysis

This paper tackles two central challenges of class-incremental learning: overfitting and catastrophic forgetting. It proposes SCL-PNC, a method that uses parametric neural collapse to enable efficient model expansion and to mitigate feature drift. Its key components are a dynamic ETF (equiangular tight frame) classifier and knowledge distillation for feature consistency, aimed at improving both performance and efficiency in real-world scenarios with evolving class distributions.
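To make the ETF classifier concrete: in neural-collapse-based methods, class prototypes are arranged as a simplex equiangular tight frame, in which every pair of distinct class vectors has the same (maximally negative) cosine similarity. The sketch below constructs that fixed geometry with numpy; the function name and dimensions are illustrative and not taken from the paper, which additionally makes the classifier parametric and expandable.

```python
import numpy as np

def simplex_etf(num_classes: int, feat_dim: int, seed: int = 0) -> np.ndarray:
    """Return a (feat_dim x num_classes) matrix whose columns form a simplex ETF.

    Illustrative construction only: M = sqrt(K/(K-1)) * U (I_K - (1/K) 1 1^T),
    where U has orthonormal columns. Columns are unit-norm class prototypes
    with pairwise cosine similarity -1/(K-1).
    """
    K = num_classes
    assert feat_dim >= K, "need feat_dim >= num_classes for this construction"
    rng = np.random.default_rng(seed)
    # Random orthonormal basis U via QR decomposition.
    A = rng.standard_normal((feat_dim, K))
    U, _ = np.linalg.qr(A)
    # Center and rescale to obtain the simplex ETF geometry.
    M = np.sqrt(K / (K - 1)) * U @ (np.eye(K) - np.ones((K, K)) / K)
    return M

K, D = 5, 16
W = simplex_etf(K, D)
# Gram matrix: ones on the diagonal, -1/(K-1) off the diagonal.
G = W.T @ W
```

Because the prototype geometry is fixed in advance, new classes can be assigned prototypes without retraining the classifier head, which is the property incremental-learning methods in this family exploit.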
Reference / Citation
"SCL-PNC induces the convergence of the incremental expansion model through a structured combination of the expandable backbone, adapt-layer, and the parametric ETF classifier."
* Cited for critical analysis under Article 32.