Scalable Class-Incremental Learning with Parametric Neural Collapse
Published: Dec 26, 2025 • ArXiv
Analysis
This paper tackles two core challenges of class-incremental learning: overfitting to new tasks and catastrophic forgetting of old ones. It proposes SCL-PNC, a method that uses parametric neural collapse to expand the model efficiently as new classes arrive while mitigating feature drift. Its key strengths are a dynamic ETF (equiangular tight frame) classifier, which fixes class prototypes at maximally separated directions, and knowledge distillation, which keeps features consistent across incremental stages, with the aim of improving both accuracy and efficiency under evolving real-world class distributions.
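The summary doesn't describe how the ETF classifier is constructed. For intuition, below is a minimal sketch of the standard simplex-ETF construction from the neural collapse literature, which places class prototypes at maximally separated, equiangular directions; the function name and the rebuild-on-expansion usage are illustrative assumptions, not the paper's exact procedure.

```python
import torch

def simplex_etf(num_classes: int, feat_dim: int) -> torch.Tensor:
    """Return a feat_dim x num_classes simplex ETF: unit-norm class
    prototypes whose pairwise cosine is -1/(num_classes - 1)."""
    assert feat_dim >= num_classes, "this construction needs feat_dim >= K"
    # Orthonormal basis U via QR of a random Gaussian matrix.
    U, _ = torch.linalg.qr(torch.randn(feat_dim, num_classes))
    center = torch.eye(num_classes) - torch.ones(num_classes, num_classes) / num_classes
    return (num_classes / (num_classes - 1)) ** 0.5 * (U @ center)

# One plausible reading of the "dynamic" classifier: rebuild a larger
# ETF when new classes arrive, since prototypes are fixed, not learned.
etf_stage1 = simplex_etf(num_classes=10, feat_dim=512)
etf_stage2 = simplex_etf(num_classes=15, feat_dim=512)  # +5 new classes
```

Because the prototypes are fixed rather than learned, expanding the label set only requires instantiating a larger ETF, which is consistent with the efficient-expansion claim.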
Key Takeaways
- Proposes SCL-PNC to address overfitting and catastrophic forgetting in class-incremental learning.
- Utilizes parametric neural collapse for efficient model expansion.
- Employs a dynamic ETF classifier and knowledge distillation for improved performance and feature consistency (see the distillation sketch after this list).
- Demonstrates effectiveness and efficiency on standard benchmarks.
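The exact distillation objective isn't given in this summary. A common feature-consistency loss, which the takeaway above plausibly refers to, penalizes drift between the frozen previous-stage model and the expanded model on old-class inputs; `feature_consistency_loss` and `lam` below are illustrative names, not the paper's notation.

```python
import torch
import torch.nn.functional as F

def feature_consistency_loss(new_feats: torch.Tensor,
                             old_feats: torch.Tensor) -> torch.Tensor:
    """Cosine-based drift penalty between the expanded model's features
    and the frozen previous model's features on old-class inputs."""
    s = F.normalize(new_feats, dim=1)
    t = F.normalize(old_feats, dim=1)
    return (1.0 - (s * t).sum(dim=1)).mean()

# Typical usage (old_model frozen as teacher; lam is a hyperparameter):
# with torch.no_grad():
#     old_feats = old_model(x_old)
# loss = task_loss + lam * feature_consistency_loss(new_model(x_old), old_feats)
```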
Reference
“SCL-PNC induces the convergence of the incremental expansion model through a structured combination of the expandable backbone, adapt-layer, and the parametric ETF classifier.”
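To make the quoted combination concrete, here is a hypothetical sketch of how an expandable backbone, adapt-layer, and parametric ETF classifier could be wired together; everything below is an assumption about the design, not the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SCLPNCSketch(nn.Module):
    """Hypothetical wiring: expandable backbone -> adapt-layer ->
    fixed parametric ETF classifier. All names and the linear
    adapt-layer are assumptions, not the paper's exact design."""
    def __init__(self, backbone: nn.Module, backbone_dim: int,
                 etf: torch.Tensor):
        super().__init__()
        self.backbone = backbone                 # grown at each stage
        self.adapt = nn.Linear(backbone_dim, etf.shape[0])
        self.register_buffer("etf", etf)         # d x K, never trained

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = F.normalize(self.adapt(self.backbone(x)), dim=1)
        return z @ self.etf  # cosine logits against ETF prototypes
```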