Are Deep Neural Networks Dramatically Overfitted?
Research · #deep learning · Blog · Analyzed: Jan 3, 2026 06:22
Published: Mar 14, 2019 00:00
Source: Lil'Log

Analysis
The article raises a fundamental question about the generalization ability of deep neural networks: they have enormous numbers of parameters and can easily drive training error to zero, so by classical reasoning they should overfit badly. It examines why, despite this, they still generalize to unseen data.
Reference / Citation
"Since a typical deep neural network has so many parameters and training error can easily be perfect, it should surely suffer from substantial overfitting. How could it be ever generalized to out-of-sample data points?"
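The quoted concern, that a model with more parameters than data points can fit the training set perfectly, can be illustrated outside deep learning with a minimal overparameterized least-squares fit. This is a sketch to make the premise concrete, not code from the original post; the data and polynomial degree are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# 10 training points from a noisy sine. A degree-20 polynomial has 21
# free coefficients, more than the number of data points, so it can
# interpolate the training set exactly (up to numerical precision).
x_train = np.linspace(0.0, 1.0, 10)
y_train = np.sin(2 * np.pi * x_train) + 0.1 * rng.standard_normal(10)

degree = 20
X = np.vander(x_train, degree + 1)            # Vandermonde design matrix
coeffs, *_ = np.linalg.lstsq(X, y_train, rcond=None)  # min-norm solution

train_err = np.max(np.abs(X @ coeffs - y_train))
print(f"max training error: {train_err:.2e}")  # essentially zero

# Out-of-sample behavior: error against the noiseless sine on a dense grid.
x_test = np.linspace(0.0, 1.0, 97)
y_test = np.sin(2 * np.pi * x_test)
test_err = np.max(np.abs(np.vander(x_test, degree + 1) @ coeffs - y_test))
print(f"max test error:     {test_err:.2e}")
```

Training error is effectively zero while out-of-sample error is not, which is exactly the tension the post sets out to resolve for deep networks.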