Are Deep Neural Networks Dramatically Overfitted?
Published: Mar 14, 2019
• 1 min read
• Lil'Log
Analysis
The article opens with a fundamental question about the generalization ability of deep neural networks: since they have enough parameters to drive training error to zero, conventional wisdom says they should overfit badly, yet in practice they often generalize well to unseen data.
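To make the puzzle concrete, here is a minimal sketch (not from the original post; the dataset and hyperparameters are illustrative) that fits a heavily over-parameterized MLP to a tiny dataset, reaching essentially perfect training accuracy, and then checks how it fares on held-out points:

```python
# Illustrative sketch: an MLP with far more parameters than training
# examples can reach ~zero training error; the held-out score shows
# how much (or how little) it actually overfits.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=200, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=0
)

# Two hidden layers of 512 units: hundreds of thousands of parameters
# for only 100 training points.
model = MLPClassifier(hidden_layer_sizes=(512, 512), max_iter=2000,
                      random_state=0)
model.fit(X_train, y_train)

print("train accuracy:", model.score(X_train, y_train))  # typically 1.0
print("test accuracy: ", model.score(X_test, y_test))    # often well above chance
```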
Key Takeaways
Reference
“Since a typical deep neural network has so many parameters and training error can easily be perfect, it should surely suffer from substantial overfitting. How could it be ever generalized to out-of-sample data points?”