Interpolation, Extrapolation and Linearisation (Prof. Yann LeCun, Dr. Randall Balestriero)
Published: Jan 4, 2022 12:59
ML Street Talk Pod
Analysis
This article discusses the concepts of interpolation, extrapolation, and linearization in the context of neural networks, focusing on the perspective of Yann LeCun and his collaborators. It highlights their argument that, under the geometric definition of interpolation (a new sample falling inside the convex hull of the training samples), neural networks operating on high-dimensional data almost always extrapolate rather than interpolate. The article references a paper co-authored by LeCun and Balestriero on this topic and suggests that this viewpoint has significantly changed how neural network behavior is understood. It also outlines the structure of the podcast episode, indicating the segments dedicated to each concept.
Key Takeaways
- The article discusses the debate around interpolation vs. extrapolation in neural networks.
- Yann LeCun argues that in high dimensions, neural networks primarily extrapolate.
- The podcast episode covers linearization, interpolation, and the curse of dimensionality in the context of neural networks.
- The discussion is based on a paper co-authored by LeCun and Balestriero.
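The "in high dimensions, everything is extrapolation" claim rests on a geometric definition: a new sample *interpolates* only if it lies inside the convex hull of the training set, which becomes vanishingly unlikely as dimension grows. A minimal sketch of that membership test using SciPy's LP solver (the sample counts, dimensions, and Gaussian data here are illustrative choices, not taken from the paper):

```python
import numpy as np
from scipy.optimize import linprog

def in_hull(train, x):
    """Return True if x lies in the convex hull of the rows of `train`.

    Posed as an LP feasibility problem: find weights w >= 0 with
    sum(w) == 1 and train.T @ w == x.
    """
    n = train.shape[0]
    A_eq = np.vstack([train.T, np.ones(n)])
    b_eq = np.append(x, 1.0)
    res = linprog(np.zeros(n), A_eq=A_eq, b_eq=b_eq,
                  bounds=(0, None), method="highs")
    return res.success

rng = np.random.default_rng(0)
rates = {}
for d in (2, 20):
    train = rng.standard_normal((100, d))   # 100 training samples
    tests = rng.standard_normal((200, d))   # 200 fresh samples
    rates[d] = np.mean([in_hull(train, t) for t in tests])
    print(f"dim={d:2d}: fraction of new samples inside the hull = {rates[d]:.2f}")
```

With the same number of training points, the inside-the-hull fraction collapses as the dimension rises, which is the sense in which high-dimensional models are said to extrapolate on essentially every new input.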
Reference
“Yann LeCun thinks that it's specious to say neural network models are interpolating because in high dimensions, everything is extrapolation.”