Dr. Walid Saba on Natural Language Understanding [UNPLUGGED]
Published: Mar 7, 2022 13:25
1 min read · ML Street Talk Pod
Analysis
The article summarizes Dr. Walid Saba's critique of using large statistical language models ("BERTology") for natural language understanding (NLU). He argues that this approach is fundamentally flawed, likening it to trying to memorize an infinite set of facts. The discussion covers symbolic logic, the limitations of statistical learning, and alternative approaches to NLU.
Key Takeaways
- Dr. Walid Saba is a prominent critic of using large statistical language models ("BERTology") for natural language understanding.
- He believes this approach is fundamentally flawed, since natural language is infinitely productive and cannot be captured by memorization.
- The discussion covers symbolic logic and alternative approaches to NLU.
Reference
“Walid thinks this approach is cursed to failure because it’s analogous to memorising infinity with a large hashtable.”