
Dr. Patrick Lewis on Retrieval Augmented Generation

Published: Feb 10, 2023 11:18
1 min read
ML Street Talk Pod

Analysis

This article summarizes a podcast episode featuring Dr. Patrick Lewis, a research scientist specializing in Retrieval-Augmented Generation (RAG) for large language models (LLMs). It highlights his background, current work at co:here, and previous experience at Meta AI's FAIR lab. The focus is on his research in combining information retrieval techniques with LLMs to improve their performance on knowledge-intensive tasks like question answering and fact-checking. The article provides links to relevant research papers and resources.
Reference

Dr. Lewis's research focuses on the intersection of information retrieval (IR) techniques and large language models (LLMs).
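The retrieve-then-generate loop at the core of RAG can be sketched as follows. This is a toy illustration only: it uses simple word-overlap scoring in place of the learned dense retriever (DPR) used in the actual RAG work, and it builds a prompt string rather than calling a real generator; all function names here are illustrative.

```python
def retrieve(query, documents, k=2):
    """Rank documents by word overlap with the query (toy stand-in
    for a learned dense retriever)."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, documents, k=2):
    """Concatenate the top-k retrieved passages with the question,
    mirroring how RAG conditions its generator on retrieved evidence."""
    context = "\n".join(retrieve(query, documents, k))
    return f"context:\n{context}\n\nquestion: {query}"

docs = [
    "RAG combines a retriever with a seq2seq generator.",
    "T5 casts every NLP task as text-to-text.",
    "Paris is the capital of France.",
]
print(build_prompt("What is the capital of France?", docs, k=1))
```

In the real system the prompt would be fed to a seq2seq model, and retriever and generator are trained jointly on knowledge-intensive tasks such as open-domain question answering.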

Research · #llm · 📝 Blog · Analyzed: Jan 3, 2026 07:19

Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer

Published: May 19, 2020 21:34
1 min read
ML Street Talk Pod

Analysis

This article summarizes a podcast episode discussing the Text-to-Text Transfer Transformer (T5) model and its implications for transfer learning in NLP. It covers key aspects like input/output format, architecture, dataset size, fine-tuning, and computational usage. The discussion extends to related topics such as embodied cognition and intelligence measurement. The article provides links to relevant research papers.
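T5's unified input/output format can be illustrated with a small sketch: every task is cast as text generation by prepending a task prefix to the input string. The prefixes below follow the examples given in the T5 paper; the helper function itself is hypothetical.

```python
def to_text_to_text(task, text):
    """Prepend the task prefix T5 uses to cast every NLP task
    as plain text-to-text generation."""
    prefixes = {
        "translate_en_de": "translate English to German: ",
        "summarize": "summarize: ",
        # CoLA grammatical-acceptability judgments also become text output
        "cola": "cola sentence: ",
    }
    return prefixes[task] + text

print(to_text_to_text("summarize", "The T5 paper unifies NLP tasks."))
```

Under this framing, classification and even regression targets are emitted as literal strings, so a single architecture and loss covers all tasks discussed in the episode.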
Reference

In this episode of Machine Learning Street Talk, Tim Scarfe, Yannic Kilcher and Connor Shorten chat about Large-scale Transfer Learning in Natural Language Processing.