Everyone's trying vectors and graphs for AI memory. We went back to SQL
Research · #llm · Community
Analyzed: Jan 3, 2026 16:46 · Published: Sep 22, 2025 05:18 · 1 min read
Hacker News Analysis
The article discusses the challenges of providing persistent memory to LLMs and explores various approaches. It highlights the limitations of prompt stuffing, vector databases, graph databases, and hybrid systems. The core argument is that relational databases (SQL) offer a practical solution for AI memory, leveraging structured records, joins, and indexes for efficient retrieval and management of information. The article promotes the open-source project Memori as an example of this approach.
Key Takeaways
- LLMs struggle with persistent memory, leading to issues like forgetting user preferences.
- Various approaches to solve this, such as prompt stuffing, vector databases, and graph databases, have limitations.
- Relational databases (SQL) offer a practical solution for AI memory by leveraging structured records, joins, and indexes.
- The open-source project Memori is an example of using SQL for multi-agent memory.
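To make the SQL-for-memory idea concrete, here is a minimal sketch using Python's built-in `sqlite3`. The table name, columns, and queries are illustrative assumptions in the spirit of the article's argument, not the actual Memori schema or API:

```python
import sqlite3

# Hypothetical schema: one row per remembered item, keyed by user.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE memories (
        id INTEGER PRIMARY KEY,
        user_id TEXT NOT NULL,
        category TEXT NOT NULL,   -- e.g. 'preference', 'fact'
        content TEXT NOT NULL,
        created_at TEXT DEFAULT CURRENT_TIMESTAMP
    )
""")
# An ordinary index makes per-user lookups cheap.
conn.execute("CREATE INDEX idx_user_cat ON memories (user_id, category)")

# Store a remembered preference.
conn.execute(
    "INSERT INTO memories (user_id, category, content) VALUES (?, ?, ?)",
    ("alice", "preference", "prefers concise answers"),
)

# Retrieve everything known about a user with a plain indexed query,
# ready to be prepended to the LLM's context.
rows = conn.execute(
    "SELECT category, content FROM memories WHERE user_id = ? ORDER BY id",
    ("alice",),
).fetchall()
print(rows)  # [('preference', 'prefers concise answers')]
```

The point the article makes is that this kind of structured, indexed retrieval needs no embedding model or graph traversal: joins and `WHERE` clauses already express "what do we know about this user?".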
Reference / Citation
"Relational databases! Yes, the tech that’s been running banks and social media for decades is looking like one of the most practical ways to give AI persistent memory."