EM-LLM: Human-Inspired Episodic Memory for Infinite Context LLMs

Published on: 2025-07-07 10:49:18

This repository contains a version of the code for EM-LLM, published at ICLR 2025: [openreview link].

Overview

While typical LLMs struggle with processing extensive contexts, the human brain excels at organising and retrieving experiences spanning a lifetime. In this work, we introduce EM-LLM, an architecture that integrates key aspects of human episodic memory and event cognition into LLMs with no fine-tuning, enabling them to handle practically infinite context lengths while maintaining computational efficiency. EM-LLM organises sequences of tokens into coherent episodic events using a combination of Bayesian surprise and graph-theoretic boundary refinement in an online fashion. When needed, these events are retrieved through a two-stage memory process, combining similarity-based and temporally contiguous retrieval for efficient and human-like access to relevant information. Experiments on the LongBench ...
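The two mechanisms described above can be illustrated with a minimal sketch: segmenting a token stream into events wherever per-token surprise (negative log-likelihood under the base LLM) spikes, then retrieving events by similarity and expanding the selection with their temporal neighbours. The function names, the rolling mean-plus-standard-deviation threshold, and the dot-product similarity are illustrative assumptions, not the repository's actual implementation (which also applies graph-theoretic boundary refinement).

```python
import math

def segment_by_surprise(nll, window=32, gamma=1.0):
    """Split a token stream into events at high-surprise boundaries.

    nll: per-token negative log-likelihoods from the base LLM.
    A boundary is placed where surprise exceeds the rolling mean plus
    gamma standard deviations (this threshold form is an assumption;
    EM-LLM additionally refines boundaries graph-theoretically).
    Returns events as (start, end) index pairs.
    """
    boundaries = [0]
    for t in range(1, len(nll)):
        recent = nll[max(0, t - window):t]
        mu = sum(recent) / len(recent)
        sd = math.sqrt(sum((x - mu) ** 2 for x in recent) / len(recent))
        if nll[t] > mu + gamma * sd:
            boundaries.append(t)
    boundaries.append(len(nll))
    return [(boundaries[i], boundaries[i + 1])
            for i in range(len(boundaries) - 1)]

def retrieve_events(query_vec, event_vecs, k_sim=2, n_contig=1):
    """Two-stage retrieval sketch: pick the k_sim most similar events
    (dot-product similarity), then add each one's n_contig temporal
    neighbours, mimicking similarity-based plus contiguity-based recall.
    Returns sorted, de-duplicated event indices.
    """
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    ranked = sorted(range(len(event_vecs)),
                    key=lambda i: -dot(query_vec, event_vecs[i]))
    selected = set()
    for i in ranked[:k_sim]:
        for j in range(i - n_contig, i + n_contig + 1):
            if 0 <= j < len(event_vecs):
                selected.add(j)
    return sorted(selected)
```

For example, a single surprising token in an otherwise uniform stream produces a boundary at that position, and retrieval of the two best-matching events also surfaces the events immediately before and after each of them.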