This article explains the importance of memory in AI agents, focusing on three types: session memory, user memory, and learned memory. It explores how learned memory allows agents to improve their performance over time by retaining valuable insights and adapting to user needs.
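The three memory types described above can be sketched as a simple container; all names and the promotion policy here are illustrative assumptions, not the article's implementation:

```python
from dataclasses import dataclass, field

@dataclass
class AgentMemory:
    """Illustrative split of agent memory into the three types discussed."""
    session: list = field(default_factory=list)   # turn-by-turn context, cleared per conversation
    user: dict = field(default_factory=dict)      # stable user preferences, persisted across sessions
    learned: list = field(default_factory=list)   # distilled insights the agent accumulates over time

    def end_session(self) -> None:
        # Hypothetical policy: promote a summary of the session into learned
        # memory, then reset the short-lived session buffer.
        if self.session:
            self.learned.append({"summary": self.session[-1]})
        self.session.clear()

mem = AgentMemory()
mem.user["tone"] = "concise"          # user memory survives across conversations
mem.session.append("discussed quarterly report formatting")
mem.end_session()                     # session cleared; insight retained
```

The key distinction is lifetime: session memory is transient, user memory is per-user persistent state, and learned memory grows as the agent extracts reusable insights.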
This article presents the Titans architecture and MIRAS framework, which enhance AI models' ability to retain long-term memory by integrating new information in real-time. Titans employs a unique memory module that learns and updates while processing data, using a "surprise metric" to prioritize significant inputs. The research shows improved performance in handling extensive contexts compared to existing models.
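The surprise-driven update can be sketched in miniature. This is a hedged toy version assuming a linear memory matrix `M` and a squared-error recall loss; the actual Titans memory module is a deep network with momentum and forgetting terms:

```python
import numpy as np

def surprise(M: np.ndarray, k: np.ndarray, v: np.ndarray):
    """Gradient of the associative-recall loss ||M k - v||^2 w.r.t. M.

    The gradient's magnitude serves as the "surprise" score: inputs the
    memory already predicts well produce small gradients.
    """
    err = M @ k - v
    grad = 2.0 * np.outer(err, k)
    return grad, float(np.linalg.norm(grad))

def update_memory(M, k, v, lr=0.1, threshold=1e-3):
    # Only sufficiently surprising inputs trigger a memory write,
    # so significant information is prioritized during processing.
    grad, score = surprise(M, k, v)
    if score > threshold:
        M = M - lr * grad
    return M, score
```

After a write, the same key produces a smaller surprise score, since the memory now partially reconstructs the associated value; the threshold and learning rate here are arbitrary illustration values.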