Source: arxiv.org

Efficient Online Memory Mechanism for Language Models

δ-mem enhances large language models with a compact memory mechanism for better context utilization and performance.

Summary

δ-mem is a lightweight memory mechanism that enhances the performance of large language models by efficiently accumulating and reusing historical information. It addresses the limitations of simply expanding the context window, which is costly and often yields poor context utilization.

Key features:

  • Compact online state - Maintains an 8×8 online memory state that stores past information.
  • Delta-rule learning - Updates the state matrix with a delta-rule method that improves memory efficiency.
  • Low-rank corrections - Injects low-rank corrections into the backbone's attention computation during generation.
  • Performance improvement - Achieves an average score 1.10× that of the frozen backbone and 1.15× that of the strongest non-δ-mem memory baseline.
  • Memory-heavy benchmarks - Reaches 1.31× on MemoryAgentBench and 1.20× on LoCoMo.

The results indicate that effective memory can be achieved through a compact online state directly integrated with attention computation, without the need for full fine-tuning or backbone replacement.
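To make the delta-rule idea concrete, here is a minimal sketch of how a compact state matrix can be written to and read from with a delta-rule update. This is an illustration of the general technique, not the paper's implementation: the shapes, the `beta` write strength, and the function names are assumptions.

```python
import numpy as np

np.random.seed(0)

def delta_rule_update(S, k, v, beta=0.5):
    """Delta-rule write: nudge the memory state S (d, d) so that key k recalls value v.

    Shapes and beta are illustrative assumptions, not taken from the paper.
    """
    k = k / (np.linalg.norm(k) + 1e-8)       # normalize the key direction
    prediction = S @ k                        # what the memory currently recalls for k
    error = v - prediction                    # delta-rule error term
    return S + beta * np.outer(error, k)      # rank-1 correction toward v

def read(S, q):
    """Retrieve the value the memory associates with query q."""
    q = q / (np.linalg.norm(q) + 1e-8)
    return S @ q

d = 8                                         # matches the 8x8 state in the summary
S = np.zeros((d, d))
k = np.random.randn(d)
v = np.random.randn(d)
for _ in range(20):                           # repeated writes shrink the recall error
    S = delta_rule_update(S, k, v)
print(np.allclose(read(S, k), v, atol=1e-3))  # → True
```

Because each update is a rank-1 outer product, the write cost stays O(d²) regardless of how much history has been absorbed, which is the appeal of an online state over an ever-growing context window.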
