
feat(memory): add ConversationMemory trait and InMemoryConversation#1501

Open
Mattbusel wants to merge 1 commit into 0xPlaygrounds:main from Mattbusel:feat/conversation-memory

Conversation


Closes #373

This PR implements the ConversationMemory abstraction proposed in issue #373.

Problem

Rig agents are currently stateless per call — chat() takes a Vec<Message> that the caller must manage manually. There is no standard way to plug in a memory backend. Issue #373 has been open since early 2025; the proposed interface was a MemoryStrategy trait with fetch_memory / insert_memory. This PR provides a concrete implementation.

What this adds

rig::memory::ConversationMemory trait

#[async_trait]
pub trait ConversationMemory: Send + Sync {
    async fn push(&self, message: Message);
    async fn history(&self) -> Vec<Message>;
    async fn clear(&self);
    async fn len(&self) -> usize;
    async fn is_empty(&self) -> bool { ... }  // default
}

All methods take &self (not &mut self) so implementations are trivially wrappable in Arc and shareable across agent handles and tokio tasks.
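
Because the trait only ever takes &self, a backend can keep its state behind Arc<Mutex<…>> and hand out cheap clones. A minimal synchronous sketch of that sharing pattern (Message and SharedMemory here are illustrative stand-ins, not the rig types):

```rust
use std::collections::VecDeque;
use std::sync::{Arc, Mutex};

// Stand-in for rig's Message type (the real one lives in rig-core).
#[derive(Clone, Debug)]
struct Message(String);

// All methods take &self, so mutation happens through interior
// mutability and handles are cheap Arc clones of one backing store.
#[derive(Clone, Default)]
struct SharedMemory {
    inner: Arc<Mutex<VecDeque<Message>>>,
}

impl SharedMemory {
    fn push(&self, m: Message) {
        self.inner.lock().unwrap().push_back(m);
    }
    fn len(&self) -> usize {
        self.inner.lock().unwrap().len()
    }
}

fn main() {
    let mem = SharedMemory::default();
    let handle = mem.clone(); // same backing store, new handle
    handle.push(Message("hello".into()));
    // A write through one handle is visible through the other.
    assert_eq!(mem.len(), 1);
}
```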

InMemoryConversation

Bounded VecDeque — keeps the last N messages, evicting oldest on overflow:

let memory = InMemoryConversation::new(20); // keep last 20 messages

memory.push(Message::user("What is backpressure?")).await;
memory.push(Message::assistant("Backpressure is …")).await;

let history = memory.history().await;
// pass history to agent.chat(prompt, history).await

  • InMemoryConversation::unbounded() retains every message
  • SlidingWindowMemory type alias for explicit window-size semantics
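
The eviction rule above can be sketched with a plain VecDeque behind a Mutex. This is a synchronous stand-in for illustration only; the PR's InMemoryConversation is async and stores rig's Message type:

```rust
use std::collections::VecDeque;
use std::sync::Mutex;

// Bounded window: keeps the last `capacity` items, dropping the
// oldest on overflow. capacity == 0 means unbounded.
struct Window {
    capacity: usize,
    buf: Mutex<VecDeque<String>>,
}

impl Window {
    fn new(capacity: usize) -> Self {
        Window { capacity, buf: Mutex::new(VecDeque::new()) }
    }
    fn push(&self, msg: String) {
        let mut buf = self.buf.lock().unwrap();
        if self.capacity > 0 && buf.len() == self.capacity {
            buf.pop_front(); // evict the oldest message
        }
        buf.push_back(msg);
    }
    fn history(&self) -> Vec<String> {
        self.buf.lock().unwrap().iter().cloned().collect()
    }
}

fn main() {
    let w = Window::new(2);
    w.push("a".into());
    w.push("b".into());
    w.push("c".into()); // "a" is evicted
    assert_eq!(w.history(), vec!["b".to_string(), "c".to_string()]);
}
```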

NoMemory

No-op default — identical to current Agent behaviour. Useful as a default type parameter so memory is opt-in with zero overhead:

pub struct MyAgent<M: ConversationMemory = NoMemory> { ... }
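
The zero-overhead claim holds because NoMemory is a zero-sized type, so defaulting the parameter costs nothing at runtime. A sketch of the pattern (MyAgent and these constructors are illustrative, not the rig API):

```rust
// Zero-sized no-op memory: matches today's stateless behaviour.
#[derive(Default)]
struct NoMemory;

// Memory is opt-in via a defaulted type parameter.
#[allow(dead_code)]
struct MyAgent<M = NoMemory> {
    memory: M,
}

impl MyAgent<NoMemory> {
    fn new() -> Self {
        MyAgent { memory: NoMemory } // zero-sized, zero overhead
    }
}

impl<M> MyAgent<M> {
    fn with_memory(memory: M) -> Self {
        MyAgent { memory }
    }
}

fn main() {
    let _plain = MyAgent::new(); // stateless, as today
    let _remembers = MyAgent::with_memory(vec!["hi".to_string()]);
    // NoMemory occupies no space at all.
    assert_eq!(std::mem::size_of::<NoMemory>(), 0);
}
```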

Custom backends

Any storage backend just implements the trait:

#[async_trait]
impl ConversationMemory for RedisMemory {
    async fn push(&self, msg: Message) { /* RPUSH */ }
    async fn history(&self) -> Vec<Message> { /* LRANGE */ vec![] }
    async fn clear(&self) { /* DEL */ }
    async fn len(&self) -> usize { /* LLEN */ 0 }
}

Tests (6)

  • push_and_retrieve — basic store/fetch
  • capacity_evicts_oldest — overflow eviction
  • clear_empties_store
  • unbounded_keeps_all — 200 messages, none dropped
  • no_memory_always_empty — NoMemory is truly stateless
  • shared_via_arc — two Arc handles, writes visible to both

What this does NOT do (intentional scope)

This PR adds the trait and the in-process implementation. Wiring ConversationMemory into AgentBuilder and Agent::chat() is a separate concern that touches more of the API surface — happy to follow up in a second PR once the trait shape is agreed on.
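
Until that follow-up lands, a caller can thread memory through a chat loop by hand. A sketch with stub types standing in for rig's Message and Agent (none of these names are real rig APIs):

```rust
use std::cell::RefCell;

// Stand-in memory: single-threaded for brevity.
struct Memory(RefCell<Vec<String>>);

impl Memory {
    fn push(&self, m: String) { self.0.borrow_mut().push(m); }
    fn history(&self) -> Vec<String> { self.0.borrow().clone() }
}

// Stand-in for agent.chat(prompt, history).await.
fn chat(prompt: &str, history: &[String]) -> String {
    format!("reply to '{}' given {} prior messages", prompt, history.len())
}

fn main() {
    let memory = Memory(Default::default());

    // Each turn: fetch history, call the agent, record both sides.
    let history = memory.history();
    let reply = chat("What is backpressure?", &history);
    memory.push("user: What is backpressure?".into());
    memory.push(format!("assistant: {}", reply));

    assert_eq!(memory.history().len(), 2);
}
```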

Designed and implemented by Matthew Busel.

Introduces a pluggable memory abstraction for agents — the gap
described in issue 0xPlaygrounds#373.

- ConversationMemory trait: push / history / clear / len / is_empty
  all take &self so implementations are Arc-shareable across tasks
- InMemoryConversation: bounded VecDeque, evicts oldest on overflow
  capacity=0 (or ::unbounded()) retains every message
- SlidingWindowMemory: type alias for explicit window-size semantics
- NoMemory: no-op default, matches current stateless Agent behaviour
- 6 unit tests: push/retrieve, eviction, clear, unbounded, NoMemory,
  Arc shared ownership

Backends (Redis, Postgres, vector store) can implement the trait
without touching rig-core.

Designed and implemented by Matthew Busel.


Development

Successfully merging this pull request may close these issues.

Proposal for Abstract Encapsulation of Memory Mode Similar to LangChain's Memory
