MemexLLM is a Python library for managing and storing LLM conversations. It provides a flexible, extensible framework for conversation history management, storage, and retrieval.
- Drop-in Integrations: Add conversation management to your LLM applications with zero code changes using our provider integrations
- Flexible Storage: Choose from memory, SQLite, or bring your own storage backend
- Conversation Management: Organize, retrieve, and manipulate conversation threads with ease
- Memory Management Algorithms: Control conversation context with built-in algorithms (FIFO, summarization, etc.)
- Provider Agnostic: Works with OpenAI, Anthropic, and other LLM providers
- Extensible Architecture: Build custom storage backends and memory management algorithms (a minimal sketch follows this list)
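To make the extensibility point concrete, here is a minimal sketch of a custom storage backend that persists threads to a JSON file. The method names (`save_thread`, `get_thread`, `delete_thread`) are assumptions for illustration, not MemexLLM's actual storage interface; consult the API reference for the real base class and required methods.

```python
# Hypothetical custom storage backend persisting threads to a JSON file.
# Method names are assumed for illustration; MemexLLM's real storage
# interface may differ -- check the documentation before relying on this.
import json
from pathlib import Path


class JSONFileStorage:
    def __init__(self, path: str = "threads.json") -> None:
        self.path = Path(path)
        self._threads: dict = {}
        if self.path.exists():
            self._threads = json.loads(self.path.read_text())

    def save_thread(self, thread_id: str, messages: list[dict]) -> None:
        # Persist the full message list for a thread to disk.
        self._threads[thread_id] = messages
        self.path.write_text(json.dumps(self._threads))

    def get_thread(self, thread_id: str) -> list[dict] | None:
        # Return the stored messages, or None if the thread is unknown.
        return self._threads.get(thread_id)

    def delete_thread(self, thread_id: str) -> None:
        self._threads.pop(thread_id, None)
        self.path.write_text(json.dumps(self._threads))
```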
```bash
pip install memexllm            # Basic installation
pip install memexllm[openai]    # With OpenAI support
```

```python
from memexllm.storage import MemoryStorage
from memexllm.algorithms import FIFOAlgorithm
from memexllm.core import HistoryManager
# Initialize components
storage = MemoryStorage()
algorithm = FIFOAlgorithm(max_messages=100)
history_manager = HistoryManager(storage=storage, algorithm=algorithm)
# Create a conversation thread
thread = history_manager.create_thread()
# Add messages
history_manager.add_message(
    thread_id=thread.id,
    content="Hello, how can I help you today?",
    role="assistant",
)
```
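Reading the conversation back is the natural next step. The snippet below assumes the manager exposes a `get_thread` method returning a thread whose messages carry `role` and `content` attributes; those names are assumptions for illustration, so check the API reference for the exact accessors.

```python
# Assumed read-back API for illustration; the exact method and attribute
# names may differ from MemexLLM's actual interface.
stored = history_manager.get_thread(thread.id)
for message in stored.messages:
    print(f"{message.role}: {message.content}")
```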
Add conversation management to your OpenAI application with no code changes:

```python
from openai import OpenAI
from memexllm.integrations.openai import with_history
from memexllm.storage import MemoryStorage
from memexllm.algorithms import FIFOAlgorithm
from memexllm.core import HistoryManager
# Initialize your OpenAI client as usual
client = OpenAI(api_key="your-api-key")
# Add conversation memory with history management
storage = MemoryStorage()
algorithm = FIFOAlgorithm(max_messages=100)
history_manager = HistoryManager(storage=storage, algorithm=algorithm)
client = with_history(history_manager=history_manager)(client)
# Use the client as you normally would - conversations are now managed automatically
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello, who are you?"}],
)
```
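Because the wrapped client records each exchange, follow-up calls can continue the same conversation. The sketch below assumes the wrapper accepts a `thread_id` keyword to select which thread a call appends to; that parameter is an assumption for illustration, so verify the thread-selection mechanism against the integration docs.

```python
# Assumed continuation API for illustration; the thread selection
# mechanism (here a thread_id keyword) may differ in the actual integration.
thread = history_manager.create_thread()
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "What did I just ask you?"}],
    thread_id=thread.id,
)
```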
For detailed documentation, including:

- Complete API reference
- Advanced usage examples
- Available storage backends
- Contributing guidelines
- Feature roadmap
Visit our documentation at: https://eyenpi.github.io/MemexLLM/
We welcome contributions! Please see our Contributing Guide for details on how to get started.
This project is licensed under the MIT License.