Open-Source, Centralized Memory Store for AI Teams
Supercontext eliminates context silos in AI-powered development. When your team uses Claude Code, Cursor, or Windsurf, everyone stays in sync with a centralized memory store that connects to your existing tools via MCP. It's the open-source solution for sharing AI context across your development workflow.
- Scalable Context Management: Efficiently manage and store large volumes of contextual data.
- Flexible Integration: Easily integrate with your existing AI agents and applications.
- Extensible Architecture: Customize and extend the engine to meet your specific needs.
- High Performance: Optimized for speed and efficiency, ensuring low-latency access to context.
To get started with Supercontext, clone the repository and install the dependencies:
```bash
git clone https://github.com/rooveterinary/supercontext.git
cd supercontext
bun install
bun dev
```

The web client should be available at http://localhost:3000 and the MCP server at http://localhost:3002/mcp.
Detailed usage instructions can be found in the respective `README.md` files within the `apps` directory.
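Beyond editor integrations, you can also talk to the MCP endpoint directly over HTTP. Below is a minimal TypeScript sketch of MCP's JSON-RPC 2.0 `initialize` handshake against the default port from the quickstart above. The protocol version string, client name, and header set here are illustrative assumptions drawn from the MCP specification, not part of Supercontext's documented API:

```typescript
// Sketch: POST a JSON-RPC 2.0 "initialize" request to a local Supercontext
// MCP endpoint. URL/port match the quickstart defaults; the protocol version
// and clientInfo values are placeholder assumptions -- adjust for your setup.

const ENDPOINT = "http://localhost:3002/mcp";

// Build the JSON-RPC "initialize" request body that MCP clients send first.
function buildInitializeRequest(id: number) {
  return {
    jsonrpc: "2.0",
    id,
    method: "initialize",
    params: {
      protocolVersion: "2025-03-26", // assumed MCP protocol revision
      capabilities: {},
      clientInfo: { name: "supercontext-smoke-test", version: "0.0.1" },
    },
  };
}

// Build fetch options, including the x-api-key header shown in the MCP
// configuration example in this README.
function buildRequestInit(apiKey: string, body: unknown) {
  return {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Streamable HTTP responses may arrive as JSON or as an event stream.
      Accept: "application/json, text/event-stream",
      "x-api-key": apiKey,
    },
    body: JSON.stringify(body),
  };
}

// Example usage (not invoked here): send the handshake and log the response.
async function smokeTest() {
  const res = await fetch(
    ENDPOINT,
    buildRequestInit("<YOUR_SUPERCONTEXT_API_KEY>", buildInitializeRequest(1)),
  );
  console.log(res.status, await res.json());
}
```

If the handshake succeeds, the server's response should contain its own `protocolVersion` and capabilities, after which a client would normally send an `initialized` notification before making further calls.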
You can add the Supercontext MCP server to any AI tool that supports MCP. For example, in Cursor, Roo Code, or other compatible tools, you would add the following configuration to your settings:
```json
{
  "mcpServers": {
    "supercontext": {
      "type": "streamable-http",
      "url": "<YOUR_SUPERCONTEXT_MCP_ENDPOINT>/mcp",
      "headers": {
        "x-api-key": "<YOUR_SUPERCONTEXT_API_KEY>"
      }
    }
  }
}
```

We welcome contributions from the community! If you'd like to contribute, please fork the repository and submit a pull request.
Supercontext is licensed under the MIT License.
