Local AI mesh — multiple agents, shared persistent memory, real-time dashboard, browser automation. Runs on your hardware.
Updated Apr 1, 2026 - HTML
Browser-based, AI-native chat client built on open standards: MCP, agent resources, prompts, and structured outputs. A client-only runtime with no backend logic, registry, or personas; agents, tools, and workflows run via compatible backends with streaming responses and streaming UI rendering.
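Since the client speaks MCP, its wire messages are JSON-RPC 2.0 envelopes. A minimal sketch of building such a request is below; the method name `prompts/list` follows the MCP specification, while the transport (stdio, SSE, etc.) and the `mcpRequest` helper are illustrative assumptions, not part of any particular client's API.

```typescript
// Sketch: MCP requests are JSON-RPC 2.0 envelopes.
// `mcpRequest` is a hypothetical helper for illustration.
interface JsonRpcRequest {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params?: Record<string, unknown>;
}

function mcpRequest(
  id: number,
  method: string,
  params?: Record<string, unknown>
): JsonRpcRequest {
  // Only include `params` when the caller supplies it.
  return { jsonrpc: "2.0", id, method, ...(params ? { params } : {}) };
}

// Ask a server which prompts it exposes (method name per the MCP spec).
const listPrompts = mcpRequest(1, "prompts/list");
```

A real client would serialize this to the chosen transport and correlate the response by `id`.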
AI backend providing standardized endpoints for chat completions, streaming responses (SSE), and MCP sampling. Compatible with the Vercel AI SDK streaming protocol and OpenAI-style APIs; a single runtime handles auth, routing, and response mapping across providers.
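OpenAI-style streaming delivers the completion as SSE `data:` lines, each carrying a JSON chunk with a partial delta, terminated by a `[DONE]` sentinel. A minimal sketch of reassembling the assistant text from such a stream, assuming the OpenAI chat-completions chunk shape (`choices[0].delta.content`); the `collectDeltas` function is illustrative, not this backend's API:

```typescript
// Sketch: reassemble assistant text from OpenAI-style SSE chunks.
// Assumes chunks of the form {"choices":[{"delta":{"content":"…"}}]}
// and a "[DONE]" sentinel, per the OpenAI streaming format.
function collectDeltas(sse: string): string {
  let text = "";
  for (const line of sse.split("\n")) {
    if (!line.startsWith("data: ")) continue; // skip comments/blank lines
    const payload = line.slice("data: ".length).trim();
    if (payload === "[DONE]") break; // end-of-stream sentinel
    const chunk = JSON.parse(payload);
    text += chunk.choices?.[0]?.delta?.content ?? "";
  }
  return text;
}

// Example stream of two content deltas followed by the sentinel.
const stream = [
  'data: {"choices":[{"delta":{"content":"Hel"}}]}',
  'data: {"choices":[{"delta":{"content":"lo"}}]}',
  "data: [DONE]",
].join("\n");
// collectDeltas(stream) → "Hello"
```

In a browser client the same logic would run incrementally over a `ReadableStream` rather than a complete string.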