
🌋 Volcano SDK

The TypeScript SDK for Multi-Provider AI Agents

Build agents that chain LLM reasoning with MCP tools. Mix OpenAI, Claude, and Mistral in one workflow. Parallel execution, branching, and loops. Native retries, streaming, and typed errors.

📚 Read the full documentation at volcano.dev →

✨ Features

🤖 Automatic Tool Selection

The LLM automatically picks which MCP tools to call based on your prompt. No manual routing needed.

🧩 Multi-Agent Crews

Define specialized agents and let the coordinator autonomously delegate tasks. Like automatic tool selection, but for agents.

💬 Conversational Results

Ask questions about what your agent did. Use .summary() or .ask() instead of parsing JSON.

🔧 100s of Models

OpenAI, Anthropic, Mistral, Bedrock, Vertex, Azure. Switch providers per-step or globally.
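
A minimal sketch of mixing providers in one workflow. It assumes an llmAnthropic factory analogous to llmOpenAI and a per-step llm override; both are assumptions here, so check volcano.dev for the exact option names.

```ts
import { agent, llmOpenAI, llmAnthropic } from "volcano-sdk";

// Assumed: llmAnthropic and the per-step `llm` override are illustrative,
// not confirmed API; see the provider docs for exact names.
const gpt = llmOpenAI({ apiKey: process.env.OPENAI_API_KEY!, model: "gpt-4o-mini" });
const claude = llmAnthropic({ apiKey: process.env.ANTHROPIC_API_KEY!, model: "claude-3-5-sonnet-latest" });

const results = await agent({ llm: gpt })                     // global default provider
  .then({ prompt: "Draft a short product announcement." })
  .then({ prompt: "Edit the draft for tone.", llm: claude })  // per-step override (assumed field)
  .run();
```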

🔄 Advanced Patterns

Parallel execution, branching, loops, and sub-agent composition. Enterprise-grade workflow control.
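
Fan-out can also be written with plain Promise.all over independent pipelines. This sketch uses only the calls shown in the Quick Start below; the SDK's own parallel, branching, and loop helpers are covered in the docs.

```ts
import { agent, llmOpenAI } from "volcano-sdk";

const llm = llmOpenAI({ apiKey: process.env.OPENAI_API_KEY!, model: "gpt-4o-mini" });

// Run two independent agent pipelines concurrently, then read both results.
const [pros, cons] = await Promise.all([
  agent({ llm }).then({ prompt: "List the strongest arguments for remote work." }).run(),
  agent({ llm }).then({ prompt: "List the strongest arguments against remote work." }).run(),
]);

console.log(await pros.summary(llm));
console.log(await cons.summary(llm));
```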

📡 Streaming

Stream tokens in real-time as LLMs generate them. Perfect for chat UIs and SSE endpoints.
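
A sketch of piping streamed tokens to stdout. The onToken callback below is a hypothetical shape used for illustration; check the streaming guide for the real option.

```ts
import { agent, llmOpenAI } from "volcano-sdk";

const llm = llmOpenAI({ apiKey: process.env.OPENAI_API_KEY!, model: "gpt-4o-mini" });

// Hypothetical streaming hook: the exact option name and signature may differ.
await agent({ llm })
  .then({ prompt: "Explain MCP in two sentences." })
  .run({
    onToken: (token: string) => process.stdout.write(token), // assumed callback
  });
```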

πŸ›‘οΈ TypeScript-First

Full type safety with IntelliSense. Catch errors before runtime.

📊 Observability

OpenTelemetry traces and metrics. Export to Jaeger, Prometheus, DataDog, or any OTLP backend.
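
A generic OpenTelemetry bootstrap for a Node process, assuming Volcano's traces flow through the standard OTel SDK; the snippet uses the stock @opentelemetry packages, not a Volcano-specific API.

```ts
import { NodeSDK } from "@opentelemetry/sdk-node";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";

// Export traces to a local OTLP endpoint (OTel Collector, Jaeger, etc.).
// Start this before running any agents so their spans are captured.
const sdk = new NodeSDK({
  serviceName: "volcano-agents",
  traceExporter: new OTLPTraceExporter({ url: "http://localhost:4318/v1/traces" }),
});

sdk.start();
```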

⚡ Production-Ready

Built-in retries, timeouts, error handling, and connection pooling. Battle-tested at scale.
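
Because .run() returns a promise, a failed run surfaces as a rejection once the built-in retries are exhausted. This sketch catches it generically; the SDK's typed error classes are documented separately.

```ts
import { agent, llmOpenAI, mcp } from "volcano-sdk";

const llm = llmOpenAI({ apiKey: process.env.OPENAI_API_KEY!, model: "gpt-4o-mini" });
const weather = mcp("http://localhost:8001/mcp");

try {
  const results = await agent({ llm })
    .then({ prompt: "What's the weather in Seattle?", mcps: [weather] })
    .run();
  console.log(await results.summary(llm));
} catch (err) {
  // Anything that still fails after retries and timeouts lands here.
  console.error("Agent run failed:", err);
}
```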

Explore all features →

Quick Start

Installation

npm install volcano-sdk

That's it! The package includes MCP support and all common LLM providers (OpenAI, Anthropic, Mistral, Llama, Vertex).

View installation guide →

Hello World with Automatic Tool Selection

```ts
import { agent, llmOpenAI, mcp } from "volcano-sdk";

const llm = llmOpenAI({
  apiKey: process.env.OPENAI_API_KEY!,
  model: "gpt-4o-mini"
});

const weather = mcp("http://localhost:8001/mcp");
const tasks = mcp("http://localhost:8002/mcp");

// Agent automatically picks the right tools
const results = await agent({ llm })
  .then({
    prompt: "What's the weather in Seattle? If it will rain, create a task to bring an umbrella",
    mcps: [weather, tasks]  // LLM chooses which tools to call
  })
  .run();

// Ask questions about what happened
const summary = await results.summary(llm);
console.log(summary);
```

Multi-Agent Coordinator

```ts
import { agent, llmOpenAI } from "volcano-sdk";

const llm = llmOpenAI({ apiKey: process.env.OPENAI_API_KEY! });

// Define specialized agents
const researcher = agent({ llm, name: 'researcher', description: 'Finds facts and data' })
  .then({ prompt: "Research the topic." })
  .then({ prompt: "Summarize the research." });

const writer = agent({ llm, name: 'writer', description: 'Creates content' })
  .then({ prompt: "Write content." });

// Coordinator autonomously delegates to specialists
const results = await agent({ llm })
  .then({
    prompt: "Write a blog post about quantum computing",
    agents: [researcher, writer]  // Coordinator decides when done
  })
  .run();

// Ask what happened
const post = await results.ask(llm, "Show me the final blog post");
console.log(post);
```

View more examples →

Documentation

📖 Comprehensive Guides

Contributing

We welcome contributions! Please see our Contributing Guide for details.

Questions or Feature Requests?

License

Apache 2.0 - see LICENSE file for details.
