Start the API server.
casual-mcp serve --host 127.0.0.1 --port 8000

| Option | Default | Description |
|---|---|---|
| --host | 127.0.0.1 | Host to bind |
| --port | 8000 | Port to serve on |
| --reload | false | Enable auto-reload on file changes |
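For scripting, the same invocation can be assembled from Python. This is a minimal sketch that only builds the argument list from the options above; the actual launch line is left commented out.

```python
import subprocess  # used only if you uncomment the launch line below

def serve_command(host: str = "127.0.0.1", port: int = 8000,
                  reload: bool = False) -> list[str]:
    """Return the casual-mcp serve invocation as an argv list.

    Flags and defaults mirror the options table above.
    """
    cmd = ["casual-mcp", "serve", "--host", host, "--port", str(port)]
    if reload:
        cmd.append("--reload")
    return cmd

# subprocess.run(serve_command())  # uncomment to actually start the server
```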
List configured MCP servers.
$ casual-mcp servers
┏━━━━━━━━━┳━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━┓
┃ Name ┃ Type ┃ Command / Url ┃ Env ┃
┡━━━━━━━━━╇━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━┩
│ math │ local │ mcp-servers/math/server.py │ │
│ time │ local │ mcp-servers/time-v2/server.py │ │
│ weather │ remote │ https://localhost:3000/mcp │ │
└─────────┴────────┴───────────────────────────────┴─────┘
List configured models.
$ casual-mcp models
┏━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┓
┃ Name ┃ Provider ┃ Model ┃ Endpoint ┃
┡━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━┩
│ gpt-4.1 │ openai │ gpt-4.1 │ │
│ lm-qwen │ openai │ qwen3-8b │ http://localhost:1234 │
└───────────────────┴──────────┴───────────────────────────┴────────────────────────┘
Interactive toolset management: create, edit, and delete toolsets.
$ casual-mcp toolsets
? Toolsets:
❯ basic - Basic tools for time and math (math, time)
research - Research tools (weather, words, fetch)
──────────────
➕ Create new toolset
❌ Exit
Selecting a toolset shows details and actions:
basic
Description: Basic tools for time and math
Servers:
math: [all tools]
time: current_time
? Action:
❯ ✏️ Edit
🗑️ Delete
← Back
List available tools from all connected MCP servers.
Example output:
$ casual-mcp tools
┏━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
┃ Name ┃ Description ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩
│ math_add │ Add two numbers together │
│ math_multiply │ Multiply two numbers │
│ time_current_time │ Get the current time in a specified timezone │
│ weather_get_forecast │ Get weather forecast for a location │
└────────────────────────┴─────────────────────────────────────────────────────┘
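The listing suggests that tool names are prefixed with their server name (math_add, time_current_time). Assuming that convention holds, a small helper can split a prefixed name back into its server and tool parts; the server names below are taken from the servers table above.

```python
# Split a prefixed tool name into (server, tool), assuming the
# server-underscore-tool naming convention visible in the listing.
KNOWN_SERVERS = {"math", "time", "weather"}  # from `casual-mcp servers`

def split_tool_name(name: str) -> tuple[str, str]:
    server, _, tool = name.partition("_")
    if server not in KNOWN_SERVERS:
        raise ValueError(f"unknown server prefix in {name!r}")
    return server, tool
```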
Send full message history for a chat completion.
Request:
{
"model": "gpt-4.1-nano",
"messages": [
{"role": "user", "content": "What does consistent mean?"}
],
"system_prompt": "You are a helpful dictionary assistant.",
"include_stats": true,
"tool_set": "research"
}

| Field | Required | Description |
|---|---|---|
| model | Yes | LLM model to use |
| messages | Yes | List of chat messages |
| system_prompt | No | Override the default system prompt for this request |
| include_stats | No | Include usage statistics (default: false) |
| tool_set | No | Name of toolset to limit available tools |
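A request like the one above can be assembled and sent from Python. The endpoint path is not shown in this section, so the `/chat` path below is a placeholder; the payload fields match the table.

```python
import json
from urllib import request

# Assemble the request body; field names follow the table above.
payload = {
    "model": "gpt-4.1-nano",
    "messages": [{"role": "user", "content": "What does consistent mean?"}],
    "system_prompt": "You are a helpful dictionary assistant.",
    "include_stats": True,
    "tool_set": "research",
}
body = json.dumps(payload).encode()

# NOTE: "/chat" is a placeholder path, not confirmed by this document.
req = request.Request(
    "http://127.0.0.1:8000/chat",
    data=body,
    headers={"Content-Type": "application/json"},
)
# with request.urlopen(req) as resp:   # uncomment with the server running
#     print(json.load(resp))
```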
Response with stats:
{
"messages": [...],
"response": "Consistent means...",
"stats": {
"tokens": {
"prompt_tokens": 150,
"completion_tokens": 75,
"total_tokens": 225
},
"tool_calls": {
"by_tool": {"words_define": 1},
"by_server": {"words": 1},
"total": 1
},
"llm_calls": 2
}
}

List all available toolsets.
Response:
{
"basic": {
"description": "Basic tools for time and math",
"servers": ["math", "time"]
},
"research": {
"description": "Research tools",
"servers": ["weather", "words", "fetch"]
}
}
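Because the response maps toolset names to a description and server list, it is easy to filter client-side. A small sketch, assuming the JSON shape shown above:

```python
# Find which toolsets include a particular server, given the
# toolsets response shown above.
toolsets = {
    "basic": {"description": "Basic tools for time and math",
              "servers": ["math", "time"]},
    "research": {"description": "Research tools",
                 "servers": ["weather", "words", "fetch"]},
}

def toolsets_with_server(toolsets: dict, server: str) -> list[str]:
    """Return the names of toolsets whose server list contains `server`."""
    return [name for name, ts in toolsets.items() if server in ts["servers"]]
```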