A Rust implementation of the Model Context Protocol for building AI-integrated applications.
Want to contribute? Have ideas or feature requests? Come tell us about it on Discord.
A Rust implementation of the Model Context Protocol (MCP) - a JSON-RPC 2.0 based protocol that lets AI models interact with external tools and services. Both client and server roles are supported, with async/await APIs.
- Full MCP Protocol Support: Implements the latest MCP specification (2025-11-25)
- Client & Server: Build both MCP clients and servers with ergonomic APIs
- Multiple Transports: TCP/IP and stdio transport layers
- OAuth 2.0 Authentication: Complete OAuth 2.0 support including:
  - Authorization Code Flow with PKCE
  - Dynamic client registration (RFC 7591)
  - Automatic token refresh
  - MCP-specific `resource` parameter support
  - Built-in callback server for browser flows
  - Protected resource metadata discovery (RFC 9728)
  - Authorization server discovery (RFC 8414 / OpenID Connect)
  - Client ID metadata documents for HTTPS client IDs
- Async/Await: Built on Tokio for high-performance async operations
Note: JSON-RPC batch requests, which were part of the previous protocol version, are not supported.
When a protected resource challenges a request, inspect any `WWW-Authenticate` header for a
`resource_metadata` value. Fetch protected resource metadata from that URL, or fall back to
`/.well-known/oauth-protected-resource` (with an optional path suffix) when no challenge is provided.
Use the advertised authorization server issuers to resolve RFC 8414 or OpenID Connect discovery
documents, then use those endpoints for authorization and registration. If the client identifier
is an HTTPS URL, fetch the client ID metadata document at that URL to obtain redirect URIs and
additional client settings.
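The sketch below walks through that discovery sequence using plain `reqwest` (with its `json` feature) and `serde_json` rather than tmcp's own types; the `discover_auth_endpoints` helper and its `resource_base` argument are illustrative only, and error handling is collapsed into `Box<dyn Error>`:

```rust
use serde_json::Value;

/// Illustrative only: resolve OAuth endpoints for a protected MCP resource.
async fn discover_auth_endpoints(resource_base: &str) -> Result<Value, Box<dyn std::error::Error>> {
    // 1. Protected resource metadata (RFC 9728). A real client prefers the
    //    `resource_metadata` URL from a WWW-Authenticate challenge when one is present.
    let prm_url = format!("{resource_base}/.well-known/oauth-protected-resource");
    let prm: Value = reqwest::get(prm_url).await?.json().await?;

    // 2. Pick one of the advertised authorization server issuers.
    let issuer = prm["authorization_servers"][0]
        .as_str()
        .ok_or("no authorization server advertised")?;

    // 3. Authorization server metadata (RFC 8414, shown for an issuer without a path
    //    component); OpenID Connect discovery at /.well-known/openid-configuration is
    //    the equivalent fallback.
    let as_meta: Value = reqwest::get(format!("{issuer}/.well-known/oauth-authorization-server"))
        .await?
        .json()
        .await?;

    // `as_meta` now carries `authorization_endpoint`, `token_endpoint`, and, when
    // dynamic registration (RFC 7591) is offered, `registration_endpoint`.
    Ok(as_meta)
}
```

The crate's OAuth support performs this discovery for you, along with PKCE, the `resource` parameter, dynamic registration, and token refresh; the sketch is only meant to make the order of requests concrete.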
From `./examples/weather_server.rs`:

```rust
//! Minimal weather server example.
use serde_json::json;
use tmcp::{
    Result, Server, ServerCtx, ToolError, ToolResult, mcp_server,
    schema::{ClientCapabilities, Implementation, InitializeResult, LoggingLevel, ServerNotification},
    tool, tool_params, tool_result,
};

/// Example server.
#[derive(Default)]
struct WeatherServer;

/// Parameters for the weather tool.
// Tool input schema is automatically derived from the struct using serde and schemars.
#[derive(Debug)]
#[tool_params]
struct WeatherParams {
    /// City name to query.
    city: String,
}

/// Structured response for the weather tool.
#[derive(Debug)]
#[tool_result]
struct WeatherResponse {
    /// City name queried.
    city: String,
    /// Temperature in Celsius.
    temperature_c: f64,
    /// Human-readable conditions.
    conditions: String,
}

/// Structured response for the ping tool.
#[derive(Debug)]
#[tool_result]
struct PingResponse {
    /// Ping response message.
    message: String,
}

/// Parameters for emitting a log message.
#[derive(Debug)]
#[tool_params]
struct LogParams {
    /// Message to include in the server log notification.
    message: String,
}

/// Result of emitting a log message.
#[derive(Debug)]
#[tool_result]
struct LogResponse {
    /// Whether the log notification was queued.
    logged: bool,
}

// The `mcp_server` macro generates the necessary boilerplate to expose methods as MCP tools.
#[mcp_server(initialize_fn = initialize)]
impl WeatherServer {
    /// Customize initialize to advertise logging support.
    async fn initialize(
        &self,
        _ctx: &ServerCtx,
        protocol_version: String,
        _capabilities: ClientCapabilities,
        _client_info: Implementation,
    ) -> Result<InitializeResult> {
        let mut init = InitializeResult::new("weather_server")
            .with_version(env!("CARGO_PKG_VERSION"))
            .with_tools(true)
            .with_logging()
            .with_instructions("Minimal weather server example");
        if !protocol_version.is_empty() {
            init = init.with_mcp_version(protocol_version);
        }
        Ok(init)
    }

    // The doc comment becomes the tool's description in the MCP schema.
    #[tool]
    /// Get current weather for a city
    async fn get_weather(
        &self,
        params: WeatherParams,
    ) -> ToolResult<WeatherResponse> {
        // Simulate weather API call
        let temperature = 22.5;
        let conditions = "Partly cloudy";
        Ok(WeatherResponse {
            city: params.city,
            temperature_c: temperature,
            conditions: conditions.to_string(),
        })
    }

    #[tool]
    /// Respond with a simple pong
    async fn ping(&self) -> ToolResult<PingResponse> {
        Ok(PingResponse {
            message: "pong".to_string(),
        })
    }

    #[tool]
    /// Emit a logging notification using ServerCtx
    async fn log_message(&self, ctx: &ServerCtx, params: LogParams) -> ToolResult<LogResponse> {
        let payload = json!({ "message": params.message });
        ctx.notify(ServerNotification::logging_message(
            LoggingLevel::Info,
            Some("weather_server".to_string()),
            payload,
        ))
        .map_err(|e| ToolError::internal(e.to_string()))?;
        Ok(LogResponse { logged: true })
    }
}

#[tokio::main]
async fn main() -> Result<()> {
    Server::new(WeatherServer::default).serve_stdio().await?;
    Ok(())
}
```

Flat tool arguments can be declared directly in the tool signature for multi-argument tools:

```rust
#[tool]
async fn add(&self, a: f64, b: f64) -> ToolResult<AddResponse> {
    Ok(AddResponse { sum: a + b })
}
```

Single-argument tools remain struct-based by default; opt into flat handling explicitly:

```rust
#[tool(flat)]
async fn echo(&self, message: String) -> ToolResult<EchoResponse> {
    Ok(EchoResponse { message })
}
```