An MCP (Model Context Protocol) server for building Databricks projects with AI coding assistants such as Claude Code and Cursor. It exposes high-level, AI-assistant-friendly tools for Databricks operations, organized by product line for scalability.
```
databricks-mcp-core/                  # Pure Python library
├── unity_catalog/                    # Catalogs, schemas, tables
├── compute/                          # Execution contexts
├── spark_declarative_pipelines/      # Pipeline management & workspace files
├── agent_bricks/                     # Agent Bricks (future)
└── dabs/                             # DAB generation (future)

databricks-mcp-server/                # MCP protocol wrapper
├── server.py                         # FastAPI + SSE
└── tools/                            # MCP tool wrappers
```
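This split keeps the Databricks logic importable as plain Python, while the server layer only adapts it to MCP. A minimal illustrative sketch of that wrapper pattern (the function shown is hypothetical, not an actual module member):

```python
# Illustrative only: how a tool in databricks-mcp-server/tools/ might wrap the core library.
from databricks_mcp_core.unity_catalog import catalogs

def list_catalogs_tool() -> list[dict]:
    """Hypothetical MCP tool wrapper: delegates to databricks-mcp-core and
    returns JSON-serializable output for the MCP client."""
    return [{"name": c.name, "comment": c.comment} for c in catalogs.list_catalogs()]
```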
```bash
# Using uv (recommended)
uv venv
uv pip install -e databricks-mcp-core
uv pip install -e databricks-mcp-server

# Or using pip
pip install -e databricks-mcp-core
pip install -e databricks-mcp-server
```
```bash
# Configure authentication (use your Databricks profile)
export DATABRICKS_CONFIG_PROFILE=your-profile
# or set DATABRICKS_HOST and DATABRICKS_TOKEN
```
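Both packages resolve credentials through the official databricks-sdk. To sanity-check your setup before wiring up the server, you can run this quick check (requires only databricks-sdk):

```python
from databricks.sdk import WorkspaceClient

# Resolves DATABRICKS_CONFIG_PROFILE, DATABRICKS_HOST/DATABRICKS_TOKEN,
# or the DEFAULT profile in ~/.databrickscfg, in the SDK's usual order.
w = WorkspaceClient()
print(w.current_user.me().user_name)
```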
- Install packages (see above)
- Configure authentication (uses the `DEFAULT` profile from `~/.databrickscfg` if not set)
- Configure Claude Code MCP settings (see below); no need to manually start the server!
- Ask Claude to list your catalogs, create tables, run pipelines, etc.
```python
from databricks_mcp_core.unity_catalog import catalogs, schemas, tables
from databricks.sdk.service.catalog import ColumnInfo, ColumnTypeName, TableType

# List catalogs (returns List[CatalogInfo])
all_catalogs = catalogs.list_catalogs()
for catalog in all_catalogs:
    print(catalog.name)

# Create a table (returns TableInfo)
table = tables.create_table(
    catalog_name="main",
    schema_name="default",
    table_name="my_table",
    columns=[
        ColumnInfo(name="id", type_name=ColumnTypeName.INT),
        ColumnInfo(name="name", type_name=ColumnTypeName.STRING)
    ],
    table_type=TableType.MANAGED
)
```

All functions use the official databricks-sdk and handle authentication automatically.
Unity Catalog (11):
- Catalogs: list_catalogs, get_catalog
- Schemas: list_schemas, get_schema, create_schema, update_schema, delete_schema
- Tables: list_tables, get_table, create_table, delete_table
Compute (4):
- create_context, execute_command_with_context, destroy_context, databricks_command (lifecycle sketch after this list)
Spark Declarative Pipelines (15):
- Pipeline Management (9): create_pipeline, get_pipeline, update_pipeline_config, delete_pipeline, start_pipeline_update, validate_pipeline, get_pipeline_update_status, stop_pipeline, get_pipeline_events (workflow sketch after this list)
- Workspace Files (6): list_pipeline_files, get_pipeline_file_status, read_pipeline_file, write_pipeline_file, create_pipeline_directory, delete_pipeline_path
Synthetic Data Generation (3):
- get_synth_data_template, write_synth_data_script_to_workspace, generate_and_upload_synth_data
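The compute tools follow the Databricks command-execution lifecycle: create a context on a running cluster, execute commands in it, then destroy it. A hedged sketch of that flow (the module path, parameter names, and attribute names are assumptions about the wrapper API, not its actual signatures):

```python
from databricks_mcp_core.compute import contexts  # module path assumed

CLUSTER_ID = "0123-456789-abcdef"  # hypothetical cluster ID

# Create an execution context, run a command in it, then clean up.
ctx = contexts.create_context(cluster_id=CLUSTER_ID, language="python")
try:
    result = contexts.execute_command_with_context(
        cluster_id=CLUSTER_ID,
        context_id=ctx.context_id,
        command="print(spark.version)",
    )
    print(result)
finally:
    contexts.destroy_context(cluster_id=CLUSTER_ID, context_id=ctx.context_id)
```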
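Similarly, the pipeline tools chain into a deploy-and-monitor loop: create a pipeline, trigger an update, and poll its status. A hedged sketch under the same caveats (all names below are assumptions; check the databricks-mcp-core README for actual signatures):

```python
import time

from databricks_mcp_core.spark_declarative_pipelines import pipelines  # module path assumed

# Create a pipeline, trigger an update, then poll until a terminal state.
pipeline = pipelines.create_pipeline(name="my_pipeline")  # assumed signature
update = pipelines.start_pipeline_update(pipeline_id=pipeline.pipeline_id)
while True:
    status = pipelines.get_pipeline_update_status(
        pipeline_id=pipeline.pipeline_id,
        update_id=update.update_id,
    )
    if status.state in ("COMPLETED", "FAILED", "CANCELED"):  # assumed terminal states
        break
    time.sleep(10)
```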
Add to `.claude/.mcp.json`:

```json
{
  "mcpServers": {
    "ai-dev-kit": {
      "command": "uv",
      "args": ["run", "python", "-m", "databricks_mcp_server.stdio_server"],
      "transport": "stdio"
    }
  }
}
```

Note: The server runs automatically when Claude needs it; no manual startup required!
If not using uv, use:
```json
{
  "mcpServers": {
    "ai-dev-kit": {
      "command": "python",
      "args": ["-m", "databricks_mcp_server.stdio_server"],
      "transport": "stdio"
    }
  }
}
```

Then ask Claude to interact with your Databricks workspace!
- databricks-mcp-core README - Core package details
- databricks-mcp-server README - Server configuration
© 2025 Databricks, Inc. All rights reserved. The source in this project is provided subject to the Databricks License [https://databricks.com/db-license-source]. All included or referenced third party libraries are subject to the licenses set forth below.
| Package | License | Copyright |
|---|---|---|
| databricks-sdk | Apache License 2.0 | Copyright (c) Databricks, Inc. |
| pydantic | MIT License | Copyright (c) 2017 Samuel Colvin |
| fastapi | MIT License | Copyright (c) 2018 Sebastián Ramírez |
| uvicorn | BSD 3-Clause License | Copyright © 2017-present, Encode OSS Ltd. |
| sse-starlette | BSD 3-Clause License | Copyright (c) 2020, sysid |
| python-dotenv | BSD 3-Clause License | Copyright (c) 2014, Saurabh Kumar |