A suite of Model Context Protocol (MCP) servers for DevOps tools, enabling AI assistants and automation platforms to interact with modern infrastructure and deployment technologies.
This repository serves as the central hub and MCP registry for all MCP servers built to support a wide range of DevOps tools and workflows. Each MCP server provides a standardized interface for AI agents and automation platforms to interact with specific tools, services, or platforms, making it easier to integrate, automate, and extend DevOps operations across diverse environments.
- Kubernetes Package Management: The Helm MCP Server enables AI-driven Helm chart operations and applies best practices for managing Kubernetes workloads.
- CI/CD, Build & Release: Dedicated MCP servers (e.g., ArgoCD MCP Server, Argo Rollout MCP Server, Jenkins MCP Server) provide automation and orchestration for continuous integration, delivery, and deployment pipelines.
- Cloud Orchestration: Dedicated MCP servers such as the Terraform MCP Server provide comprehensive infrastructure-as-code management with secure command execution, semantic document search, and intelligent document ingestion for AWS, Azure, Google Cloud, and more.
- Observability & Monitoring: For monitoring and observability, specialized MCP servers will be available for Prometheus, the TICK stack (Telegraf, InfluxDB, Chronograf, Kapacitor), and other monitoring solutions.
The vision for TalkOps MCP Servers is to offer a modular, extensible, and unified platform where each DevOps domain—whether infrastructure as code, CI/CD, cloud orchestration, or observability—can be managed through a dedicated MCP server. This approach empowers AI agents and automation tools to deliver intelligent, context-aware DevOps workflows, regardless of the underlying technology stack.
Each table lists MCP servers by DevOps domain. Use Quick Install for the recommended setup (Docker preferred; CLI if no Docker), README for full documentation, and Config for MCP client configuration in that server's README.
| Server Name | Description | Quick Install | README | Config | Video |
|---|---|---|---|---|---|
| Helm MCP Server | Search charts, install/upgrade/rollback releases, validate manifests, monitor deployments. Full Helm lifecycle with dry-run and multi-cluster support. | `docker run -p 8765:8765 -v ~/.kube/config:/app/.kube/config:ro talkopsai/helm-mcp-server:latest` | README | Config | ▶ Watch |
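The Config link in each table points to that server's full client-configuration docs. As a minimal sketch, assuming the Helm server above is running locally and your MCP client reads a JSON config with an `mcpServers` map and an HTTP `url` field (check your client's docs for its exact schema), registering it might look like:

```shell
# Hypothetical client config -- file name and schema depend on your MCP client.
cat > mcp-config.json <<'EOF'
{
  "mcpServers": {
    "helm": {
      "url": "http://localhost:8765"
    }
  }
}
EOF
cat mcp-config.json
```

The port (`8765`) matches the `-p 8765:8765` mapping in the Quick Install command; adjust it if you remap the container port.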
| Server Name | Description | Quick Install | README | Config | Video |
|---|---|---|---|---|---|
| ArgoCD MCP Server | Manage ArgoCD applications, sync deployments, onboard repositories, create projects, debug with guided workflows. GitOps with credential isolation. | `docker run -p 8770:8770 -e ARGOCD_SERVER_URL=... -e ARGOCD_AUTH_TOKEN=... -e MCP_ALLOW_WRITE=true talkopsai/argocd-mcp-server:latest` | README | Config | ▶ Overview ▶ Demo |
| Argo Rollout MCP Server | Convert K8s Deployments to Argo Rollouts, orchestrate canary/blue-green deployments, promote/pause/abort, integrate AnalysisTemplates. Zero-YAML onboarding with built-in playbooks. | `docker run -p 8768:8768 -v ~/.kube:/app/.kube:ro -e K8S_KUBECONFIG=/app/.kube/config talkopsai/argo-rollout-mcp-server:latest` | README | Config | — |
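For servers that take credentials via environment variables, such as the ArgoCD server above, a Docker env file keeps secrets out of shell history. A sketch, using the variable names from the Quick Install column with placeholder values:

```shell
# Placeholder values -- substitute your real ArgoCD endpoint and token.
cat > argocd.env <<'EOF'
ARGOCD_SERVER_URL=https://argocd.example.com
ARGOCD_AUTH_TOKEN=REPLACE_ME
MCP_ALLOW_WRITE=true
EOF

# Then pass the file to docker instead of inline -e flags:
#   docker run -p 8770:8770 --env-file argocd.env talkopsai/argocd-mcp-server:latest
cat argocd.env
```

Setting `MCP_ALLOW_WRITE=true` enables mutating operations; omit it (or set it to `false`) for a read-only session.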
| Server Name | Description | Quick Install | README | Config | Video |
|---|---|---|---|---|---|
| Terraform MCP Server | Execute Terraform commands, semantic doc search via Neo4j, document ingestion. Multi-provider AI (OpenAI, Anthropic, Ollama). | `cd src/terraform-mcp-server && uv pip install -e . && uv run agents-mcp-server` | README | Config | — |
| Server Name | Description | Quick Install | README | Config | Video |
|---|---|---|---|---|---|
| Agents Central Registry | Discovery hub for Google A2A agents and MCP servers. Natural language queries, capability matching, real-time registry updates. | `cd src/agents-mcp-server && uv pip install -e . && uv run -m agents_mcp_server` | README | Config | — |
Contributions are welcome! Please open an issue or pull request on the project repository.
This project is licensed under the Apache-2.0 License.
- Open an issue on GitHub
- Join our Discord server
- See each server's README for documentation and guides