# Building AI Agents with Model Context Protocol (MCP) and Amazon Bedrock
This repository contains a set of end-to-end examples demonstrating how to build decoupled, agentic AI systems using:
- Amazon Bedrock (Anthropic Claude models)
- Model Context Protocol (MCP) for tool decoupling
- Strands framework for agent orchestration
- uv for fast Python environment + dependency management
- Streamlit for a real-world demo with Zerodha Kite MCP
It follows the concepts from the article “Building AI Agents with MCP and Amazon Bedrock: An Implementation Guide” and showcases:
- Baseline agents with Bedrock models
- Local tool integration
- Single MCP server connection
- Multi-MCP orchestration
- Custom MCP server (Calculator example)
- Streamlit Kite portfolio assistant
## Prerequisites

- Python 3.12+ installed on your system
- uv (fast Python package installer & runner)
## Setup

Install uv:

```bash
pip install uv
```

Verify:

```bash
uv --version
```

Clone the repo:

```bash
git clone https://github.com/<your-username>/AI-AGENTS-WITH-MCP.git
cd AI-AGENTS-WITH-MCP
```

Create a Python 3.12 virtual environment with uv:

```bash
uv venv --python 3.12
```

Activate the environment:

```bash
# Linux/macOS
source .venv/bin/activate

# Windows (PowerShell)
.venv\Scripts\activate
```

Install dependencies:

```bash
uv pip install -r requirements.txt
```

## Project Structure

```
AI-AGENTS-WITH-MCP/
├── README.md
├── requirements.txt
├── scenario1_single_server/   # Scenario 1: Baseline agent + local tool + doc MCP server
│   ├── baseline_agent.py
│   ├── agent_with_local_tool.py
│   └── agent_with_doc_mcp.py
├── scenario2_multi_server/    # Scenario 2: Multi-MCP orchestration
│   └── multi_server_agent.py
├── scenario3_custom_server/   # Scenario 3: Custom MCP server
│   ├── calculator_server.py
│   └── calculator_client.py
├── kite_streamlit_app/        # Real-world Streamlit + Kite MCP demo
│   └── streamlit_app.py
└── utils/                     # Shared helpers
    └── streamlit_helpers.py
```
## Scenario 1: Baseline Agent + Local Tool + Doc MCP Server

- Baseline agent with Claude (Bedrock)
- Add a local tool (e.g., Python execution)
- Connect to the AWS Documentation MCP server (`awslabs.aws-documentation-mcp-server`)
Run:
```bash
python scenario1_single_server/agent_with_doc_mcp.py
```

## Scenario 2: Multi-MCP Orchestration

Agent with access to AWS Documentation + AWS Pricing MCP servers. Use case: generate a SageMaker fine-tuning research report.
Run:
```bash
python scenario2_multi_server/multi_server_agent.py
```

## Scenario 3: Custom MCP Server (Calculator)

Example of building your own MCP server (Calculator):
Start server:
```bash
python scenario3_custom_server/calculator_server.py
```

Run client:
```bash
python scenario3_custom_server/calculator_client.py
```

## Real-World Demo: Streamlit + Kite Portfolio Assistant

Interactive app integrating with Kite MCP and OpenAI GPT-4o.
Run:
```bash
streamlit run kite_streamlit_app/streamlit_app.py
```

Enter your OpenAI API key in the sidebar, log in to Zerodha Kite when prompted, and start asking portfolio questions.
## How MCP Servers Are Launched

MCP servers are launched using uvx. Example:

```bash
uvx --from awslabs.aws-documentation-mcp-server@latest awslabs.aws-documentation-mcp-server.exe
```

Your agent code uses this under the hood.
This repo accompanies the Medium article: ➡️ “Building AI Agents with MCP and Amazon Bedrock: An Implementation Guide”
Follow-up post on LinkedIn coming soon 🚀
## License

MIT License. Use freely with attribution.
