A simple AI agent that uses the Model Context Protocol (MCP) to connect LLMs with external tools. This example uses Firecrawl for web scraping capabilities.
MCP is a protocol that lets AI models interact with external tools and services. Instead of hardcoding tool integrations, MCP provides a standard way for agents to discover and use tools dynamically.
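Concretely, MCP runs over JSON-RPC 2.0: an agent discovers what a server offers by sending a `tools/list` request, and each tool comes back with a name, description, and a JSON Schema for its inputs. A minimal sketch of that exchange (the mock response and the `firecrawl_scrape` tool shape are illustrative, not the Firecrawl server's exact output):

```python
# An agent discovers tools with a "tools/list" JSON-RPC request.
list_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/list",
}

# A mock server response advertising one Firecrawl-style scraping tool.
mock_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "firecrawl_scrape",
                "description": "Scrape a URL and return its content as markdown.",
                "inputSchema": {
                    "type": "object",
                    "properties": {"url": {"type": "string"}},
                    "required": ["url"],
                },
            }
        ]
    },
}

# The agent now knows which tools exist and how to call them,
# without any hardcoded integration.
tools = mock_response["result"]["tools"]
```

Because the schema travels with the tool, the LLM can be handed the discovered tools at runtime rather than at build time.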
- Clone the repo:

  ```bash
  git clone https://github.com/yourusername/mcp-agent.git
  cd mcp-agent
  ```

- Install dependencies:

  ```bash
  uv sync
  ```

- Set up environment variables:

  ```bash
  cp .env.example .env
  ```

  Add your API keys to `.env`:

  ```
  OPENAI_API_KEY=your_openai_key
  FIRECRAWL_API_KEY=your_firecrawl_key
  ```

- Run the agent:

  ```bash
  uv run main.py
  ```
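Once running, the agent and the Firecrawl MCP server talk JSON-RPC 2.0. A sketch of what a single tool invocation looks like on the wire (the tool name and arguments are illustrative, not the server's exact schema):

```python
import json

# After discovery, the agent invokes a tool with a "tools/call" request.
call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "firecrawl_scrape",
        "arguments": {"url": "https://example.com"},
    },
}

# Requests are serialized as JSON and sent over the transport
# (stdio or HTTP, depending on how the server is launched).
wire = json.dumps(call_request)
```

The response carries the tool's output back to the agent, which feeds it to the LLM as the tool-call result.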
You can use any OpenAI-compatible endpoint by modifying `main.py`:

```python
import os

from langchain_openai import ChatOpenAI

# OpenAI (default)
model = ChatOpenAI(model="gpt-4.1-mini", temperature=0)

# Ollama (local)
model = ChatOpenAI(
    model="llama3",
    base_url="http://localhost:11434/v1",
    api_key="not-needed",
)

# Groq
model = ChatOpenAI(
    model="llama-3.1-70b",
    base_url="https://api.groq.com/openai/v1",
    api_key=os.getenv("GROQ_API_KEY"),
)

# NVIDIA NIM
model = ChatOpenAI(
    model="nvidia/nemotron-3-nano-30b-a3b",
    base_url="https://integrate.api.nvidia.com/v1",
    api_key=os.getenv("NVIDIA_API_KEY"),
)
```

```
mcp-agent/
├── main.py           # Main agent code
├── pyproject.toml    # Dependencies
├── .env              # API keys (not committed)
└── README.md
```
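If you switch providers often, the model examples above can be driven by an environment variable instead of editing `main.py` each time. A minimal sketch; the `MODEL_PROVIDER` variable and the `model_kwargs` helper are hypothetical, not part of this repo:

```python
import os

# Hypothetical helper: map a provider name to ChatOpenAI keyword
# arguments. Model IDs and base URLs mirror the examples above.
PROVIDERS = {
    "openai": {"model": "gpt-4.1-mini", "temperature": 0},
    "ollama": {
        "model": "llama3",
        "base_url": "http://localhost:11434/v1",
        "api_key": "not-needed",
    },
    "groq": {
        "model": "llama-3.1-70b",
        "base_url": "https://api.groq.com/openai/v1",
        "api_key": os.getenv("GROQ_API_KEY", ""),
    },
    "nvidia": {
        "model": "nvidia/nemotron-3-nano-30b-a3b",
        "base_url": "https://integrate.api.nvidia.com/v1",
        "api_key": os.getenv("NVIDIA_API_KEY", ""),
    },
}

def model_kwargs(provider=None):
    """Pick ChatOpenAI kwargs from MODEL_PROVIDER (defaults to openai)."""
    name = provider or os.getenv("MODEL_PROVIDER", "openai")
    return PROVIDERS[name]

# Usage: model = ChatOpenAI(**model_kwargs())
```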
MIT