Experimental playground for building LLM-powered multi-agent systems with LangChain. Each script demonstrates a different agentic architecture pattern — from simple hierarchical delegation to concurrent fan-out with real external tools.
`scripts/square_agent.py` illustrates a minimal hierarchical multi-agent architecture in which a central “orchestrator” agent coordinates specialised sub-agents to solve user requests. When a user submits a question, the main agent interprets it and determines which specialist agent is best suited to handle the task—for instance, routing a square-calculation request to a squaring agent or a root calculation to a square-root agent. The selected specialist then executes its own internal decision loop, using its dedicated tool to perform the computation and generate a response, which is returned to the orchestrator and passed back to the user. This design highlights key principles of scalable agent systems: each agent is narrowly specialised and only exposed to the tools it requires, the orchestrator focuses solely on delegation rather than task execution, and new capabilities can be added simply by introducing additional specialist agents without modifying the core orchestration logic.
`scripts/wedding_planner.py` demonstrates a parallel “fan-out” agentic architecture in which three specialised AI agents independently handle different components of a wedding-planning task at the same time. A travel agent accesses live flight data via external APIs, a venue agent performs web research, and a playlist agent queries a database for music selections. These agents run concurrently rather than sequentially, allowing faster and more efficient task completion. Each agent is given a defined role through its system prompt, which constrains its behaviour and expertise, and each is equipped with external tools that extend its capabilities beyond what a language model can do on its own. The system also illustrates how the Model Context Protocol (MCP) standardises communication between agents and external services, and how agents can be designed to self-correct—for example, retrying failed database queries—within an iterative decision loop. Together, this provides a clear demonstration of how modern agentic AI systems coordinate specialised components to solve multi-step, real-world problems efficiently.
```
.
├── src/
│   ├── __init__.py
│   └── sub_agent.py          # Reusable SubAgent wrapper (invoke, ainvoke, as_tool)
├── scripts/
│   ├── square_agent.py       # Hierarchical agent pattern (orchestrator → specialists)
│   ├── wedding_planner.py    # Parallel fan-out pattern (3 concurrent specialists)
│   └── chinook_explorer.py   # Standalone helper to browse the Chinook sample DB
├── resources/
│   └── Chinook.db            # SQLite sample database (music store: artists, albums, tracks)
├── tests/
├── pyproject.toml
└── CLAUDE.md
```
A parent orchestrator agent decides which specialist sub-agent to call based on the user's question. Sub-agents are exposed to the parent via SubAgent.as_tool().
```
User → main_agent (orchestrator)
         ├── sqrt_agent → square_root tool
         └── sq_agent   → square tool
```
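A minimal sketch of that wiring, assuming the `SubAgent` wrapper takes a name, system prompt, and tool list (the exact constructor and import path in `src/sub_agent.py` may differ):

```python
# Sketch of scripts/square_agent.py's structure; constructor arguments and the
# import path for SubAgent are assumptions, not the script's literal code.
from langchain_core.tools import tool

from src.sub_agent import SubAgent


@tool
def square(x: float) -> float:
    """Return x squared."""
    return x * x


@tool
def square_root(x: float) -> float:
    """Return the square root of x."""
    return x ** 0.5


sq_agent = SubAgent(
    name="sq_agent",
    system_prompt="You square numbers using the square tool.",
    tools=[square],
)
sqrt_agent = SubAgent(
    name="sqrt_agent",
    system_prompt="You compute square roots using the square_root tool.",
    tools=[square_root],
)

# The orchestrator never touches square/square_root directly; it only sees the
# two specialists, each wrapped as a tool.
main_agent = SubAgent(
    name="main_agent",
    system_prompt="Delegate each question to the most suitable specialist tool.",
    tools=[sq_agent.as_tool(), sqrt_agent.as_tool()],
)

print(main_agent.invoke("What is the square root of 1764?"))
```

A third specialist (say, a cubing agent) would follow the same shape and simply be appended to `main_agent`'s tool list, without changing the delegation logic.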
Three independent specialist agents run concurrently via asyncio.gather(). Each has its own toolset — no shared state, no inter-agent communication.
```
main()
  ├── travel_agent   → Kiwi flight search (MCP)
  ├── venue_agent    → Tavily web search
  └── playlist_agent → SQL queries against Chinook.db (LangChain SQLDatabase)
```
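A minimal sketch of the fan-out itself, with the real toolsets (Kiwi MCP tools, Tavily search, the Chinook SQL connection) omitted and the `SubAgent` constructor assumed:

```python
import asyncio

from src.sub_agent import SubAgent  # import path assumed

# In scripts/wedding_planner.py each specialist gets its real toolset
# (Kiwi MCP tools, Tavily search, a Chinook SQL toolkit); they are left
# empty here to keep the sketch short.
travel_agent = SubAgent(name="travel_agent", system_prompt="You find flights.", tools=[])
venue_agent = SubAgent(name="venue_agent", system_prompt="You research venues.", tools=[])
playlist_agent = SubAgent(name="playlist_agent", system_prompt="You build playlists.", tools=[])


async def main() -> None:
    request = "a June wedding in Lisbon for 80 guests"  # example prompt only
    # Fan out: the three specialists run concurrently and share no state.
    flights, venues, playlist = await asyncio.gather(
        travel_agent.ainvoke(f"Find flights for {request}"),
        venue_agent.ainvoke(f"Suggest venues for {request}"),
        playlist_agent.ainvoke(f"Build a playlist for {request}"),
    )
    print(flights, venues, playlist, sep="\n\n")


if __name__ == "__main__":
    asyncio.run(main())
```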
Wraps a LangChain agent with a persona (system prompt) and tools into a reusable building block. Key methods:
| Method | Purpose |
|---|---|
| `invoke(prompt)` | Synchronous execution — runs the agentic loop and returns the final answer |
| `ainvoke(prompt)` | Async version for concurrent execution with `asyncio.gather()` |
| `as_tool()` | Wraps the sub-agent as a LangChain `Tool` so a parent agent can call it |
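For orientation, the wrapper's shape is roughly as follows. This is a sketch only, assuming LangGraph's prebuilt `create_react_agent` and a default OpenAI chat model; `src/sub_agent.py` is the source of truth and may be built differently:

```python
from typing import Sequence

from langchain_core.tools import BaseTool, Tool
from langchain_openai import ChatOpenAI
from langgraph.prebuilt import create_react_agent


class SubAgent:
    """Sketch of the wrapper's interface (see src/sub_agent.py for the real one)."""

    def __init__(self, name: str, system_prompt: str, tools: Sequence[BaseTool], model=None):
        self.name = name
        self.system_prompt = system_prompt
        # Default model choice is an assumption; the real wrapper may differ.
        self._agent = create_react_agent(model or ChatOpenAI(model="gpt-4o-mini"), list(tools))

    def invoke(self, prompt: str) -> str:
        """Run the agentic loop synchronously and return the final answer."""
        state = self._agent.invoke(
            {"messages": [("system", self.system_prompt), ("user", prompt)]}
        )
        return state["messages"][-1].content

    async def ainvoke(self, prompt: str) -> str:
        """Async variant, suitable for fan-out with asyncio.gather()."""
        state = await self._agent.ainvoke(
            {"messages": [("system", self.system_prompt), ("user", prompt)]}
        )
        return state["messages"][-1].content

    def as_tool(self) -> Tool:
        """Expose this sub-agent as a LangChain Tool a parent agent can call."""
        return Tool(name=self.name, description=self.system_prompt, func=self.invoke)
```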
- Python 3.10+
- API keys for: OpenAI, Tavily, and optionally Anthropic, Google, LangSmith
```
python -m venv .venv
source .venv/bin/activate
pip install -e .
```

Some dependencies not listed in `pyproject.toml` are required by individual scripts:

```
pip install tavily-python langchain-community langchain-mcp-adapters
```

Create a `.env` file in the project root:

```
OPENAI_API_KEY=sk-...
TAVILY_API_KEY=tvly-...
# Optional
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_API_KEY=...
LANGSMITH_API_KEY=...
```

```
# Hierarchical agent demo (sync, no external APIs beyond OpenAI)
python scripts/square_agent.py
# Wedding planner demo (async, requires OpenAI + Tavily + Kiwi MCP)
python scripts/wedding_planner.py
# Browse the Chinook database interactively (no API keys needed)
python scripts/chinook_explorer.py
```

`resources/Chinook.db` is a sample SQLite database representing a digital music store. It is used by the wedding planner's playlist agent to build playlists via SQL.
| Table | Rows | Description |
|---|---|---|
| Artist | 275 | Band / performer names |
| Album | 347 | Albums linked to artists |
| Track | 3503 | Individual songs with duration, price, composer |
| Genre | 25 | Rock, Jazz, Latin, Metal, etc. |
| Playlist | 18 | Named playlists |
| PlaylistTrack | 8715 | Many-to-many mapping of playlists ↔ tracks |
| Customer | 59 | Customers with contact and billing info |
| Invoice | 412 | Purchase records |
| InvoiceLine | 2240 | Line items per invoice |
| Employee | 8 | Store staff |
| MediaType | 5 | Audio formats (MPEG, AAC, etc.) |
scripts/chinook_explorer.py provides a ChinookExplorer class for quick schema inspection, sample data, and canned queries (top artists, genres, customers) — useful for understanding what the playlist agent has to work with.
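As a quick taste of what that data supports, here is a sketch using `langchain_community`'s `SQLDatabase` (the same wrapper the playlist agent is given); the query itself is only an illustration, since the agent writes its own SQL at runtime:

```python
from langchain_community.utilities import SQLDatabase

# Point at the bundled sample database (path relative to the project root).
db = SQLDatabase.from_uri("sqlite:///resources/Chinook.db")

print(db.get_usable_table_names())  # Artist, Album, Track, Genre, ...

# The kind of query the playlist agent might issue: the ten longest Rock
# tracks, joined back to their artists.
print(db.run(
    """
    SELECT Artist.Name, Track.Name, Track.Milliseconds
    FROM Track
    JOIN Album  ON Track.AlbumId  = Album.AlbumId
    JOIN Artist ON Album.ArtistId = Artist.ArtistId
    JOIN Genre  ON Track.GenreId  = Genre.GenreId
    WHERE Genre.Name = 'Rock'
    ORDER BY Track.Milliseconds DESC
    LIMIT 10;
    """
))
```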
- Define tools as `@tool`-decorated functions.
- Create a `SubAgent` with a descriptive name, system prompt, and tools.
- Either call it directly (`invoke` / `ainvoke`) or expose it to a parent agent via `as_tool()`.
- Add a new script under `scripts/` (see the sketch below for these steps end to end).
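Those steps end to end, as the body of a new script under `scripts/`, sketched with a hypothetical cube specialist (the `cube` tool and the `SubAgent` constructor arguments are illustrative assumptions):

```python
from langchain_core.tools import tool

from src.sub_agent import SubAgent  # import path assumed


@tool
def cube(x: float) -> float:
    """Return x cubed."""
    return x ** 3


cube_agent = SubAgent(
    name="cube_agent",
    system_prompt="You cube numbers using the cube tool.",
    tools=[cube],
)

print(cube_agent.invoke("What is 7 cubed?"))  # call it directly
parent_tool = cube_agent.as_tool()            # or hand it to an orchestrator as a tool
```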