Ever found yourself in a situation where your coding angel (or agent) stares blankly at you when you ask about the latest AI library? 🤔 That's because it's still working off training data from 2023!
This MCP (Model Context Protocol) tool is your secret weapon against outdated knowledge. It enables semantic search across the documentation of multiple AI libraries, keeping your coding companion up to date with the latest tech. No more "I'm sorry, I don't have information about that" moments!
Simply configure your favorite libraries in the config file, and let your coding angel do the heavy lifting of finding the exact information you need from the official docs. It's like giving your AI assistant a direct line to the source of truth! 🚀
Special thanks to Alejandro AO for his wonderful tutorial on creating MCP servers. This project was inspired by his work and uses his implementation patterns.
- 🔍 Search across the documentation of multiple AI libraries:
  - LangChain - A framework for developing applications powered by language models
  - LangGraph - A library for building complex AI workflows
  - CrewAI - A framework for orchestrating role-playing, autonomous AI agents
  - LlamaIndex - A data framework for LLM applications
  - OpenAI - Official documentation for OpenAI's API and models
- ⚡ Fast and efficient search using the Serper API
- 🎯 Accurate results with semantic search capabilities
- 🔄 Real-time documentation fetching
- 🛠️ Easy integration with MCP-based applications
- ⚙️ Easy configuration for adding new documentation sources
- Python 3.12 or higher
- Serper API key (for web search functionality)
- MCP SDK 1.2.0 or higher
- Clone the repository:

  ```bash
  git clone https://github.com/mostafa-ghaith/search-docs-mcp.git
  cd search-docs-mcp
  ```

- Create and activate a virtual environment:

  ```bash
  python -m venv .venv
  source .venv/bin/activate  # On Windows: .venv\Scripts\activate
  ```

- Install dependencies:

  ```bash
  pip install -e .
  ```

- Create a `.env` file in the project root and add your Serper API key:

  ```
  SERPER_API_KEY=your_api_key_here
  ```
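If you want to confirm the key is picked up, a quick check along these lines can help. It assumes `python-dotenv` is available; the server itself may load the `.env` file differently:

```python
# Minimal sketch to verify the Serper key is readable from .env.
# Assumes python-dotenv is installed; the project may load the key another way.
import os

from dotenv import load_dotenv

load_dotenv()  # reads .env from the current working directory

if not os.getenv("SERPER_API_KEY"):
    raise SystemExit("SERPER_API_KEY is not set - check your .env file")
print("Serper API key found.")
```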
The tool uses a configuration file (`config.py`) to manage documentation sources. You can easily add new documentation sources by editing this file:

```python
DOCS_CONFIG = {
    "new_library": {
        "url": "docs.new-library.com",
        "description": "Description of the new library"
    }
}
```
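For instance, a hypothetical entry for the Hugging Face Transformers docs might look like the snippet below; the key you choose (here `transformers`) is the value you later pass as the `library` argument, and the URL is only an illustration:

```python
# Hypothetical example entry - adjust the key, URL, and description as needed.
DOCS_CONFIG = {
    # ...existing entries...
    "transformers": {
        "url": "huggingface.co/docs/transformers",
        "description": "Hugging Face Transformers documentation"
    }
}
```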
The tool can be used as part of an MCP-based application. Here's an example of how to use it:

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("docs")

# The tool will be available as part of your MCP application
# You can search documentation like this:
result = await mcp.get_docs(query="Chroma DB", library="langchain")
```
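Under the hood, a tool like this typically restricts a Serper web search to the configured documentation domain and returns the text of the top hits. The sketch below illustrates that pattern only; it is not the project's actual implementation, and `httpx`, the result limit, and the example LangChain URL are assumptions:

```python
# Illustrative sketch, not the project's actual code.
# Pattern: site-restricted Serper search over the configured docs domain,
# then fetch the text of the top result pages.
import os

import httpx  # assumed HTTP client; the project may use a different one

DOCS_CONFIG = {
    "langchain": {"url": "python.langchain.com/docs", "description": "LangChain docs"},
}

SERPER_ENDPOINT = "https://google.serper.dev/search"

async def search_docs(query: str, library: str) -> str:
    site = DOCS_CONFIG[library]["url"]
    async with httpx.AsyncClient(timeout=30) as client:
        # Serper takes the API key in the X-API-KEY header and the query in a JSON body
        resp = await client.post(
            SERPER_ENDPOINT,
            headers={"X-API-KEY": os.environ["SERPER_API_KEY"]},
            json={"q": f"site:{site} {query}", "num": 3},
        )
        resp.raise_for_status()
        hits = resp.json().get("organic", [])

        # Concatenate the raw text of each result page for the model to read
        pages = [(await client.get(hit["link"])).text for hit in hits]
        return "\n\n".join(pages)
```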
To connect the server to an MCP client:

- For Claude Desktop:
  - Edit `~/Library/Application Support/Claude/claude_desktop_config.json`:

    ```json
    {
      "mcpServers": {
        "search-docs-mcp": {
          "command": "uv",
          "args": [
            "--directory",
            "/ABSOLUTE/PATH/TO/YOUR/search-docs-mcp",
            "run",
            "main.py"
          ]
        }
      }
    }
    ```

- For Cursor:
  - Navigate to Cursor Settings
  - Open the MCP tab
  - Click on "Add new global MCP server"
  - Add the server configuration similar to Claude Desktop
- Restart the application to apply changes
The `get_docs` tool searches documentation for a specific query in a given library.
Parameters:
- `query` (str): The search query (e.g., "Chroma DB")
- `library` (str): The library to search in (see `config.py` for supported libraries)
Returns:
- Text content from the relevant documentation pages
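As a rough end-to-end illustration, the tool could be called from the generic MCP Python client over stdio. This is a sketch that assumes the server entry point is `main.py` (as in the config above) and that the tool is registered under the name `get_docs`:

```python
# Sketch: exercising the server with the MCP Python client over stdio.
# Assumes `main.py` starts the server and the tool is named "get_docs".
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    server = StdioServerParameters(command="python", args=["main.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            result = await session.call_tool(
                "get_docs", arguments={"query": "Chroma DB", "library": "langchain"}
            )
            print(result.content)  # text content from the matching docs pages

asyncio.run(main())
```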
Contributions are welcome! Please feel free to submit a Pull Request. When adding new documentation sources, please update the `config.py` file with the appropriate URL and description.
This project is licensed under the MIT License - see the LICENSE file for details.
- Alejandro AO for the MCP server tutorial and implementation patterns
- MCP for the framework
- Serper for the search API
- All the documentation providers for their valuable content