An automated analytics reporting tool that leverages the Model Context Protocol (MCP) with Ollama, Cloudflare Workers, or Google Gemini to generate intelligent reports from Umami analytics data.
Blog post: https://www.rhelmer.org/blog/ai-powered-analytics-reports-using-mcp/
This project combines several powerful tools to create automated analytics reports:
- MCP Client: Direct use of the Model Context Protocol for orchestrating AI interactions
- Ollama/Cloudflare Workers/Google Gemini: LLM inference backends
- Umami MCP Server: Connects to your Umami analytics instance to fetch website data (included in this repo)
- Automated Reporting: Generates comprehensive analytics reports using AI
- 🤖 AI-Powered Analysis: Uses large language models to analyze website analytics data
- 📊 Comprehensive Reports: Generates detailed insights from your Umami analytics
- 🔄 Flexible Backends: Choose between local Ollama, Cloudflare Workers, or Google Gemini
- 💬 Interactive Mode: Chat interface for exploring your analytics data
- 🚀 Easy Setup: Simple installation and configuration process
- 📦 Zero-dependency MCP Server: The included Umami MCP server uses only the Python standard library
- 📈 UTM Tracking: Analyze campaign performance with UTM parameter support
- Python 3.8+
- Access to an Umami analytics instance (cloud or self-hosted)
- An AI provider:
  - Ollama installed locally with a Llama model, OR
  - A Cloudflare Workers account with AI access, OR
  - A Google AI Studio API key for Gemini
1. Clone the repository

   ```bash
   git clone <your-repo-url>
   cd <your-repo-name>
   ```

2. Install dependencies

   ```bash
   pip install uv
   ```

3. Set up environment variables

   ```bash
   cp .env.example .env
   ```

   Edit `.env` with your configuration:

   ```env
   # Umami Configuration (Self-hosted with Team)
   UMAMI_URL=https://your-umami-instance.com
   UMAMI_USERNAME=your_username
   UMAMI_PASSWORD=your_password
   UMAMI_TEAM_ID=your-team-id

   # Cloudflare AI Configuration
   CLOUDFLARE_ACCOUNT_ID=your-cloudflare-account-id
   CLOUDFLARE_API_TOKEN=your-cloudflare-api-token

   # Gemini API Configuration
   GEMINI_API_KEY=your-gemini-api-key
   ```
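The `.env` format above is plain `KEY=value` lines. The project itself reads it with `python-dotenv` (listed in the dependencies), but the format is simple enough to illustrate with a stdlib-only sketch:

```python
def parse_env(text):
    """Minimal illustration of the KEY=value .env format.
    (The project uses python-dotenv; this is just to show the format.)"""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        # Skip blank lines and comments; split on the first "=" only.
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip()
    return env
```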
This project includes a zero-dependency Umami MCP server in the `umami_mcp_server/` directory.
For Umami Cloud:
- Go to Settings → API Keys in your Umami Cloud dashboard
- Create an API key
- Add to your `.env` file:

  ```env
  UMAMI_URL=https://api.umami.is
  UMAMI_API_KEY=your_api_key_here
  ```
For Self-hosted Umami:
- Use your login credentials
- Add to your `.env` file:

  ```env
  UMAMI_URL=https://your-umami-instance.com
  UMAMI_USERNAME=admin
  UMAMI_PASSWORD=your_password
  UMAMI_TEAM_ID=your-team-id  # Optional: for team-based access
  ```
The server auto-detects which mode to use based on which environment variables are set.
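That auto-detection boils down to checking which credentials are present. A sketch of the idea (illustrative only, not the server's actual code):

```python
import os

def detect_auth_mode(env=None):
    """Pick the Umami auth mode from whichever credentials are set.
    Illustrative sketch, not the server's actual implementation."""
    env = os.environ if env is None else env
    if env.get("UMAMI_API_KEY"):
        return "api_key"       # Umami Cloud
    if env.get("UMAMI_USERNAME") and env.get("UMAMI_PASSWORD"):
        return "credentials"   # self-hosted login
    raise RuntimeError(
        "Set UMAMI_API_KEY or UMAMI_USERNAME/UMAMI_PASSWORD in .env")
```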
```bash
# Install Ollama
## Linux
curl -fsSL https://ollama.ai/install.sh | sh
## macOS
brew install ollama

# Start Ollama service
ollama serve

# Pull Llama model
ollama pull llama3.2
```

- Sign up for Cloudflare Workers
- Enable AI features in your account
- Get your API token and account ID
- Add credentials to your `.env` file
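With the account ID and API token in place, requests go to Cloudflare's Workers AI REST endpoint. A minimal stdlib-only sketch of building such a request (the model name is an example, not necessarily the one this project uses):

```python
import json
import urllib.request

def build_workers_ai_request(account_id, api_token, prompt,
                             model="@cf/meta/llama-3.1-8b-instruct"):
    """Build a POST to Cloudflare's Workers AI run endpoint.
    Illustrative sketch; the model name is an example."""
    url = (f"https://api.cloudflare.com/client/v4/accounts/"
           f"{account_id}/ai/run/{model}")
    return urllib.request.Request(
        url,
        data=json.dumps({"prompt": prompt}).encode(),
        headers={"Authorization": f"Bearer {api_token}",
                 "Content-Type": "application/json"},
    )
```

Sending the request (e.g. with `urllib.request.urlopen`) requires valid credentials from your Cloudflare dashboard.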
- Go to Google AI Studio.
- Create an API key.
- Add the `GEMINI_API_KEY` to your `.env` file.
- Ensure you have Node.js and `npx` installed to use the `gemini-cli` provider. The script will fall back to the REST API if the CLI is not available.
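That CLI-first, REST-fallback behavior can be sketched like this (function names and the CLI flag are illustrative, not the script's actual code; the REST call is stubbed):

```python
import shutil
import subprocess

def rest_fallback(prompt):
    # Stub: real code would POST to the Gemini REST API using GEMINI_API_KEY.
    return f"[REST fallback] {prompt}"

def generate_report(prompt, cli="gemini"):
    """Use the gemini CLI when it is on PATH, otherwise fall back to REST.
    Illustrative sketch of the strategy described above."""
    if shutil.which(cli):
        result = subprocess.run([cli, "-p", prompt],
                                capture_output=True, text=True)
        if result.returncode == 0:
            return result.stdout
    return rest_fallback(prompt)
```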
You can specify the AI provider using the `--ai-provider` flag. Supported providers are `cloudflare`, `ollama`, and `gemini-cli`.
Start an interactive session to explore your analytics data:
```bash
# Using Gemini (recommended)
uv run --with-requirements requirements.txt run.py --start-date 2025-01-01 --end-date 2025-12-31 --website example.com --chat --ai-provider gemini-cli

# Using Ollama
uv run --with-requirements requirements.txt run.py --start-date 2025-01-01 --end-date 2025-12-31 --website example.com --chat --ai-provider ollama

# Using Cloudflare
uv run --with-requirements requirements.txt run.py --start-date 2025-01-01 --end-date 2025-12-31 --website example.com --chat --ai-provider cloudflare
```

Example interactions:
- "Show me a summary of last month's traffic"
- "What are my top pages this week?"
- "Generate a comprehensive monthly report"
- "Compare this month's performance to last month"
- "Which UTM campaigns drove the most traffic?"
- "What are my top referrers from social media?"
Generate specific reports directly:
```bash
# Custom date range with Gemini
uv run --with-requirements requirements.txt run.py --start-date 2025-01-01 --end-date 2025-12-31 --website example.com --ai-provider gemini-cli

# Using Ollama
uv run --with-requirements requirements.txt run.py --start-date 2025-01-01 --end-date 2025-12-31 --website example.com --ai-provider ollama
```

Set up automated report generation using cron:

```bash
# Add to crontab for weekly reports every Monday at 9 AM
0 9 * * 1 cd /path/to/project && uv run --with-requirements requirements.txt run.py --start-date 2025-01-01 --end-date 2025-12-31 --website example.com --ai-provider gemini-cli
```

- Traffic Summary: Page views, unique visitors, sessions, bounce rate
- Top Content: Most popular pages and referrers
- Geographic Analysis: Visitor locations and demographics
- Device & Browser: Technology usage patterns
- UTM Campaign Tracking: Performance by source, medium, campaign, content, and term
- Performance Trends: Growth metrics and comparisons
- Custom Insights: AI-generated observations and recommendations
Connection Errors
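For self-hosted instances, a quick stdlib-only way to verify credentials is to POST them to the login endpoint (assuming the standard `/api/auth/login` route, which returns a token on success):

```python
import json
import urllib.request

def check_umami_login(base_url, username, password):
    """Illustrative connectivity check: POST credentials to Umami's
    /api/auth/login endpoint and report whether a token came back."""
    req = urllib.request.Request(
        f"{base_url}/api/auth/login",
        data=json.dumps({"username": username, "password": password}).encode(),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return "token" in json.load(resp)
    except Exception as exc:
        print(f"Connection failed: {exc}")
        return False
```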
```bash
# Check Umami API connectivity
curl -u user:pass https://your-umami-instance.com/api/auth/login
```

Ollama Issues

```bash
# Verify Ollama is running
ollama list
ollama ps
```

Missing Website Data
- Ensure the user has access to the website in Umami admin
- For team-based access, set `UMAMI_TEAM_ID` in your `.env` file
- Check that the website domain matches exactly (e.g., `example.com` not `www.example.com`)
```
├── run.py               # Main application entry point
├── requirements.txt     # Python dependencies
├── pyproject.toml       # MCP server package config
├── umami_mcp_server/    # Zero-dependency Umami MCP server
│   ├── __init__.py
│   ├── __main__.py
│   ├── server.py        # MCP server implementation
│   ├── umami_client.py  # Umami API client
│   └── README.md        # MCP server documentation
└── .env.example         # Environment variables template
```
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
- `mcp`: Model Context Protocol SDK
- `python-dotenv`: Environment variable management
- `aiohttp`: Async HTTP client for AI providers
For issues and questions:
- Create an issue in this repository
- Review MCP Python SDK documentation
- Check Umami API documentation
- Umami for the analytics platform
- Ollama for local LLM inference
- Cloudflare for cloud AI services
- Google Gemini for the Gemini API
- Model Context Protocol for the MCP specification