Open-source PowerPoint generation engine for AI agents.
Give it a prompt, optional source files, and an LLM key — get back a validated .pptx.
```bash
# 1. Install
npm install && python -m pip install -r requirements.txt

# 2. Configure (writes .env with your LLM key)
./auto-ppt init

# 3. Generate — mock mode, no LLM key needed
./auto-ppt generate --mock \
  --prompt "Create an 8-slide AI strategy deck for executives" \
  --source examples/inputs/sample-source-brief.md

# 4. Generate — real model, after init
./auto-ppt generate \
  --prompt "Create an 8-slide AI strategy deck for executives" \
  --source examples/inputs/sample-source-brief.md

# 5. Revise
./auto-ppt revise \
  --deck output/py-generated-deck.json \
  --prompt "Compress to 6 slides, make it more conclusion-driven"
```

Output: `output/py-generated-deck.json` + `.pptx`, `output/py-revised-deck.json` + `.pptx`.
| Capability | Detail |
|---|---|
| Prompt-to-deck planning | Natural-language prompt → structured slide deck |
| Natural-language revision | Iterate on an existing deck with free-text instructions |
| Source ingestion | .txt .md .csv .json .yaml .xml .html .pdf .docx, images, URLs |
| Schema validation | JSON-schema check before every render |
| LLM providers | OpenAI, OpenRouter, Claude, Gemini, Qwen, DeepSeek, GLM, MiniMax, any OpenAI-compatible endpoint |
| PPTX rendering | JS renderer (pptxgenjs) or Python renderer (python-pptx with brand templates) |
| Security | Path traversal protection, SSRF blocking, file size limits, subprocess timeout |
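The schema-validation step (a JSON-schema check before every render) can be illustrated with a minimal stand-in. The field names below (`title`, `slides`, `bullets`) are assumptions for illustration, not the engine's actual deck contract:

```python
def validate_deck(deck: dict) -> list[str]:
    """Return a list of validation errors for a deck dict (empty list = valid).

    Minimal stand-in for the project's JSON-schema check; field names
    are illustrative only.
    """
    errors = []
    if not isinstance(deck.get("title"), str):
        errors.append("deck.title must be a string")
    slides = deck.get("slides")
    if not isinstance(slides, list) or not slides:
        errors.append("deck.slides must be a non-empty list")
        return errors
    for i, slide in enumerate(slides):
        if not isinstance(slide.get("title"), str):
            errors.append(f"slides[{i}].title must be a string")
        if not isinstance(slide.get("bullets"), list):
            errors.append(f"slides[{i}].bullets must be a list")
    return errors

deck = {"title": "AI Strategy", "slides": [{"title": "Why now", "bullets": ["..."]}]}
assert validate_deck(deck) == []
```

Failing validation before rendering keeps malformed LLM output from ever reaching the PPTX layer.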
| Interface | Command | Use Case |
|---|---|---|
| CLI | `./auto-ppt generate` / `revise` | Interactive or scripted usage |
| MCP | `python mcp_server.py` | Claude Desktop, Cursor, Windsurf |
| HTTP | `python py-skill-server.py` | REST integration (`POST /skill`) |
| JSON skill | `python py-agent-skill.py --request req.json` | File-based agent orchestration |
| Docker | `docker compose up --build` | One-command deploy |
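A REST call to the HTTP skill server might be built like this. The payload field names here are assumptions for illustration — consult the Integration Guide for the actual `POST /skill` contract:

```python
import json
import urllib.request

# Hypothetical request body; field names are illustrative assumptions.
payload = {
    "action": "generate",
    "prompt": "Create an 8-slide AI strategy deck for executives",
    "mock": True,
}

def build_request(url: str, body: dict) -> urllib.request.Request:
    """Build a POST request with a JSON body for the skill endpoint."""
    data = json.dumps(body).encode("utf-8")
    return urllib.request.Request(
        url,
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_request("http://localhost:8000/skill", payload)
# urllib.request.urlopen(req) would send it once the server is running.
```

Any HTTP client works the same way; the server just expects a JSON body on `POST /skill`.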
Claude Desktop — add to `claude_desktop_config.json`:

```json
{
  "mcpServers": {
    "auto-ppt": {
      "command": "python",
      "args": ["/absolute/path/to/auto-ppt-prototype/mcp_server.py"]
    }
  }
}
```

Cursor / Windsurf — add to `.cursor/mcp.json` or `.windsurf/mcp.json`:

```json
{
  "mcpServers": {
    "auto-ppt": {
      "command": "python",
      "args": ["/absolute/path/to/auto-ppt-prototype/mcp_server.py"]
    }
  }
}
```

Tools exposed: `create_deck`, `revise_deck`. Both accept sources, mock mode, and an optional `output_dir`.
```bash
export OPENAI_API_KEY="sk-..."
docker compose up --build    # HTTP skill server
docker run --rm -it -e OPENAI_API_KEY auto-ppt-prototype python mcp_server.py    # MCP stdio
```

```bash
python -m pytest tests/ -v   # 300+ tests
npm run ci:smoke             # JS renderer + end-to-end smoke
```

CI: pytest on Python 3.10 / 3.11 / 3.12, smoke on Node.js 18 / 20 / 22.
```
prompt + sources ➜ Python smart layer ➜ deck JSON ➜ PPTX renderer ➜ .pptx
                          ↑ revise loop ↲
```
- Python (`python_backend/`): planning, revision, source loading, LLM calls, schema validation
- Node (`generate-ppt.js`): pptxgenjs rendering from validated deck JSON
- deck JSON: the stable contract between both layers
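A deck JSON crossing that contract might look roughly like this — the field names are illustrative assumptions; the real schema lives in the repository:

```python
import json

# Illustrative deck JSON; treat these field names as assumptions,
# not the engine's actual schema.
deck = {
    "title": "AI Strategy for Executives",
    "slides": [
        {"title": "Why now", "bullets": ["Market shift", "Competitive pressure"]},
        {"title": "Roadmap", "bullets": ["Pilot", "Scale", "Govern"]},
    ],
}

# The two layers exchange this as plain JSON on disk
# (e.g. output/py-generated-deck.json), so it must round-trip losslessly.
text = json.dumps(deck, indent=2)
assert json.loads(text) == deck
```

Keeping the contract as plain JSON is what lets the Python planner and the Node renderer evolve independently.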
| Doc | Content |
|---|---|
| Examples | Copy-paste usage flows |
| User Guide | Day-to-day usage |
| Integration Guide | HTTP, MCP, JSON skill patterns |
| Product Overview | Positioning and boundaries |
| Repository Map | File-by-file structure |
| Changelog | Version history |
| Roadmap | Planned evolution |
Multilingual: all guides available in English, 中文, 日本語.
MIT