A starter template for wrapping your repos in an AI-powered workspace with knowledge graphs and LLM-compiled wikis. Works with Claude Code, GitHub Copilot, or any AI coding assistant.
Clone this, add your repos, and get:
- Knowledge graphs — Turn any codebase into a queryable graph (AST + AI extraction)
- LLM wiki — Compiled knowledge base across all your repos
- Automatic bridging — graph output feeds into wiki articles
```bash
# Clone the workspace kit
git clone https://github.com/chitinhq/workspace-kit.git my-workspace
cd my-workspace

# Run setup (installs graphify, detects your platform)
bash setup.sh

# Add your repos
git clone <your-repo-url> repos/my-app

# Build a knowledge graph
cd repos/my-app
graphify  # AST extraction → graph.json + GRAPH_REPORT.md

# Create a wiki
cd ../..
bash scripts/wiki-init.sh . my-knowledge
bash scripts/wiki-ingest.sh . repos/my-app/graphify-out/GRAPH_REPORT.md
```

Claude Code — slash commands work directly:

```
/graphify repos/my-app
/wiki compile
```
GitHub Copilot — run graphify from the terminal; Copilot reads the output via .github/copilot-instructions.md:

```bash
cd repos/my-app && graphify  # generates graphify-out/GRAPH_REPORT.md
# Copilot now has structural context about your codebase
```

```
workspace-kit/
├── CLAUDE.md                  # workspace instructions + skill docs
├── setup.sh                   # one-time setup script
├── claude/skills/
│   ├── graphify.md            # /graphify skill
│   └── wiki.md                # /wiki skill
├── scripts/
│   ├── sync-skills.sh         # wire skills into Claude Code
│   ├── wiki-init.sh           # create wiki workspace
│   ├── wiki-ingest.sh         # add sources to wiki
│   ├── wiki-compile.sh        # compile wiki articles
│   ├── wiki-normalize.sh      # normalize source format
│   ├── wiki-link.sh           # cross-link articles
│   ├── wiki-index.sh          # build wiki index
│   ├── wiki-lint.sh           # lint wiki quality
│   └── octi-knowledge-sync.py # push to Octi memory (optional)
└── repos/                     # your repos go here
```
Turn code, docs, papers, images into a navigable knowledge graph.
- AST extraction (free, deterministic) via tree-sitter — supports 19 languages including TypeScript, Python, Go, Rust, and Java
- Semantic extraction (AI-powered) via parallel Claude subagents for docs, papers, images
- Community detection via Leiden clustering
- Wiki bridge — auto-generates wiki articles from graph communities
Great for Nx monorepos, microservice architectures, or any codebase you want to understand fast.
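To give a feel for working with the output, here is a minimal sketch of inspecting a graph from the command line. The graph.json shape used below (top-level `nodes` and `edges` arrays) is an assumption for illustration, not graphify's documented schema — check your actual output first.

```shell
# Build a tiny sample in the assumed shape (stand-in for real graphify output)
mkdir -p graphify-out
cat > graphify-out/graph.json <<'EOF'
{"nodes": [{"id": "app.main", "kind": "function"},
           {"id": "app.Config", "kind": "class"}],
 "edges": [{"from": "app.main", "to": "app.Config", "type": "uses"}]}
EOF

# Count nodes and list the distinct edge types
python3 -c "import json; g = json.load(open('graphify-out/graph.json')); print(len(g['nodes']), sorted({e['type'] for e in g['edges']}))"
# prints: 2 ['uses']
```

The same pattern scales to real output: once you know the schema, one-liners like this answer quick structural questions without opening the report.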
LLM-compiled knowledge bases from raw sources.
- Ingest anything: code, docs, URLs, PDFs, notes
- Incremental compilation — only reprocesses changed sources
- 6 lint passes to catch quality issues
- Interactive Q&A against the wiki
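The incremental-compilation idea above can be sketched in a few lines: hash each source and recompile only when the hash changes. This is an illustration of the general technique, not necessarily how wiki-compile.sh detects changes internally.

```shell
# Illustrative change detection via content hashes (not wiki-compile.sh's actual code)
mkdir -p wiki-src .wiki-cache
echo "first draft" > wiki-src/notes.md

recompile_if_changed() {
  src="$1"; stamp=".wiki-cache/$(basename "$src").sha"
  new=$(sha256sum "$src" | cut -d' ' -f1)
  old=$(cat "$stamp" 2>/dev/null || true)
  if [ "$new" != "$old" ]; then
    echo "$new" > "$stamp"          # remember this version
    echo "recompiled $src"
  else
    echo "skipped $src"             # source unchanged since last run
  fi
}

recompile_if_changed wiki-src/notes.md  # prints: recompiled wiki-src/notes.md
recompile_if_changed wiki-src/notes.md  # prints: skipped wiki-src/notes.md
```

On large wikis this is what keeps recompiles cheap: only sources whose content actually changed get reprocessed.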
Edit CLAUDE.md to add project-specific instructions, drop new skills into claude/skills/, and run scripts/sync-skills.sh after any changes.
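For example, adding a custom skill is just a markdown file. The `/hello` skill and its file format below are hypothetical, modeled loosely on the existing claude/skills/*.md files — mirror one of those for the real format.

```shell
# Hypothetical custom skill: /hello (format is a guess; copy an existing skill file)
mkdir -p claude/skills
cat > claude/skills/hello.md <<'EOF'
# /hello
Greet the user and summarize the current repo's graph and wiki status.
EOF
```

After adding the file, run `bash scripts/sync-skills.sh` so Claude Code picks up the new slash command.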
If you run Octi Pulpo for agent orchestration, the knowledge sync script pushes graphs and wiki articles to Octi's shared memory:
```bash
python3 scripts/octi-knowledge-sync.py --graph graphify-out --wiki wiki --repo my-app
```

Any dispatched agent can then use memory_recall to retrieve structural context about your codebase.