krampus-nuggets/panopticon

PANOPTICON

🚧 This project is in its early stages and under active development 🚧

This project is a modular bricolage, purposefully integrating existing tools to expose a local codebase to a suite of language models. The strategy prioritizes cost efficiency by handling routine queries with local models and reserving cloud-based models like Claude for high-complexity tasks. A significant advantage is the resulting 'always-live' documentation, which stays synchronized and queryable as the code evolves.
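The local-vs-cloud routing described above can be sketched in a few lines. This is a hypothetical illustration of the cost-saving idea, not panopticon's actual logic; the function name, the prompt-length complexity proxy, and the threshold are all assumptions.

```python
# Hypothetical sketch of cost-aware routing: cheap local model for routine
# queries, cloud model reserved for high-complexity tasks. Prompt length is
# used here as a deliberately crude complexity proxy.
def route_query(prompt: str, complexity_threshold: int = 500) -> str:
    """Return the backend a query would be sent to."""
    if len(prompt) < complexity_threshold:
        # Routine query: handled locally via Ollama at zero token cost
        return "local:deepseek-coder-v2:lite"
    # High-complexity task: escalate to a cloud model such as Claude
    return "cloud:claude"

print(route_query("What does main.py do?"))
```

In practice the routing signal would be richer than raw length (e.g. task type or retrieval hit quality), but the shape of the decision stays the same.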

1. Tools Utilized

  • UV - Python environment & dependency management
  • Ollama - local model runtime (deepseek-coder-v2:lite for chat, nomic-embed-text for embeddings)
  • Continue - VS Code extension that connects the IDE to the models via an MCP server
  • Claude (optional) - cloud model reserved for high-complexity tasks

2. Getting Started

👨🏾‍🔧 It is assumed that you already have UV, Python, Ollama, VS Code (or an equivalent IDE), and the Continue extension installed. If not, please install them before proceeding. 👩🏾‍🔧

2.1 Install the dependencies - `uv sync`

2.2 Pull required models:

  • Ensure Ollama is running
  • `ollama pull deepseek-coder-v2:lite`
  • `ollama pull nomic-embed-text`

2.3 Replace the placeholder - "your/path/to/main.py" - with the path to your `main.py` in the Continue MCP-server config file here - `.continue\mcpServers\pano.yaml`

2.4 Ensure that your Continue local config matches the contents of the example config - `continue_local_example.yaml`

(Screenshot: Continue local-config example in the Cursor IDE)

2.5 Index the codebase - `uv run main.py index`
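The index step presumably splits source files into chunks before embedding them with nomic-embed-text. A minimal hypothetical sketch of that chunking stage (not panopticon's actual implementation; the function name and chunk size are assumptions):

```python
from pathlib import Path

def chunk_file(path: Path, chunk_size: int = 40) -> list[str]:
    """Split a source file into fixed-size line chunks ready for embedding."""
    lines = path.read_text(encoding="utf-8").splitlines()
    # One chunk per chunk_size lines; the final chunk may be shorter
    return ["\n".join(lines[i:i + chunk_size])
            for i in range(0, len(lines), chunk_size)]
```

Each chunk would then be embedded and stored so queries can retrieve the most relevant slices of the codebase.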

2.6 You should now be able to see your MCP server and query your codebase:

2.6.a MCP connection (screenshot)
2.6.b Ollama models (screenshot)
2.6.c Successful query to the codebase (screenshot)

3. In-Progress Features

🟧 Switch to a file-change-based indexing strategy
🟧 Improve usability to allow drop-in setup for any project
🟧 Evaluate other models for performance & usability (for example - qwen3-embedding:0.6b with qwen3.5:9b)
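The planned file-change-based indexing could be built on content hashes, so only files that actually changed since the last run are re-embedded. A hypothetical sketch, assuming SHA-256 digests keyed by file path (names and structure are illustrative, not the project's design):

```python
import hashlib
from pathlib import Path

def content_hash(path: Path) -> str:
    """Hash a file's bytes so unchanged files can skip re-indexing."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def changed_files(paths: list[Path], previous: dict[str, str]) -> list[Path]:
    """Return files whose hash differs from the last indexing run."""
    return [p for p in paths if previous.get(str(p)) != content_hash(p)]
```

A full re-index then degrades gracefully into a cheap incremental update: hash every tracked file, re-embed only the changed ones, and persist the new hash map.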

About

Integrate local LLMs into your codebase. Save on tokens/cost + Always up-to-date documentation.
