secrux-ai is the FastAPI microservice used by Secrux for AI jobs, MCP/agent workflows, and (optional) review/enrichment features.
- Python 3.11+
- Recommended: `uv` (optional, see `[tool.uv]` in `pyproject.toml`)
```shell
cd secrux-ai
python -m venv .venv
. .venv/bin/activate
pip install -e ".[dev]"
uvicorn service.main:app --host 0.0.0.0 --port 5156 --reload
```

Health: http://localhost:5156/health
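Once the service is up, the health endpoint can be probed programmatically. A minimal stdlib-only sketch, assuming the service listens on localhost:5156; since the response body shape is not documented here, only the HTTP status code is checked:

```python
# Minimal health probe for a locally running secrux-ai instance.
# Assumes the default host/port from the uvicorn command above; the
# response body format is an assumption, so only the status is checked.
import urllib.request


def check_health(base_url: str = "http://localhost:5156") -> bool:
    """Return True if the /health endpoint answers with HTTP 200."""
    try:
        with urllib.request.urlopen(f"{base_url}/health", timeout=5) as resp:
            return resp.status == 200
    except OSError:
        # Connection refused, timeout, or other transport-level failure.
        return False
```

The same check works against the Docker Compose deployment by substituting the host port configured via `AI_PORT`.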
This starts the AI service + its Postgres in one compose project.
```shell
cd secrux-ai
cp .env.example .env
docker compose up -d
docker compose ps
```

Copy `.env.example` to `.env` and adjust as needed. Common variables:
- `SECRUX_AI_SERVICE_TOKEN` (must match the server-side token)
- `AI_DATABASE_URL` (Postgres connection string)
- Optional LLM: `SECRUX_AI_LLM_BASE_URL`, `SECRUX_AI_LLM_API_KEY`, `SECRUX_AI_LLM_MODEL`
- Optional prompt dump: `SECRUX_AI_PROMPT_DUMP`, `SECRUX_AI_PROMPT_DUMP_DIR`
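Putting the variables above together, a `.env` might look like the following sketch. All values are illustrative placeholders, not defaults shipped with the project; consult `secrux-ai/.env.example` for the authoritative list:

```shell
# Illustrative values only — adjust for your deployment.
AI_PORT=5156
SECRUX_AI_SERVICE_TOKEN=change-me        # must match the secrux-server token
AI_POSTGRES_DB=secrux_ai                 # hypothetical database name
AI_POSTGRES_USER=secrux
AI_POSTGRES_PASSWORD=change-me
AI_DATABASE_URL=postgresql+psycopg://secrux:change-me@ai-postgres:5432/secrux_ai

# Optional LLM provider — leave empty to disable live calls
SECRUX_AI_LLM_BASE_URL=
SECRUX_AI_LLM_API_KEY=
SECRUX_AI_LLM_MODEL=

# Optional prompt dump (off | file | stdout)
SECRUX_AI_PROMPT_DUMP=off
```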
- `AI_PORT`: Host port mapped to the AI service container's port 5156.
- `SECRUX_AI_SERVICE_TOKEN`: Shared token used by `secrux-server` to call the AI service (treat it as a password).
- `AI_POSTGRES_DB`, `AI_POSTGRES_USER`, `AI_POSTGRES_PASSWORD`: Used by the `ai-postgres` container.
- `AI_DATABASE_URL`: SQLModel/psycopg connection string (example: `postgresql+psycopg://user:pass@host:5432/db`).
- `SECRUX_AI_LLM_BASE_URL`, `SECRUX_AI_LLM_API_KEY`, `SECRUX_AI_LLM_MODEL`: Configure an upstream LLM provider; leave empty to disable live calls.
- `SECRUX_AI_PROMPT_DUMP`: `off` | `file` | `stdout`.
- `SECRUX_AI_PROMPT_DUMP_DIR`: Directory for dump files when using `file`.
- Additional filters and knobs are documented in `secrux-ai/.env.example`.
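A malformed `AI_DATABASE_URL` is a common misconfiguration. The stdlib-only sketch below checks that a connection string matches the documented shape before the service hands it to SQLModel/psycopg; the function name and validation rules are illustrative, not part of the service's API:

```python
# Sanity-check an AI_DATABASE_URL of the documented form
# postgresql+psycopg://user:pass@host:5432/db. Stdlib-only illustration;
# the service itself passes the URL straight to SQLModel/psycopg.
from urllib.parse import urlsplit


def validate_database_url(url: str) -> dict:
    """Split the connection string and verify the expected pieces exist."""
    parts = urlsplit(url)
    if parts.scheme != "postgresql+psycopg":
        raise ValueError(f"unexpected scheme: {parts.scheme!r}")
    database = parts.path.lstrip("/")
    if not parts.hostname or not database:
        raise ValueError("connection string needs a host and a database name")
    return {
        "user": parts.username,
        "host": parts.hostname,
        "port": parts.port or 5432,  # Postgres default when omitted
        "database": database,
    }
```

For example, `validate_database_url("postgresql+psycopg://user:pass@host:5432/db")` returns the user, host, port, and database name, while a `mysql://` URL raises `ValueError`.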