Primary Reference: See AI_INSTRUCTIONS.md for comprehensive AI agent guidelines.
This document contains agent-specific rules and overrides for the VibeRN project.
This section details configurations and specific rules for the Python-based agent framework that powers backend services.
- LLM Registration: Language models are treated as classes (e.g., `class Claude(BaseLlm)`) and must be registered with the `LLMRegistry` to be discoverable by the system.
- Settings Management: Application-wide settings are managed via a database-backed `Setting` model and can be updated through the `save_all_settings()` or `set_setting()` functions. Setting keys are namespaced by prefix (e.g., `llm.`, `search.`, `app.`).
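A minimal sketch of what this class-based registration pattern might look like. The `register` and `resolve` methods and the `BaseLlm` constructor shown here are illustrative assumptions, not the framework's actual API:

```python
# Hypothetical sketch of the LLMRegistry pattern; the real API may differ.
class BaseLlm:
    def __init__(self, model: str):
        self.model = model

class LLMRegistry:
    _llms: dict[str, type[BaseLlm]] = {}

    @classmethod
    def register(cls, llm_cls: type[BaseLlm]) -> type[BaseLlm]:
        # Used as a decorator: stores the class under its own name.
        cls._llms[llm_cls.__name__] = llm_cls
        return llm_cls

    @classmethod
    def resolve(cls, name: str) -> type[BaseLlm]:
        return cls._llms[name]

@LLMRegistry.register
class Claude(BaseLlm):
    pass
```

With this shape, `LLMRegistry.resolve("Claude")` returns the registered class, which the system can then instantiate with a model name.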
The Claude agent integration has two primary implementations: a direct API client and a Vertex AI client.
- Configuration & API Keys:
  - Direct API: the `Claude_API_KEY` environment variable, or an API key passed during initialization, is required.
  - Vertex AI: the `GOOGLE_CLOUD_PROJECT` and `GOOGLE_CLOUD_LOCATION` environment variables must be set.
- Model Specification:
  - A model name must be provided during initialization (e.g., `claude-3-5-sonnet-v2@20241022`).
  - Supported models are validated against regex patterns such as `r"claude-3-.*"` and `r"claude-.*-4.*"`.
- Prompting & Roles:
  - Prompts are constructed in a strict conversational format, alternating between `Human:` and `Assistant:` turns. The final prompt must end with `Assistant:`.
  - Internal application roles are mapped to either `"user"` or `"assistant"` to fit this format.
- Tool Usage: The Vertex AI implementation (`class Claude(BaseLlm)`) supports function calling (tools). Tools are passed to the model via the `tools` and `tool_choice` parameters.
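The model-name validation described above can be illustrated with a small check against the documented patterns. The helper function name is hypothetical; only the two regex patterns come from this document:

```python
import re

# Patterns quoted in the section above.
SUPPORTED_MODEL_PATTERNS = [r"claude-3-.*", r"claude-.*-4.*"]

def is_supported_model(name: str) -> bool:
    # Hypothetical helper; the framework's actual validation code may differ.
    # fullmatch ensures the entire model name matches a pattern.
    return any(re.fullmatch(p, name) for p in SUPPORTED_MODEL_PATTERNS)
```

For example, `claude-3-5-sonnet-v2@20241022` matches the first pattern, while an unrelated name such as `gpt-4` matches neither.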
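The strict `Human:`/`Assistant:` prompt format can be sketched as follows. The function name and message shape are assumptions for illustration, not the framework's actual prompt builder:

```python
# Hypothetical sketch; the framework's real prompt construction may differ.
# Internal roles are assumed already collapsed to "user" / "assistant".
TURN_LABEL = {"user": "Human", "assistant": "Assistant"}

def build_prompt(messages: list[dict[str, str]]) -> str:
    parts = [f"{TURN_LABEL[m['role']]}: {m['content']}" for m in messages]
    parts.append("Assistant:")  # the prompt must end with an open Assistant turn
    return "\n\n".join(parts)
```

Ending on a bare `Assistant:` turn is what cues the model to generate the assistant's next reply.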
The Gemini agent adheres to the general framework principles. The `.gemini/rules.md` file serves as a foundational guide for Gemini's interaction patterns, reinforcing the core mandates and project-specific guidelines.