Add multi-provider LLM support and settings #31

Draft
mathewab wants to merge 1 commit into main from llm-multi-provider-settings

Conversation

mathewab (Owner) commented Jan 5, 2026

This pull request introduces major enhancements to support multiple LLM providers (OpenAI, Anthropic, Google Gemini, Ollama, and Vercel AI Gateway) using a provider-agnostic architecture. It also adds a new persistent app config store, updates environment and configuration handling, and removes the legacy OpenAIAdapter in favor of a unified interface. These changes make the system more flexible, extensible, and easier to configure for different AI backends.

Key changes:

Multi-provider LLM support and configuration

  • Added support for multiple LLM providers (OpenAI, Anthropic, Google, Ollama, Gateway) via the Vercel AI SDK, replacing the previous OpenAI-only integration. Environment variables, configuration schemas, and documentation have been updated to reflect this change. (.env.example, README.md, package.json, src/infra/env.ts)
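To illustrate the runtime provider selection this enables, here is a minimal, self-contained sketch. The real code wires these factories to the Vercel AI SDK provider packages (e.g. `@ai-sdk/openai`, `@ai-sdk/anthropic`); the names `ProviderConfig`, `providerFactories`, and `resolveModel` are illustrative, not the PR's exact API.

```typescript
// Hypothetical sketch: resolving a model handle from a provider name at runtime.
// Each factory is stubbed with a string so the selection logic stands alone.
type SupportedProvider = "openai" | "anthropic" | "google" | "ollama" | "gateway";

interface ProviderConfig {
  provider: SupportedProvider;
  model: string;
  apiKey?: string;   // Ollama typically runs locally and needs no key
  baseUrl?: string;  // e.g. a local Ollama endpoint or a gateway URL
}

// Stub factories standing in for the per-provider SDK constructors.
const providerFactories: Record<SupportedProvider, (cfg: ProviderConfig) => string> = {
  openai: (cfg) => `openai:${cfg.model}`,
  anthropic: (cfg) => `anthropic:${cfg.model}`,
  google: (cfg) => `google:${cfg.model}`,
  ollama: (cfg) => `ollama:${cfg.model}`,
  gateway: (cfg) => `gateway:${cfg.model}`,
};

function resolveModel(cfg: ProviderConfig): string {
  const factory = providerFactories[cfg.provider];
  if (!factory) {
    throw new Error(`Unsupported LLM provider: ${cfg.provider}`);
  }
  return factory(cfg);
}
```

Keeping the provider set in one registry means adding a backend is a one-line change rather than a new `if/else` branch at every call site.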

  • Introduced a new provider-agnostic LLM adapter interface (LLMAdapter) with support for schema-based JSON outputs, runtime provider selection, and capability introspection. (src/infra/llm/LLMAdapter.ts)
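A rough sketch of what such a contract can look like, with an in-memory adapter showing the shape. The method names (`capabilities`, `complete`, `completeJson`) and the `EchoAdapter` class are assumptions for illustration, not the interface actually shipped in `src/infra/llm/LLMAdapter.ts`.

```typescript
// Hypothetical provider-agnostic adapter contract.
interface LLMCapabilities {
  supportsJsonSchema: boolean;
  supportsStreaming: boolean;
}

interface LLMAdapter {
  readonly provider: string;
  readonly model: string;
  // Capability introspection lets callers degrade gracefully per backend.
  capabilities(): LLMCapabilities;
  complete(prompt: string): Promise<string>;
  // Schema-based JSON output: the caller supplies a validator and gets typed data back.
  completeJson<T>(prompt: string, parse: (raw: unknown) => T): Promise<T>;
}

// Minimal in-memory adapter used only to demonstrate the contract.
class EchoAdapter implements LLMAdapter {
  readonly provider = "echo";
  readonly model = "echo-1";
  capabilities(): LLMCapabilities {
    return { supportsJsonSchema: true, supportsStreaming: false };
  }
  async complete(prompt: string): Promise<string> {
    return `echo: ${prompt}`;
  }
  async completeJson<T>(prompt: string, parse: (raw: unknown) => T): Promise<T> {
    return parse({ echoed: prompt });
  }
}
```

Because every provider hides behind the same interface, the rest of the codebase never imports a vendor SDK directly.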

  • Removed the legacy OpenAIAdapter and replaced all usage with the new unified LLM adapter approach. (src/infra/OpenAIAdapter.ts)

App configuration persistence

  • Added a new app_config database table and migration to persist runtime settings (e.g., LLM provider/model) in a key-value format. (src/infra/db/migrations/20260104_add_app_config.cjs; src/infra/db/schema.sql)
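A key-value settings table of this kind typically looks like the following sketch; the column names and example rows are assumptions for illustration, and the actual columns in schema.sql may differ.

```sql
-- Hypothetical shape of the app_config key-value store.
CREATE TABLE IF NOT EXISTS app_config (
  key        TEXT PRIMARY KEY,
  value      TEXT NOT NULL,
  updated_at TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
);

-- Illustrative rows as the settings UI might persist them:
-- ('llm.provider', 'ollama'), ('llm.model', 'llama3')
```

A key-value layout means new settings need no further migrations, at the cost of pushing type validation up into application code.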

API and domain changes

  • Refactored the /config API endpoints to support dynamic LLM provider/model configuration using a new LLMConfigService, with validation and error handling. (src/api/configRoutes.ts; src/api/index.ts)

  • Updated domain error handling to generalize OpenAI errors as LLM errors and expanded audit event types to include LLM provider changes and failures. (src/domain/errors.ts; src/domain/entities/AuditEntry.ts)
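Generalizing a vendor-specific error usually means carrying the provider as data rather than in the class name. The `LLMError` shape and event names below are a hedged sketch of that pattern, not the exact types in src/domain/errors.ts.

```typescript
// Hypothetical provider-agnostic error: the provider is a field, not a subclass.
class LLMError extends Error {
  constructor(
    readonly provider: string,
    message: string,
  ) {
    super(`[${provider}] ${message}`);
    this.name = "LLMError";
  }
}

// Illustrative audit event union; the real union also includes the
// pre-existing, non-LLM event types.
type AuditEventType =
  | "llm_provider_changed"
  | "llm_request_failed";
```

With the provider in the error itself, audit entries and logs can report which backend failed without any provider-specific catch blocks.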
