Note: For most users, the web UI settings page is the recommended way to configure Local Deep Research. Environment variables are primarily useful for Docker deployments, CI/CD pipelines, and server configurations where the web UI is not accessible during startup.
You can override any configuration setting in Local Deep Research using environment variables. This is useful for:
- Setting up multiple environments (development, production)
- Changing settings without modifying configuration files
- Providing sensitive information like API keys securely
- Setting server ports for Docker or cloud deployments
Local Deep Research uses Dynaconf to manage configuration. The format for environment variables is:

```
LDR_SECTION__SETTING=value
```

Note the double underscore (`__`) between the section and setting name.
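For example, the `port` setting in the `[web]` section of `settings.toml` becomes `LDR_WEB__PORT` (values below are illustrative):

```shell
# settings.toml:        environment variable:
#   [web]
#   port = 5000    ->   LDR_WEB__PORT
export LDR_WEB__PORT=8080          # overrides [web] port
export LDR_SEARCH__TOOL=wikipedia  # overrides [search] tool
```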
The easiest way to configure settings is to create a .env file in your config directory:
Config Directory Locations:

- Windows: `%USERPROFILE%\Documents\LearningCircuit\local-deep-research\config\.env`
- Linux/Mac: `~/.config/local_deep_research/config/.env`
Simply create a text file named .env in this directory and add your settings:
```
# Example .env file contents
LDR_WEB__PORT=8080
LDR_SEARCH__TOOL=wikipedia
LDR_GENERAL__ENABLE_FACT_CHECKING=true

# API keys (see important note below)
OPENAI_API_KEY=your-key-here
LDR_OPENAI_API_KEY=your-key-here
```
This file is automatically loaded when Local Deep Research starts, and any settings specified here will override those in the main configuration files.
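On Linux/Mac you can create the config directory and a starter `.env` in one step (the values mirror the example above and are placeholders):

```shell
# Create the config directory and a minimal .env (Linux/Mac path)
mkdir -p ~/.config/local_deep_research/config
cat > ~/.config/local_deep_research/config/.env <<'EOF'
LDR_WEB__PORT=8080
LDR_SEARCH__TOOL=wikipedia
EOF
```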
Known Bug: Currently, API keys must be set both with and without the LDR_ prefix for search engines to work properly:
```bash
# You need BOTH of these for each API key
export OPENAI_API_KEY=your-key-here
export LDR_OPENAI_API_KEY=your-key-here
```

This applies to all search-related API keys, including:

- `OPENAI_API_KEY`
- `ANTHROPIC_API_KEY`
- `SERP_API_KEY`
- `BRAVE_API_KEY`
- `GOOGLE_PSE_API_KEY`
- `GOOGLE_PSE_ENGINE_ID`
- `GUARDIAN_API_KEY`
This issue will be fixed in a future update.
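Until the fix lands, a small shell helper can keep both copies in sync. This is a sketch; `set_ldr_key` is not part of the project, and the key values are placeholders:

```shell
# Export a key both with and without the LDR_ prefix in one call,
# so search engines and the rest of the app both find it.
set_ldr_key() {
  name="$1"
  value="$2"
  export "$name=$value"
  export "LDR_$name=$value"
}

set_ldr_key OPENAI_API_KEY your-key-here
set_ldr_key SERP_API_KEY   your-key-here
```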
| Config in settings.toml | Environment Variable | Example |
|---|---|---|
| `[web]` `port = 5000` | `LDR_WEB__PORT` | `LDR_WEB__PORT=8080` |
| `[search]` `tool = "auto"` | `LDR_SEARCH__TOOL` | `LDR_SEARCH__TOOL=wikipedia` |
| `[general]` `enable_fact_checking = false` | `LDR_GENERAL__ENABLE_FACT_CHECKING` | `LDR_GENERAL__ENABLE_FACT_CHECKING=true` |
API keys are best set using environment variables for security (remember the current requirement for both prefixed and non-prefixed versions):
```
# Set both versions for each API key
ANTHROPIC_API_KEY=your-api-key-here
LDR_ANTHROPIC_API_KEY=your-api-key-here
OPENAI_API_KEY=your-openai-key-here
LDR_OPENAI_API_KEY=your-openai-key-here
SERP_API_KEY=your-api-key-here
LDR_SERP_API_KEY=your-api-key-here
```

OpenRouter provides access to 100+ models through an OpenAI-compatible API. To use OpenRouter:
- Get an API key from openrouter.ai
- Configure using one of these methods:
Method 1: Via Web UI (Recommended)
- Navigate to Settings → LLM Provider
- Select "Custom OpenAI Endpoint"
- Set Endpoint URL to `https://openrouter.ai/api/v1`
- Enter your OpenRouter API key
- Select your desired model
Method 2: Via Environment Variables
```bash
# Required environment variables for OpenRouter
export LDR_LLM_PROVIDER=openai_endpoint
export LDR_LLM_OPENAI_ENDPOINT_URL=https://openrouter.ai/api/v1
export LDR_LLM_OPENAI_ENDPOINT_API_KEY="<your-api-key>"
export LDR_LLM_MODEL=anthropic/claude-3.5-sonnet  # or any OpenRouter model
```

Method 3: Docker Compose
Add to your docker-compose.yml environment section:
```yaml
services:
  local-deep-research:
    environment:
      - LDR_LLM_PROVIDER=openai_endpoint
      - LDR_LLM_OPENAI_ENDPOINT_URL=https://openrouter.ai/api/v1
      - LDR_LLM_OPENAI_ENDPOINT_API_KEY=<your-api-key>
      - LDR_LLM_MODEL=anthropic/claude-3.5-sonnet
```

Available Models: Browse models at openrouter.ai/models
Note: OpenRouter uses an OpenAI-compatible API, so you select "Custom OpenAI Endpoint" as the provider and point the endpoint URL at OpenRouter's API.
The same configuration pattern works for any OpenAI-compatible API service:
```bash
# Generic pattern for OpenAI-compatible APIs
export LDR_LLM_PROVIDER=openai_endpoint
export LDR_LLM_OPENAI_ENDPOINT_URL=https://your-provider.com/v1
export LDR_LLM_OPENAI_ENDPOINT_API_KEY="<your-api-key>"
export LDR_LLM_MODEL="<your-model-name>"
```

For Docker deployments, you can pass environment variables when starting containers:
```bash
docker run -p 8080:8080 \
  -e LDR_WEB__PORT=8080 \
  -e LDR_SEARCH__TOOL=wikipedia \
  -e OPENAI_API_KEY=your-key-here \
  -e LDR_OPENAI_API_KEY=your-key-here \
  local-deep-research
```

Changing the web port:

```bash
export LDR_WEB__PORT=8080  # Linux/Mac
set LDR_WEB__PORT=8080     # Windows
```

Setting API keys (both variants, per the known bug above):

```bash
# Linux/Mac
export ANTHROPIC_API_KEY=your-key-here
export LDR_ANTHROPIC_API_KEY=your-key-here

# Windows
set ANTHROPIC_API_KEY=your-key-here
set LDR_ANTHROPIC_API_KEY=your-key-here
```

Selecting a search engine:

```bash
export LDR_SEARCH__TOOL=wikipedia  # Linux/Mac
set LDR_SEARCH__TOOL=wikipedia     # Windows
```

By default, Local Deep Research stores all data (database, research outputs, cache, logs) in platform-specific user directories. You can override this location using the LDR_DATA_DIR environment variable:
```bash
# Linux/Mac
export LDR_DATA_DIR=/path/to/your/data/directory

# Windows
set LDR_DATA_DIR=C:\path\to\your\data\directory
```

All application data will be organized under this directory:

- `$LDR_DATA_DIR/ldr.db` - Application database
- `$LDR_DATA_DIR/research_outputs/` - Research reports
- `$LDR_DATA_DIR/cache/` - Cached data
- `$LDR_DATA_DIR/logs/` - Application logs
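As a quick sketch of relocating the data directory (the path here is illustrative; the subdirectories listed above are created by the application itself on first run):

```shell
# Point the app at a custom data directory (Linux/Mac)
export LDR_DATA_DIR=/tmp/ldr-data
mkdir -p "$LDR_DATA_DIR"
# After the first run, ldr.db, research_outputs/, cache/ and logs/
# should appear under this directory.
ls -d "$LDR_DATA_DIR"
```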