
Claude 4.5 Sonnet fails with 'temperature and top_p cannot both be specified' error #1125

@nicotem

Description

Since Opus 4.1, Anthropic has been enforcing its long-standing recommendation that temperature and top_p should not both be specified.
I can make Sonnet 3.7 work fine, but Claude 4.* breaks no matter what I try.

I am using QualCoder 3.7. I have been unable to configure config.ini in a way that makes Claude 4.* work, because QualCoder overwrites the config.ini values for temperature and top_p that it does not like, and sends both, triggering the error.

A more detailed breakdown follows.

Steps to Reproduce:

1. Configure QualCoder with an Anthropic API key in config.ini
2. Set up the Claude 4.5 Sonnet model:

```ini
[ai_model_Anthropic Claude]
large_model = claude-sonnet-4-5-20250514
fast_model = claude-haiku-4-5-20251001
api_base = https://api.anthropic.com/v1/
api_key = [your-key]
```

3. Ensure config.ini has both temperature and top_p set in [DEFAULT]:

```ini
ai_temperature = 1.0
ai_top_p = 1.0
```

4. Attempt to use any AI feature (AI coding, AI chat, etc.)

**Expected Behavior:**
AI features should work with Claude 4.5 models, just as they do with Claude 3.7.

**Actual Behavior:**
The following error occurs:

```
File "qualcoder/ai_async_worker.py", line 108, in run
File "qualcoder/ai_llm.py", line 589, in _ai_async_stream
File "langchain_core/language_models/chat_models.py", line 505, in stream
File "langchain_openai/chat_models/base.py", line 1044, in _stream
File "openai/_utils/_utils.py", line 287, in wrapper
File "openai/resources/chat/completions/completions.py", line 1087, in create
File "openai/_base_client.py", line 1256, in post
File "openai/_base_client.py", line 1044, in request

{'code': 'invalid_request_error', 'message': 'temperature and top_p cannot both be specified for this model. Please use only one.', 'type': 'invalid_request_error', 'param': None}

Error communicating with Anthropic Claude
BadRequestError: Error code: 400 - {'error': {'code': 'invalid_request_error', 'message': 'temperature and top_p cannot both be specified for this model. Please use only one.', 'type': 'invalid_request_error', 'param': None}}
```
**Root Cause Analysis:**

Anthropic introduced a breaking API change with Claude 4.1 Opus (August 2025) and Claude 4.5 Sonnet (September 2025) that prohibits specifying both temperature and top_p parameters simultaneously. Previous Claude models (3.7, 3.5, etc.) allowed both parameters, though Anthropic's documentation always recommended using only one.

This is a widespread issue affecting many applications: see n8n Issue #18304, LiteLLM Issue #15097, and LangChain.js Issue #9205.

Anthropic's official guidance now states: "You usually only need to use temperature" and designates top_p as "recommended for advanced use cases only." See: https://docs.anthropic.com/en/api/messages
**Current Workaround:**

Users can temporarily switch back to Claude 3.7 Sonnet, which accepts both parameters:

```ini
large_model = claude-3-7-sonnet-20250219
fast_model = claude-3-7-sonnet-20250219
```
**Attempted User-Side Fixes (All Failed):**

- Commenting out or deleting the ai_top_p line: QualCoder regenerates the line in config.ini
- Setting ai_top_p = None: causes `ValueError: could not convert string to float: 'None'`
- Leaving ai_top_p empty: QualCoder writes `ai_top_p = .`, which causes `ValueError: could not convert string to float: '.'`

The configuration-file approach doesn't work because QualCoder's code actively maintains these parameters.
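The ValueError cases above come from feeding the raw config string straight into float(). A defensive parser along these lines (parse_optional_float is a hypothetical helper, not QualCoder's actual code) would treat those values as "unset" instead of crashing:

```python
def parse_optional_float(raw):
    """Return a float, or None for values that should mean 'unset'.

    Covers the cases reported above: 'None', an empty string, and '.'
    all raise ValueError under a bare float() call.
    """
    if raw is None:
        return None
    raw = raw.strip()
    if raw in ('', 'None', 'none', '.'):
        return None
    try:
        return float(raw)
    except ValueError:
        return None  # any other malformed value is also treated as unset
```

With a parser like this, an empty or deleted ai_top_p entry would simply cause the parameter to be omitted from the API request.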
**Suggested Fix:**

The issue is likely in qualcoder/ai_llm.py, where the langchain ChatAnthropic client is initialized. The code probably looks something like:

```python
llm = ChatAnthropic(
    model=model_name,
    temperature=float(config['ai_temperature']),
    top_p=float(config['ai_top_p']),  # This causes the problem
    ...
)
```
Recommended solution: check the model version and conditionally omit top_p for Claude 4.x models:

```python
llm_params = {
    'model': model_name,
    'temperature': float(config['ai_temperature']),
    'api_key': api_key,
    'base_url': base_url,
    # ... other required params
}

# Only add top_p for Claude models that support it.
# Claude 4.x models (4.1 Opus, 4.5 Sonnet, 4.5 Haiku) reject dual parameters.
model_lower = model_name.lower()
if not any(x in model_lower for x in ['claude-4', 'claude-sonnet-4', 'claude-haiku-4', 'claude-opus-4']):
    if config.get('ai_top_p') and config['ai_top_p'] not in [None, '', '.']:
        try:
            llm_params['top_p'] = float(config['ai_top_p'])
        except ValueError:
            pass  # Skip invalid top_p values

llm = ChatAnthropic(**llm_params)
```
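The same gating logic can be factored into a small, dependency-free helper so it is unit-testable without instantiating ChatAnthropic (build_llm_kwargs is a hypothetical name, not an existing QualCoder function):

```python
# Substrings identifying Claude 4.x models, which reject dual sampling parameters.
CLAUDE_4_MARKERS = ('claude-4', 'claude-sonnet-4', 'claude-haiku-4', 'claude-opus-4')

def build_llm_kwargs(model_name, temperature, top_p_raw):
    """Build keyword arguments for the chat client, dropping top_p
    for Claude 4.x models so temperature and top_p are never sent together."""
    kwargs = {'model': model_name, 'temperature': temperature}
    is_claude_4 = any(marker in model_name.lower() for marker in CLAUDE_4_MARKERS)
    if not is_claude_4 and top_p_raw not in (None, '', '.'):
        try:
            kwargs['top_p'] = float(top_p_raw)
        except ValueError:
            pass  # skip invalid top_p values
    return kwargs
```

The resulting dict can then be splatted into the client constructor alongside the api_key and base_url entries.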
Alternative simpler fix: just remove top_p entirely from the Anthropic integration. According to Anthropic's documentation, temperature alone is sufficient for most use cases, and top_p is only for advanced scenarios. Since QualCoder's AI features work well with temperature alone, omitting top_p wouldn't impact functionality:

```python
llm = ChatAnthropic(
    model=model_name,
    temperature=float(config['ai_temperature']),
    # Remove the top_p parameter entirely for Anthropic models
    ...
)
```
**Additional Context:**

- This only affects Anthropic Claude models; OpenAI and other providers continue to accept both parameters
- The latest langchain-anthropic library (v0.3+) has updated defaults for Claude 4.x models, where top_p defaults to null instead of -1, but QualCoder may need to update dependencies or handle this explicitly
- Anthropic's reasoning for the restriction is that combining temperature and top_p creates unpredictable sampling behavior
**Impact:**

This prevents users from accessing Claude 4.5 Sonnet, which is Anthropic's most capable model as of October 2025. It significantly impacts the quality of AI-assisted qualitative coding features.
Thank you for considering this fix! The Qualcoder AI integration is fantastic, and Claude 4.5 support would make it even better for qualitative researchers.
