
[BUG] max_tokens parameter error when evaluating with OpenAI models #32

@JoonYong-Park

Description


Describe the bug
Evaluation runs fine with gpt-4o-mini, but fails with a 400 Bad Request when using the new gpt-5-nano-2025-08-07 model.

Root Cause
According to the official documentation, the max_tokens parameter is deprecated in the Chat Completions API and is not supported by newer models (the o-series and the gpt-5 family). For these models the API requires max_completion_tokens instead.
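A minimal sketch of how the backend could select the correct parameter per model. The helper name and the model-prefix list are assumptions for illustration (inferred from the error message, not an exhaustive OpenAI rule), not code from this repo:

```python
def build_token_params(model: str, limit: int) -> dict:
    """Return the token-limit keyword expected by the given model.

    Hypothetical helper: the prefix list below is an assumption based on
    the 400 error above, not an official or exhaustive list.
    """
    # Newer models (o-series, gpt-5 family) reject 'max_tokens' and
    # require 'max_completion_tokens' instead.
    newer_prefixes = ("o1", "o3", "o4", "gpt-5")
    if model.startswith(newer_prefixes):
        return {"max_completion_tokens": limit}
    return {"max_tokens": limit}
```

The result can be splatted into the request, e.g. client.chat.completions.create(model=model, messages=msgs, **build_token_params(model, 256)), so older models keep working unchanged.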

Error Log

httpx.HTTPStatusError: Client error '400 Bad Request' for url 'https://api.openai.com/v1/chat/completions'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400
ERROR - openai_backend - OpenAI attempt 3/3 failed: Error code: 400 - {'error': {'message': "Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.", 'type': 'invalid_request_error', 'param': 'max_tokens', 'code': 'unsupported_parameter'}}

Official Reference

(Screenshot of the official API documentation describing the max_tokens deprecation.)
