
o1/o3-mini models fail with temperature parameter not supported error #1

@Ziyann

Description


When attempting to use the o1/o3-mini models with the plugin, requests fail with a 400 error due to the temperature parameter being unsupported.

Steps to reproduce:

  1. Configure the plugin with gptModel = o1 or gptModel = o3-mini
  2. Submit a code review request
  3. Request fails with HTTP 400 (see error below)

Error response:

{
  "error": {
    "message": "Unsupported parameter: 'temperature' is not supported with this model.",
    "type": "invalid_request_error",
    "param": "temperature",
    "code": "unsupported_parameter"
  }
}

Workaround: Removing the temperature parameter from the request allows the o1/o3-mini models to work properly.
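The workaround could be implemented by conditionally omitting the parameter. A minimal sketch (the model set, function name, and default temperature are assumptions for illustration, not the plugin's actual code):

```python
# Hypothetical sketch: omit `temperature` for reasoning models, which
# reject it with a 400, while keeping it for other models.
REASONING_MODELS = {"o1", "o1-mini", "o3-mini"}

def build_request(model: str, messages: list, temperature: float = 0.2) -> dict:
    payload = {"model": model, "messages": messages}
    if model not in REASONING_MODELS:
        # Only non-reasoning models accept a temperature setting.
        payload["temperature"] = temperature
    return payload
```

With this in place, `build_request("o3-mini", msgs)` produces a payload without `temperature`, while `build_request("gpt-4o", msgs)` keeps it.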

I've done some quick testing in stateless mode after implementing the workaround, and the o1/o3-mini models seem to perform pretty well for code reviews - noticeably better than 4o in my limited testing.

Also, exposing the reasoning_effort parameter might be a good idea for these models.
