When attempting to use the o1/o3-mini models with the plugin, requests fail with a 400 error due to the temperature parameter being unsupported.
Steps to reproduce:
- Configure the plugin with `gptModel = o1` or `gptModel = o3-mini` (see the config sketch after this list)
- Submit a code review request
- Request fails with HTTP 400 (see error below)
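For reference, a minimal gerrit.config sketch for the first step (the section name is assumed from the plugin name, and `gptToken` is a placeholder):

```
[plugin "chatgpt-code-review-gerrit-plugin"]
    gptToken = <your OpenAI API key>
    gptModel = o1    # or o3-mini
```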
Error response:
```json
{
  "error": {
    "message": "Unsupported parameter: 'temperature' is not supported with this model.",
    "type": "invalid_request_error",
    "param": "temperature",
    "code": "unsupported_parameter"
  }
}
```
Workaround: Removing the temperature parameter from the request allows the o1/o3-mini models to work properly.
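For illustration, a minimal sketch of the workaround, assuming the request body is assembled as JSON with Gson (the class, method, and constant names here are hypothetical, not the plugin's actual code):

```java
import com.google.gson.JsonArray;
import com.google.gson.JsonObject;

import java.util.Set;

public class ChatRequestBuilder {

  // Models known (from this report) to reject the temperature parameter.
  private static final Set<String> MODELS_WITHOUT_TEMPERATURE = Set.of("o1", "o3-mini");

  public static JsonObject buildBody(String model, double temperature, JsonArray messages) {
    JsonObject body = new JsonObject();
    body.addProperty("model", model);
    body.add("messages", messages);
    // Workaround: only include temperature for models that accept it.
    if (!MODELS_WITHOUT_TEMPERATURE.contains(model)) {
      body.addProperty("temperature", temperature);
    }
    return body;
  }
}
```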
I've done some quick testing in stateless mode after implementing the workaround, and the o1/o3-mini models seem to perform well for code reviews - noticeably better than 4o in my limited testing.
Also, exposing the `reasoning_effort` parameter as a plugin setting might be a good idea for these models.
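For example, the request body could look something like this (`reasoning_effort` accepts low/medium/high in the OpenAI API; how the plugin would expose it, e.g. via a new `gptReasoningEffort` setting, is just a suggestion):

```json
{
  "model": "o3-mini",
  "reasoning_effort": "high",
  "messages": [
    { "role": "user", "content": "Review the following patch set ..." }
  ]
}
```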