
Bug: Cache fails on new version of OpenAI API #5

@devjeetr

Description

The code for persisting the response cache fails in two places:

  1. `json.dump(config.codex_query_response_log, f)`
  2. `json.dump(config.codex_query_response_log, f)`

This happens because the response object returned by OpenAI is a pydantic `BaseModel`, which is not JSON serializable by default.
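
For illustration, here is a minimal offline repro; the response is built by hand with placeholder values rather than an API call, and the single-entry cache dict is an assumption about the log's shape:

```python
# Offline repro: hand-construct a ChatCompletion (placeholder values),
# then try to persist a cache entry containing it with json.dump.
import json

from openai.types.chat import ChatCompletion, ChatCompletionMessage
from openai.types.chat.chat_completion import Choice

response = ChatCompletion(
    id="chatcmpl-placeholder",
    choices=[
        Choice(
            finish_reason="stop",
            index=0,
            logprobs=None,
            message=ChatCompletionMessage(role="assistant", content="hi"),
        )
    ],
    created=0,
    model="gpt-4o-mini",
    object="chat.completion",
)

# Hypothetical cache shape: key -> (key, response, timestamp).
codex_query_response_log = {"some-key": ("some-key", response, 0.0)}

with open("cache.json", "w") as f:
    # Raises: TypeError: Object of type ChatCompletion is not JSON serializable
    json.dump(codex_query_response_log, f)
```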

It's unclear whether a specific version of the OpenAI client library is expected beyond what's specified in requirements.txt. I tested with openai==1.78.1, but I believe this failure will occur with any recent version of the openai client that returns responses as pydantic models.

Possible Fix

  1. Replace `response` with `response.model_dump()` when it is a pydantic object in
    `v = (k, response, current_time)`
  2. Parse the stored dict back into a `ChatCompletion` when reading it in
    `resp = config.codex_query_response_log[str(k)][1]`

    This is needed because call sites expect the response to be a structured object. A sketch of both changes follows the list.
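
A minimal sketch of both changes, assuming pydantic v2-style `model_dump`/`model_validate` (which the openai 1.x models expose); the helper names here are hypothetical, and the repo's real integration points will differ:

```python
# Sketch of the fix: dump pydantic responses to plain dicts on write,
# re-validate them into ChatCompletion objects on read.
import time

from openai.types.chat import ChatCompletion
from pydantic import BaseModel


def cache_response(log: dict, key: str, response) -> None:
    # Fix 1: store a plain dict instead of the raw pydantic object,
    # so a later json.dump(log, f) succeeds.
    if isinstance(response, BaseModel):
        response = response.model_dump()
    log[str(key)] = (key, response, time.time())


def load_response(log: dict, key: str) -> ChatCompletion:
    # Fix 2: re-validate the stored dict into a ChatCompletion, since
    # call sites expect a structured object with attribute access.
    raw = log[str(key)][1]
    if isinstance(raw, dict):
        return ChatCompletion.model_validate(raw)
    return raw
```

With this round-trip in place, `json.dump` accepts the cached log, and readers still get attribute access such as `resp.choices[0].message.content`.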

I have a PR implementing this that I will attach here.
