Investigation of the logs reveals a 500 error from the Continuum provider for gpt-oss-120b:
Provider continuum returned status 500 Internal Server Error: {"error":{"message":"Unexpected token 200002 while expecting start token 200006","type":"Internal Server Error","param":null,"code":500}}
Context:
- Occurred in opensecret::web::openai.
- Model: gpt-oss-120b.
- Timestamp: 2025-11-22T21:19:29.201254Z.
- This error caused the intent classification/query extraction step to fail and fall back to the original user message, as seen in the logs:
WARN opensecret::web::responses::handlers: Query extraction failed, using original message: InternalServerError.
This appears to be an upstream provider issue with tokenization or generation state: the token IDs in the error look like special tokens from gpt-oss's structured response format, suggesting the provider's parser received output that did not begin with the expected start token.
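The fallback behavior observed in the logs can be sketched as follows. All names and signatures here are hypothetical illustrations, not the actual code in opensecret::web::responses::handlers; the real handler calls the upstream provider, which returned the 500 above.

```rust
/// Hypothetical error type standing in for the provider's 500 response.
#[derive(Debug)]
enum ExtractError {
    InternalServerError,
}

/// Hypothetical extraction call; here it simulates the upstream 500
/// ("Unexpected token ... while expecting start token ...").
fn extract_query(_msg: &str) -> Result<String, ExtractError> {
    Err(ExtractError::InternalServerError)
}

/// On extraction failure, log a warning and fall back to the original
/// message, mirroring the WARN line seen in the logs.
fn query_or_original(msg: &str) -> String {
    match extract_query(msg) {
        Ok(query) => query,
        Err(e) => {
            eprintln!("Query extraction failed, using original message: {:?}", e);
            msg.to_string()
        }
    }
}

fn main() {
    let q = query_or_original("what is my balance?");
    assert_eq!(q, "what is my balance?");
    println!("{}", q);
}
```

This pattern degrades gracefully: a provider outage reduces answer quality (no query refinement) rather than failing the whole request.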