
It fails on [completions] or [chat/completions] with a JSONDecodeError #10

@Davidcake

Description


After running [litellm --model ollama/xxx], the proxy works fine on the Swagger URL, and [GET /models] also works well. But it fails on [completions] or [chat/completions] with:

{
  "error": {
    "message": "Expecting value: line 1 column 1 (char 0)\n\nTraceback (most recent call last):\n  File \"/home/ubuntu/.local/lib/python3.8/site-packages/litellm/proxy/proxy_server.py\", line 1456, in completion\n    data = json.loads(body_str)\n  File \"/usr/lib/python3.8/json/__init__.py\", line 357, in loads\n    return _default_decoder.decode(s)\n  File \"/usr/lib/python3.8/json/decoder.py\", line 337, in decode\n    obj, end = self.raw_decode(s, idx=_w(s, 0).end())\n  File \"/usr/lib/python3.8/json/decoder.py\", line 355, in raw_decode\n    raise JSONDecodeError(\"Expecting value\", s, err.value) from None\njson.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)\n",
    "type": "None",
    "param": "None",
    "code": 500
  }
}
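For what it's worth, "Expecting value: line 1 column 1 (char 0)" is exactly what Python's `json.loads` raises when handed an empty string or a non-JSON body, which suggests the request reaching the proxy's `completion` handler had no JSON payload. A minimal sketch that mimics the failing `json.loads(body_str)` call from the traceback (the `parse_body` helper and the error-dict shape are illustrative, not litellm's actual code):

```python
import json

def parse_body(body_str: str):
    # Mimics the proxy's json.loads(body_str) call seen in the traceback.
    try:
        return json.loads(body_str)
    except json.JSONDecodeError as e:
        # Illustrative error shape, loosely modeled on the 500 response above.
        return {"error": {"message": str(e), "code": 500}}

# An empty body reproduces "Expecting value: line 1 column 1 (char 0)".
print(parse_body(""))

# A well-formed JSON body parses without error.
print(parse_body('{"model": "ollama/xxx", "messages": [{"role": "user", "content": "hi"}]}'))
```

So a first thing to check is whether the client is actually sending a JSON body (and a `Content-Type: application/json` header) on the POST to `/chat/completions`.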
