I have a script like this:
import os
from pathlib import Path
from openai import OpenAI
from dotenv import load_dotenv

load_dotenv()

api_key = os.environ.get("LITELLM_API_KEY")
base_url = os.environ.get("LITELLM_BASE_URL")

# Create openai client using litellm-proxy
client = OpenAI(base_url=base_url, api_key=api_key)

# Upload file to gemini
pdf_path = Path("SOME_FILE.pdf")
with pdf_path.open("rb") as f:
    pdf_file = client.files.create(
        file=f,
        purpose="user_data",
        extra_body={"custom_llm_provider": "gemini"}
    )

# Delete file from gemini
client.files.delete(
    file_id=pdf_file.id,
    extra_body={"custom_llm_provider": "gemini"}
)

The upload works, but deleting the file fails with the following error:
Traceback (most recent call last):
File "XXX/test_delete_file_litellm.py", line 23, in <module>
client.files.delete(
File "XXX/.venv/lib/python3.12/site-packages/openai/resources/files.py", line 247, in delete
return self._delete(
^^^^^^^^^^^^^
File "XXX/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1292, in delete
return self.request(cast_to, opts)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "XXX/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1044, in request
raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "litellm.BadRequestError: LiteLLM doesn't support gemini for 'create_batch'. Only 'openai' is supported.", 'type': None, 'param': None, 'code': '400'}}
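
Note that the error mentions 'create_batch' even though the call is files.delete, so the proxy appears to route the file delete through a code path that only supports openai. A possible workaround until this is fixed is to delete the file directly with the google-genai SDK, bypassing the proxy. This is an untested sketch; it assumes the id returned by the proxy maps to the Gemini file name, which may need to be looked up via files.list() first:

# Untested workaround sketch: delete the file straight from Gemini with the
# google-genai SDK instead of going through the LiteLLM proxy.
import os
from google import genai

gemini_client = genai.Client(api_key=os.environ["GEMINI_API_KEY"])

# Find the file that was uploaded through the proxy; the Gemini file name
# (e.g. "files/abc123") may differ from the id the proxy returned.
for f in gemini_client.files.list():
    print(f.name, f.display_name)

# Delete it by its Gemini file name (placeholder value).
gemini_client.files.delete(name="files/abc123")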
My litellm_config.yaml is:
model_list:
  - model_name: "gemini-2.5-flash"
    litellm_params:
      model: gemini/gemini-2.5-flash
      api_key: os.environ/GEMINI_API_KEY

files_settings:
  - custom_llm_provider: gemini
    api_key: os.environ/GEMINI_API_KEY

general_settings:
  master_key: sk-1234
  database_url: "postgres://postgres:postgres@pg:5432/litellm"
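
For comparison, running the same operations through the LiteLLM Python SDK directly (no proxy) might help narrow down whether the problem is in the proxy routing or in the SDK itself. This is an untested sketch; it assumes litellm.create_file and litellm.file_delete are the right entry points and that they accept an api_key override:

# Untested sketch: same upload/delete flow via the LiteLLM SDK, no proxy.
# Assumes litellm.create_file / litellm.file_delete with these signatures.
import os
import litellm

with open("SOME_FILE.pdf", "rb") as f:
    uploaded = litellm.create_file(
        file=f,
        purpose="user_data",
        custom_llm_provider="gemini",
        api_key=os.environ["GEMINI_API_KEY"],
    )

litellm.file_delete(
    file_id=uploaded.id,
    custom_llm_provider="gemini",
    api_key=os.environ["GEMINI_API_KEY"],
)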