
Using Google Gemma models attempts OpenAI tool use #13

@MFA-X-AI

Description

Hi! I noticed some interesting behavior when using Gemma models with Xaibo.
With the default OpenRouter setup, the agent tries to use OpenAI-native tool calling instead of the tools provided by Xaibo.
The models I've tried are:

  • google/gemma-3n-e4b-it
  • google/gemma-3-4b-it
  • google/gemma-2-9b-it
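If it helps with reproduction, the failure appears independent of Xaibo itself: any OpenAI-style chat completion that includes a `tools` array seems to trigger it for these models. A minimal sketch of such a request follows (the tool schema here is illustrative, not Xaibo's actual schema):

```python
# Sketch of the kind of request Xaibo's OpenAI module ends up sending.
# The tool definition below is illustrative only.
def build_request_kwargs(model: str) -> dict:
    return {
        "model": model,
        "messages": [{"role": "user", "content": "What's the weather like?"}],
        # The presence of `tools` is what makes OpenRouter look for a
        # tool-capable endpoint; for Gemma models none exists, hence the 404.
        "tools": [{
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Illustrative tool definition",
                "parameters": {"type": "object", "properties": {}},
            },
        }],
    }
```

Passing these kwargs to `client.chat.completions.create(**kwargs)` on an `AsyncOpenAI` client pointed at `https://openrouter.ai/api/v1` reproduces the `NotFoundError` below; dropping the `tools` key makes the same call succeed.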

Error generating response from OpenAI: Error code: 404 - {'error': {'message': 'No endpoints found that support tool use. To learn more about provider routing, visit: https://openrouter.ai/docs/provider-routing', 'code': 404}}
Traceback (most recent call last):
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/core/exchange.py", line 281, in __call__
    result = await self._method(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/primitives/modules/llm/openai.py", line 215, in generate
    response: ChatCompletion = await self.client.chat.completions.create(**kwargs)
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py", line 2028, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1784, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1584, in request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'message': 'No endpoints found that support tool use. To learn more about provider routing, visit: https://openrouter.ai/docs/provider-routing', 'code': 404}}
Traceback (most recent call last):
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/core/exchange.py", line 281, in __call__
    result = await self._method(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/primitives/modules/orchestrator/stressing_tool_user.py", line 93, in handle_text
    llm_response = await self.llm.generate(conversation, options)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/core/exchange.py", line 281, in __call__
    result = await self._method(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/primitives/modules/llm/openai.py", line 215, in generate
    response: ChatCompletion = await self.client.chat.completions.create(**kwargs)
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py", line 2028, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1784, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1584, in request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'message': 'No endpoints found that support tool use. To learn more about provider routing, visit: https://openrouter.ai/docs/provider-routing', 'code': 404}}
Error handling non-streaming text request: Error code: 404 - {'error': {'message': 'No endpoints found that support tool use. To learn more about provider routing, visit: https://openrouter.ai/docs/provider-routing', 'code': 404}}
Traceback (most recent call last):
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/server/adapters/openai.py", line 218, in handle_non_streaming_request
    response = await agent.handle_text(last_user_message, entry_point=entry_point)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/core/agent.py", line 46, in handle_text
    await entry_module.handle_text(text)
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/core/exchange.py", line 281, in __call__
    result = await self._method(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/primitives/modules/orchestrator/stressing_tool_user.py", line 93, in handle_text
    llm_response = await self.llm.generate(conversation, options)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/core/exchange.py", line 281, in __call__
    result = await self._method(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/primitives/modules/llm/openai.py", line 215, in generate
    response: ChatCompletion = await self.client.chat.completions.create(**kwargs)
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py", line 2028, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1784, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1584, in request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'message': 'No endpoints found that support tool use. To learn more about provider routing, visit: https://openrouter.ai/docs/provider-routing', 'code': 404}}
Unexpected error in completion_request: Error code: 404 - {'error': {'message': 'No endpoints found that support tool use. To learn more about provider routing, visit: https://openrouter.ai/docs/provider-routing', 'code': 404}}
Traceback (most recent call last):
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/server/adapters/openai.py", line 102, in completion_request
    return await self.handle_non_streaming_request(data, last_user_message, conversation_id, conversation)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/server/adapters/openai.py", line 218, in handle_non_streaming_request
    response = await agent.handle_text(last_user_message, entry_point=entry_point)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/core/agent.py", line 46, in handle_text
    await entry_module.handle_text(text)
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/core/exchange.py", line 281, in __call__
    result = await self._method(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/primitives/modules/orchestrator/stressing_tool_user.py", line 93, in handle_text
    llm_response = await self.llm.generate(conversation, options)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/core/exchange.py", line 281, in __call__
    result = await self._method(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/primitives/modules/llm/openai.py", line 215, in generate
    response: ChatCompletion = await self.client.chat.completions.create(**kwargs)
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py", line 2028, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1784, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1584, in request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'message': 'No endpoints found that support tool use. To learn more about provider routing, visit: https://openrouter.ai/docs/provider-routing', 'code': 404}}
INFO:     127.0.0.1:57424 - "POST /openai/chat/completions HTTP/1.1" 500 Internal Server Error
ERROR:    Exception in ASGI application
Traceback (most recent call last):
  File "my-agent/.venv/lib/python3.12/site-packages/uvicorn/protocols/http/httptools_impl.py", line 409, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/uvicorn/middleware/proxy_headers.py", line 60, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "my-agent/.venv/lib/python3.12/site-packages/starlette/applications.py", line 112, in __call__
    await self.middleware_stack(scope, receive, send)
  File "my-agent/.venv/lib/python3.12/site-packages/starlette/middleware/errors.py", line 187, in __call__
    raise exc
  File "my-agent/.venv/lib/python3.12/site-packages/starlette/middleware/errors.py", line 165, in __call__
    await self.app(scope, receive, _send)
  File "my-agent/.venv/lib/python3.12/site-packages/starlette/middleware/base.py", line 176, in __call__
    with recv_stream, send_stream, collapse_excgroups():
  File "/usr/lib/python3.12/contextlib.py", line 158, in __exit__
    self.gen.throw(value)
  File "my-agent/.venv/lib/python3.12/site-packages/starlette/_utils.py", line 82, in collapse_excgroups
    raise exc
  File "my-agent/.venv/lib/python3.12/site-packages/starlette/middleware/base.py", line 178, in __call__
    response = await self.dispatch_func(request, call_next)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/server/adapters/ui.py", line 133, in spa_fallback_middleware
    response = await call_next(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/starlette/middleware/base.py", line 156, in call_next
    raise app_exc
  File "my-agent/.venv/lib/python3.12/site-packages/starlette/middleware/base.py", line 141, in coro
    await self.app(scope, receive_or_disconnect, send_no_error)
  File "my-agent/.venv/lib/python3.12/site-packages/starlette/middleware/cors.py", line 85, in __call__
    await self.app(scope, receive, send)
  File "my-agent/.venv/lib/python3.12/site-packages/starlette/middleware/exceptions.py", line 62, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "my-agent/.venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    raise exc
  File "my-agent/.venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
    await app(scope, receive, sender)
  File "my-agent/.venv/lib/python3.12/site-packages/starlette/routing.py", line 714, in __call__
    await self.middleware_stack(scope, receive, send)
  File "my-agent/.venv/lib/python3.12/site-packages/starlette/routing.py", line 734, in app
    await route.handle(scope, receive, send)
  File "my-agent/.venv/lib/python3.12/site-packages/starlette/routing.py", line 288, in handle
    await self.app(scope, receive, send)
  File "my-agent/.venv/lib/python3.12/site-packages/starlette/routing.py", line 76, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "my-agent/.venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    raise exc
  File "my-agent/.venv/lib/python3.12/site-packages/starlette/_exception_handler.py", line 42, in wrapped_app
    await app(scope, receive, sender)
  File "my-agent/.venv/lib/python3.12/site-packages/starlette/routing.py", line 73, in app
    response = await f(request)
               ^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 301, in app
    raw_response = await run_endpoint_function(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/fastapi/routing.py", line 212, in run_endpoint_function
    return await dependant.call(**values)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/server/adapters/openai.py", line 102, in completion_request
    return await self.handle_non_streaming_request(data, last_user_message, conversation_id, conversation)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/server/adapters/openai.py", line 218, in handle_non_streaming_request
    response = await agent.handle_text(last_user_message, entry_point=entry_point)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/core/agent.py", line 46, in handle_text
    await entry_module.handle_text(text)
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/core/exchange.py", line 281, in __call__
    result = await self._method(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/primitives/modules/orchestrator/stressing_tool_user.py", line 93, in handle_text
    llm_response = await self.llm.generate(conversation, options)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/core/exchange.py", line 281, in __call__
    result = await self._method(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/xaibo/primitives/modules/llm/openai.py", line 215, in generate
    response: ChatCompletion = await self.client.chat.completions.create(**kwargs)
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py", line 2028, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1784, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "my-agent/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1584, in request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'message': 'No endpoints found that support tool use. To learn more about provider routing, visit: https://openrouter.ai/docs/provider-routing', 'code': 404}}

I hope you can look into why this happens specifically for Gemma models; they're one of the few Google model families that are open.
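As a possible stopgap, the LLM module could drop the native tool parameters for models that OpenRouter cannot route to a tool-capable endpoint, falling back to whatever prompt-based tool handling Xaibo provides. A sketch, assuming a hardcoded model list (the helper name and prefix list are mine, not part of Xaibo):

```python
# Hypothetical helper: strip OpenAI-style tool kwargs for models that
# OpenRouter has no tool-capable endpoints for (e.g. the Gemma family).
# The prefix list is illustrative and would need to be maintained.
NO_TOOL_USE_PREFIXES = ("google/gemma-",)

def strip_unsupported_tool_kwargs(kwargs: dict) -> dict:
    """Return a copy of chat-completion kwargs without the tool fields
    when the target model is known to lack native tool-use support."""
    model = kwargs.get("model", "")
    if model.startswith(NO_TOOL_USE_PREFIXES):
        return {k: v for k, v in kwargs.items()
                if k not in ("tools", "tool_choice")}
    return kwargs
```

With this applied just before `client.chat.completions.create(**kwargs)`, Gemma requests would go through without the `tools` array instead of failing with the 404, while tool-capable models are unaffected.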
