Description
System Info
- Intel Mac 2019
- Conda
- latest llama-stack / llama-stack model from repo with together.ai
- llama-stack-apps latest from repo
Information
- The official example scripts
- My own modified scripts
🐛 Describe the bug
I try to run the examples and they fail, even though the API itself is functional.
This might be caused by:
llamastack/llama-stack-client-python#92
Error logs
(stack-app-env) user@MonkeyPro llama-stack-apps % python -m examples.agents.hello localhost 5001
Traceback (most recent call last):
File "/opt/anaconda3/envs/stack-app-env/lib/python3.10/runpy.py", line 196, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/opt/anaconda3/envs/stack-app-env/lib/python3.10/runpy.py", line 86, in _run_code
exec(code, run_globals)
File "/Users/user/local/llama-stack-apps/examples/agents/hello.py", line 95, in <module>
fire.Fire(main)
File "/opt/anaconda3/envs/stack-app-env/lib/python3.10/site-packages/fire/core.py", line 135, in Fire
component_trace = _Fire(component, args, parsed_flag_args, context, name)
File "/opt/anaconda3/envs/stack-app-env/lib/python3.10/site-packages/fire/core.py", line 468, in _Fire
component, remaining_args = _CallAndUpdateTrace(
File "/opt/anaconda3/envs/stack-app-env/lib/python3.10/site-packages/fire/core.py", line 684, in _CallAndUpdateTrace
component = fn(*varargs, **kwargs)
File "/Users/user/local/llama-stack-apps/examples/agents/hello.py", line 30, in main
available_shields = [shield.identifier for shield in client.shields.list()]
File "/opt/anaconda3/envs/stack-app-env/lib/python3.10/site-packages/llama_stack_client/resources/shields.py", line 114, in list
return self._get(
File "/opt/anaconda3/envs/stack-app-env/lib/python3.10/site-packages/llama_stack_client/_base_client.py", line 1209, in get
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
File "/opt/anaconda3/envs/stack-app-env/lib/python3.10/site-packages/llama_stack_client/_base_client.py", line 955, in request
return self._request(
File "/opt/anaconda3/envs/stack-app-env/lib/python3.10/site-packages/llama_stack_client/_base_client.py", line 1058, in _request
raise self._make_status_error_from_response(err.response) from None
llama_stack_client.NotFoundError: Error code: 404 - {'detail': 'Not Found'}
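Until the shields route stops returning 404, the failing call at hello.py line 30 can be guarded so the example proceeds without shields. This is a sketch of a workaround, not a fix; it assumes the default `LlamaStackClient(base_url=...)` constructor and that `llama_stack_client` exports `NotFoundError` at top level, as the traceback suggests:

```python
from llama_stack_client import LlamaStackClient, NotFoundError

client = LlamaStackClient(base_url="http://localhost:5001")

try:
    available_shields = [shield.identifier for shield in client.shields.list()]
except NotFoundError:
    # Server returns 404 for the shields route; continue with no shields
    # so the rest of the agent example can still run.
    available_shields = []

print(available_shields)
```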
Expected behavior
The agents hello module runs successfully.
NOTE: when I run the following, I get a successful response from the llama-stack:
(stack-app-env) user@MonkeyPro llama-stack % curl http://localhost:5001/v1/models
{"data":[{"identifier":"all-MiniLM-L6-v2","provider_resource_id":"all-MiniLM-L6-v2","provider_id":"sentence-transformers","type":"model","metadata":{"embedding_dimension":384},"model_type":"embedding"},{"identifier":"meta-llama/Llama-3.1-405B-Instruct-FP8","provider_resource_id":"meta-llama/Meta-Llama-3.1-405B-Instruct-Turbo","provider_id":"together","type":"model","metadata":{},"model_type":"llm"},{"identifier":"meta-llama/Llama-3.1-70B-Instruct","provider_resource_id":"meta-llama/Meta-Llama-3.1-70B-Instruct-Turbo","provider_id":"together","type":"model","metadata":{},"model_type":"llm"},{"identifier":"meta-llama/Llama-3.1-8B-Instruct","provider_resource_id":"meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo","provider_id":"together","type":"model","metadata":{},"model_type":"llm"},{"identifier":"meta-llama/Llama-3.2-11B-Vision-Instruct","provider_resource_id":"meta-llama/Llama-3.2-11B-Vision-Instruct-Turbo","provider_id":"together","type":"model","metadata":{},"model_type":"llm"},{"identifier":"meta-llama/Llama-3.2-3B-Instruct","provider_resource_id":"meta-llama/Llama-3.2-3B-Instruct-Turbo","provider_id":"together","type":"model","metadata":{},"model_type":"llm"},{"identifier":"meta-llama/Llama-3.2-90B-Vision-Instruct","provider_resource_id":"meta-llama/Llama-3.2-90B-Vision-Instruct-Turbo","provider_id":"together","type":"model","metadata":{},"model_type":"llm"},{"identifier":"meta-llama/Llama-3.3-70B-Instruct","provider_resource_id":"meta-llama/Llama-3.3-70B-Instruct-Turbo","provider_id":"together","type":"model","metadata":{},"model_type":"llm"},{"identifier":"meta-llama/Llama-Guard-3-11B-Vision","provider_resource_id":"meta-llama/Llama-Guard-3-11B-Vision-Turbo","provider_id":"together","type":"model","metadata":{},"model_type":"llm"},{"identifier":"meta-llama/Llama-Guard-3-8B","provider_resource_id":"meta-llama/Meta-Llama-Guard-3-8B","provider_id":"together","type":"model","metadata":{},"model_type":"llm"}]}%
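For what it's worth, the `/v1/models` response above already lists the Llama Guard models that shields are normally backed by, so the model registry side looks healthy and the 404 seems specific to the shields route. A small parsing sketch over two entries trimmed from that response:

```python
import json

# Two entries trimmed from the /v1/models response above.
models_response = '''
{"data":[
  {"identifier":"meta-llama/Llama-Guard-3-11B-Vision","provider_id":"together","type":"model","model_type":"llm"},
  {"identifier":"meta-llama/Llama-Guard-3-8B","provider_id":"together","type":"model","model_type":"llm"}
]}
'''

# Filter for the registered Llama Guard models.
guard_models = [
    m["identifier"]
    for m in json.loads(models_response)["data"]
    if "Guard" in m["identifier"]
]
print(guard_models)
```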