error.txt
Running OpenRouter baseline:
task = classify
model = meta-llama/llama-3.3-70b-instruct:free
python : Traceback (most recent call last):
At line:1 char:1
+ python baseline/run_baseline.py --task classify --scenario-id multimo ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (Traceback (most recent call last)::String) [], RemoteException
    + FullyQualifiedErrorId : NativeCommandError
  File "C:\Users\SOHAM\OneDrive\Desktop\VerifAI\baseline\run_baseline.py", line 283, in <module>
    main()
    ~~~~^^
  File "C:\Users\SOHAM\OneDrive\Desktop\VerifAI\baseline\run_baseline.py", line 251, in main
    result = run_baseline_episode(
        task_name=args.task,
        ...<2 lines>...
        api_key=args.api_key,
    )
  File "C:\Users\SOHAM\OneDrive\Desktop\VerifAI\baseline\run_baseline.py", line 148, in run_baseline_episode
    agent_text = _generate_with_retry(client, active_model, messages)
  File "C:\Users\SOHAM\OneDrive\Desktop\VerifAI\baseline\run_baseline.py", line 72, in _generate_with_retry
    response = client.chat.completions.create(
        model=model,
        ...<2 lines>...
        max_tokens=600,
    )
  File "C:\Users\SOHAM\AppData\Roaming\Python\Python314\site-packages\openai\_utils\_utils.py", line 286, in wrapper
    return func(*args, **kwargs)
  File "C:\Users\SOHAM\AppData\Roaming\Python\Python314\site-packages\openai\resources\chat\completions\completions.py", line 1192, in create
    return self._post(
        "/chat/completions",
        ...<47 lines>...
        stream_cls=Stream[ChatCompletionChunk],
    )
  File "C:\Users\SOHAM\AppData\Roaming\Python\Python314\site-packages\openai\_base_client.py", line 1259, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "C:\Users\SOHAM\AppData\Roaming\Python\Python314\site-packages\openai\_base_client.py", line 1047, in request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': 'Provider returned error', 'code': 400, 'metadata': {'raw': '{\n  "error": {\n    "code": 400,\n    "message": "Developer instruction is not enabled for models/gemma-3-4b-it",\n    "status": "INVALID_ARGUMENT"\n  }\n}\n', 'provider_name': 'Google AI Studio', 'is_byok': False}}, 'user_id': 'user_3BXWV5K1G47MvV55TVGSJP1vKrh'}
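Note on the error above: the provider ("Google AI Studio") is rejecting the request because system/developer-role messages are not enabled for that model. One common workaround is to fold any system prompt into the first user message before calling the API. The sketch below is illustrative only, assuming a standard chat `messages` list; `flatten_system_messages` is a hypothetical helper name, not a function from run_baseline.py or the openai package.

```python
def flatten_system_messages(messages):
    """Merge any system/developer messages into the first user message.

    Some providers return HTTP 400 ("Developer instruction is not enabled")
    when a request contains system- or developer-role messages for models
    that do not support them; prepending that text to the first user
    message is a common fallback.
    """
    system_parts = [
        m["content"] for m in messages if m["role"] in ("system", "developer")
    ]
    rest = [m for m in messages if m["role"] not in ("system", "developer")]
    if system_parts and rest and rest[0]["role"] == "user":
        prefix = "\n\n".join(system_parts)
        rest[0] = {"role": "user", "content": f"{prefix}\n\n{rest[0]['content']}"}
    return rest
```

If this approach fits, the call in `_generate_with_retry` could pass `flatten_system_messages(messages)` instead of `messages` (or retry with the flattened list after catching `openai.BadRequestError`); whether that preserves the intended behavior depends on how the baseline builds its prompts.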