If I choose to use "Assist" or "Custom Conversation LLM API", my llama.cpp server returns an error: Failed to parse tools: [json.exception.out_of_range.403] key 'description' not found; tools = ...
I can't imagine what I might be doing wrong here, as the "No control" option works fine, and the extended_openai_conversation integration also works.
If relevant, I'm running:
llama-server --verbose -t 32 --model Qwen3-4B-Instruct-2507-UD-Q8_K_XL.gguf --jinja -c 16000
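For context on what the error is complaining about: llama.cpp's `--jinja` chat templates can index each tool's `description` field directly, so a tools array where some function omits `description` raises exactly this `json.exception.out_of_range` during template rendering. A minimal sketch of the shape involved (the tool name and helper function here are illustrative assumptions, not the actual payload Assist sends):

```python
import json

# A tool in the OpenAI function-calling shape, but with no "description"
# key on the function object -- the kind of entry that can trigger
# "key 'description' not found" when the Jinja template dereferences it.
tool_missing_desc = {
    "type": "function",
    "function": {
        "name": "turn_on_light",  # hypothetical tool name
        "parameters": {"type": "object", "properties": {}},
        # note: no "description" key here
    },
}

def has_descriptions(tools):
    """Return True only if every function tool carries a 'description' key."""
    return all("description" in t.get("function", {}) for t in tools)

print(has_descriptions([tool_missing_desc]))  # False

# The same tool with a description added passes the check.
tool_ok = json.loads(json.dumps(tool_missing_desc))
tool_ok["function"]["description"] = "Turn on a light entity."
print(has_descriptions([tool_ok]))  # True
```

So the difference between "No control" and "Assist" here is likely just that "No control" sends no tools at all, while "Assist" sends a tools array whose shape the server-side template rejects.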