[Bug] #22

@pro-nub

Description

After spending several days trying to integrate picoLM into picoclaw, I have been unable to get it working. The provided manual is misleading and appears to have been written carelessly, which makes it difficult to follow.

This part of the manual should be removed. It does not work at all.

{
  "agents": {
    "defaults": {
      "provider": "picolm",
      "model": "picolm-local"
    }
  },
  "providers": {
    "picolm": {
      "binary": "/.picolm/bin/picolm",
      "model": "/.picolm/models/tinyllama-1.1b-chat-v1.0.Q4_K_M.gguf",
      "max_tokens": 256,
      "threads": 4,
      "template": "chatml"
    }
  }
}

So, out of frustration, I installed it through the one-liner installation script. Now it gives a different error.

2026/03/07 17:33:26 [2026-03-07T04:33:26Z] [INFO] agent: Created implicit main agent (no agents.list configured)
2026/03/07 17:33:26 [2026-03-07T04:33:26Z] [INFO] agent: Agent initialized {tools_count=12, skills_total=7, skills_available=7}
2026/03/07 17:33:26 [2026-03-07T04:33:26Z] [INFO] agent: Processing message from cli:cron: What is the meaning of life? {session_key=cli:default, channel=cli, chat_id=direct, sender_id=cron}
2026/03/07 17:33:26 [2026-03-07T04:33:26Z] [INFO] agent: Routed message {agent_id=main, session_key=agent:main:main, matched_by=default}
2026/03/07 17:33:27 [2026-03-07T04:33:27Z] [ERROR] agent: LLM call failed {agent_id=main, iteration=1, error=API request failed:
Status: 400
Body: {"error":{"message":"property 'prompt_cache_key' is unsupported","type":"invalid_request_error"}}
}
Error: error processing message: LLM call failed after retries: API request failed:
Status: 400
Body: {"error":{"message":"property 'prompt_cache_key' is unsupported","type":"invalid_request_error"}}

Usage:
picoclaw agent [flags]

Flags:
-d, --debug Enable debug logging
-h, --help help for agent
-m, --message string Send a single message (non-interactive mode)
--model string Model to use
-s, --session string Session key (default "cli:default")


In particular, the message "property 'prompt_cache_key' is unsupported" appears constantly when I try to connect to a variety of LLM API service providers.
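For context, a plausible reading of this error (not confirmed against picoclaw's source) is that the client sends an OpenAI-style request body containing `prompt_cache_key`, and some strict OpenAI-compatible backends reject any property they do not recognize with HTTP 400. A minimal sketch of a client-side workaround, assuming you can intercept the payload before it is sent (the key set and function names below are hypothetical, not part of picoclaw):

```python
# Hypothetical workaround sketch: strip request properties that a strict
# OpenAI-compatible backend rejects with "property '...' is unsupported".
# The set of keys to remove depends on your backend and is an assumption here.
UNSUPPORTED_KEYS = {"prompt_cache_key"}

def sanitize_payload(payload: dict) -> dict:
    """Return a copy of the request body with unsupported properties removed."""
    return {k: v for k, v in payload.items() if k not in UNSUPPORTED_KEYS}

payload = {
    "model": "picolm-local",
    "messages": [{"role": "user", "content": "What is the meaning of life?"}],
    "prompt_cache_key": "session-123",  # the field the backend rejects
}
clean = sanitize_payload(payload)
# The sanitized body keeps everything except the offending key.
assert "prompt_cache_key" not in clean
assert clean["model"] == "picolm-local"
```

If the request is built inside picoclaw itself, the real fix would presumably be an option (or a patch) that omits this field for providers that do not support it.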
