
Conversation

@ahyatt ahyatt (Owner) commented Dec 24, 2025

This should fix #224

@ahyatt ahyatt temporarily deployed to Continuous Integration December 24, 2025 16:39 — with GitHub Actions Inactive
@ahyatt ahyatt requested a review from Copilot December 24, 2025 16:39

Copilot AI left a comment

Pull request overview

This PR fixes two related issues in Ollama's streaming tool call implementation: incorrect data handling in the tool collection method and incorrect capability reporting.

Key Changes:

  • Fixed llm-provider-collect-streaming-tool-uses to handle the single tool call it receives instead of trying to map over it as a list (see the first sketch after this list)
  • Moved the streaming-tool-use capability so that it is reported only when the model actually supports tool-use (see the second sketch after this list)
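
A minimal sketch of the shape of the first fix, not the code merged in llm-ollama.el. It assumes the provider struct is named llm-ollama, that each streamed chunk carries a single tool call as an alist whose function entry holds the name and arguments, and it uses a throwaway sketch-tool-use record in place of the llm package's own tool-use objects.

```elisp
;; A sketch under the assumptions named above, not the merged code.
(require 'cl-lib)
(require 'llm-ollama)

;; Throwaway record standing in for the llm package's tool-use objects.
(cl-defstruct sketch-tool-use name args)

(cl-defmethod llm-provider-collect-streaming-tool-uses ((_ llm-ollama) data)
  ;; DATA is assumed to hold the one tool call in the current streamed
  ;; chunk, shaped roughly like
  ;;   ((function (name . "add") (arguments (a . 1) (b . 2)))).
  ;; The bug was treating DATA as a list of such calls and mapping over
  ;; it; the fix reads the single call and wraps the result in a list.
  (let ((func (alist-get 'function data)))
    (list (make-sketch-tool-use
           :name (alist-get 'name func)
           :args (alist-get 'arguments func)))))
```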
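
A similarly hedged sketch of the second fix. It assumes llm-capabilities is the generic through which a provider advertises what it supports (the capability symbols tool-use and streaming-tool-use come from this PR's description), and it substitutes a placeholder predicate for whatever check llm-ollama.el actually performs to decide that the configured model can call tools.

```elisp
;; A sketch under the assumptions named above, not the merged code.
(require 'llm-ollama)

(defun sketch-model-supports-tools-p (_provider)
  "Placeholder for the provider's real model-capability check."
  t)

(cl-defmethod llm-capabilities ((provider llm-ollama))
  ;; `streaming' here stands in for whatever baseline capabilities the
  ;; provider already reports unconditionally.
  (append '(streaming)
          ;; Report streaming-tool-use only alongside tool-use; before
          ;; the fix it was advertised even for models that cannot call
          ;; tools at all.
          (when (sketch-model-supports-tools-p provider)
            '(tool-use streaming-tool-use))))
```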

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated no comments.

File          Description
llm-ollama.el Corrected streaming tool call data extraction to handle a single tool use and fixed capability reporting so streaming-tool-use is included only when tool-use is supported
NEWS.org      Added changelog entries documenting the fixes for version 0.28.3


@ahyatt ahyatt merged commit beb296a into main Dec 24, 2025
13 checks passed
@ahyatt ahyatt deleted the ahyatt-ollama-streaming-tc-fix branch December 24, 2025 20:09


Development

Successfully merging this pull request may close these issues.

Streaming tool use not working with ollama

2 participants