
Conversation


@dependabot dependabot bot commented on behalf of github Aug 15, 2025

Bumps ruby_llm from 1.3.1 to 1.6.2.

Release notes

Sourced from ruby_llm's releases.

1.6.2

RubyLLM 1.6.2: Thinking Tokens & Performance 🧠

Quick maintenance release fixing Gemini's thinking token counting and bringing performance improvements. Plus we're removing capability gatekeeping - trust providers to know what they can do!

🧮 Fixed: Gemini Thinking Token Counting

Gemini 2.5 with thinking mode wasn't counting tokens correctly, leading to incorrect billing calculations:

# Before: Only counted candidatesTokenCount (109 tokens)
# Actual API response had:
#   candidatesTokenCount: 109
#   thoughtsTokenCount: 443
#   => Should be 552 total!

# Now: Correctly sums both token types
chat = RubyLLM.chat(model: 'gemini-2.5-flash')
response = chat.ask('What is 2+2? Think step by step.')
response.output_tokens  # => 552 (correctly summed)

This aligns with how all providers bill thinking/reasoning tokens - they're all output tokens. Fixes #346.
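For illustration only, the fix amounts to summing both usage fields when parsing the Gemini response. The sketch below is not the library's actual implementation; the helper name and the shape of the usage hash are assumptions based on the fields shown above.

# Illustrative sketch: fold thinking tokens into the output token count.
# `usage` stands in for the parsed usage metadata of a Gemini API response.
def total_output_tokens(usage)
  candidates = usage['candidatesTokenCount'].to_i
  thoughts   = usage['thoughtsTokenCount'].to_i  # 0 when thinking is off
  candidates + thoughts                          # 109 + 443 = 552 for the example above
end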

🚫 Capability Gatekeeping Removed

We were pre-checking if models support certain features before attempting to use them. But sometimes pre-emptive checks were getting in the way:

# Before 1.6.2: Pre-checked capabilities before attempting
chat.with_tool(MyTool)  # => UnsupportedFunctionsError (without trying)

# Now: Let the provider handle it
chat.with_tool(MyTool)  # Works if supported, provider errors if not

Why this approach is better:

  • Direct feedback - Get the actual provider error, not our pre-emptive block
  • Immediate support - New models and features work as soon as providers ship them
  • Custom models - Fine-tuned and custom models aren't artificially limited
  • Simpler flow - One less layer of validation between you and the provider

The provider knows what it can do. If it works, great! If not, you'll get a clear error from the source.

Same philosophy applies to structured output (with_schema).
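For example, a schema now goes straight to the provider as well; if the model can't do structured output, you get the provider's error rather than a local capability check. The schema below is made up for illustration:

chat = RubyLLM.chat(model: 'gemini-2.5-flash')

# Hypothetical JSON Schema; use whatever your application actually needs.
person_schema = {
  type: 'object',
  properties: {
    name: { type: 'string' },
    age:  { type: 'integer' }
  },
  required: %w[name age]
}

# Sent as-is to the provider - no pre-emptive capability check.
response = chat.with_schema(person_schema).ask('Extract the person: Ada Lovelace, age 36.')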

⚡ Performance Improvements

Thanks to @tagliala for introducing RuboCop Performance (#316), bringing multiple optimizations:

... (truncated)
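The specific cops are truncated above, but as a rough illustration, RuboCop Performance suggests rewrites of this kind (variable names made up; not necessarily the changes applied in this release):

# Performance/Detect: stop at the first match instead of building an array.
users.select { |u| u.admin? }.first   # before
users.detect { |u| u.admin? }         # after

# Performance/FlatMap: skip the intermediate nested array.
chats.map(&:messages).flatten(1)      # before
chats.flat_map(&:messages)            # after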

Commits
  • 470de29 Bump version to 1.6.2
  • 14b352f Fixes incorrect token counting for Gemini with thinking
  • f54c6cd Remove model existence logging after capability gatekeeping removal
  • acb953a Updated Appraisals
  • 92bcdd5 docs: remove capability gatekeeping
  • 836113f Remove capability gatekeeping - let providers validate their own features
  • ccd7915 Introduce RuboCop Performance (#316)
  • ee0d8c2 Bust gem version cache in README
  • b2855b4 fix: skip flaky test for orphaned tool call messages on JRuby
  • 91eb158 Appraisal update
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [ruby_llm](https://github.com/crmne/ruby_llm) from 1.3.1 to 1.6.2.
- [Release notes](https://github.com/crmne/ruby_llm/releases)
- [Commits](crmne/ruby_llm@1.3.1...1.6.2)

---
updated-dependencies:
- dependency-name: ruby_llm
  dependency-version: 1.6.2
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
dependabot bot added the dependencies and ruby labels Aug 15, 2025

dependabot bot commented on behalf of github Aug 20, 2025

Superseded by #41.

@dependabot dependabot bot closed this Aug 20, 2025
@dependabot dependabot bot deleted the dependabot/bundler/ruby_llm-1.6.2 branch August 20, 2025 07:47
