
llm/anthropic, agent, attractor: configurable max_tokens with 64K default#38

Open
thewoolleyman wants to merge 1 commit into danshapiro:main from thewoolleyman:configurable-max-tokens

Conversation

@thewoolleyman

Summary

  • Make max_tokens configurable via graph stylesheet and node attributes instead of hardcoded 4096
  • Raise the default to 65536 to prevent silent output truncation on complex tasks
  • Propagate max_tokens through the agent session, codergen router, and Anthropic adapter
  • Includes design plan in docs/plans/
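
As a hedged illustration of the first bullet (the attribute syntax, graph name, and node name here are assumptions for illustration, not taken from the repo), the two configuration levels might look like:

```dot
digraph pipeline {
    // Stylesheet/graph-level default (hypothetical syntax)
    graph [max_tokens=16384];

    // Node-level attribute overrides the stylesheet value
    summarize [max_tokens=65536];
}
```

Nodes without an explicit attribute would fall through to the stylesheet value, and graphs setting neither would get the provider default.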

Test plan

  • Anthropic adapter test verifies new 65536 default
  • Stylesheet and codergen router tests for max_tokens propagation
  • Full test suite passes on macOS and Linux CI

🤖 Generated with Claude Code


Make max_tokens configurable at three levels (node attr > stylesheet >
provider default) following the existing reasoning_effort pattern. Bump
the Anthropic adapter default from 4096 to 65536 so large tool calls
(e.g. write_file with 80KB+ content) are not truncated mid-JSON.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@thewoolleyman
Author

Note: After merging this PR, #41 (test fixes + gofmt + CI) should be merged last — it includes a repo-wide gofmt pass that touches files in this PR. Merging #41 after the others avoids conflicts.

