
fix: streaming infrastructure — queue, loop condition, and timeout#260

Merged
moonpyt merged 5 commits into XSpoonAi:main from veithly:fix/streaming-infrastructure
Feb 25, 2026

Conversation

veithly (Collaborator) commented Feb 12, 2026

Summary

  • ThreadSafeOutputQueue.put_nowait(): ToolCallAgent.step() calls put_nowait(), but the wrapper class only exposed an async put(), causing an AttributeError during streaming.
  • BaseAgent.stream() while-loop: the condition not (done or empty) exited immediately when the queue started out empty. Changed to not (done and empty) so the loop waits for the producer to start and drains all queued items.
  • SpoonReactSkill/SpoonReactAI.run() timeout: _run_and_signal_done() passes timeout to run(), but the subclasses didn't accept it, causing a TypeError.
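A minimal sketch of the first fix, assuming the wrapper delegates to a standard thread-safe queue (the class internals here are illustrative, not the project's actual implementation):

```python
import queue


class ThreadSafeOutputQueue:
    """Illustrative wrapper usable from both async and sync producers."""

    def __init__(self) -> None:
        # queue.Queue is already thread-safe for put/get.
        self._queue: queue.Queue = queue.Queue()

    async def put(self, item) -> None:
        # Async producers share the same underlying queue.
        self._queue.put(item)

    def put_nowait(self, item) -> None:
        # The missing sync method: lets ToolCallAgent.step() push
        # chunks without awaiting (the AttributeError fix).
        self._queue.put_nowait(item)

    def get_nowait(self):
        return self._queue.get_nowait()

    def empty(self) -> bool:
        return self._queue.empty()
```

With both `put()` and `put_nowait()` present, sync and async callers can interleave on the same queue without tripping over a missing attribute.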

Test plan

  • tests/test_streaming_fixes.py — 5 tests covering all 3 fixes
  • py_compile on all modified files
  • Integration test with spoon-bot gateway streaming endpoint

Made with Cursor

veithly and others added 5 commits February 13, 2026 00:53
1. ThreadSafeOutputQueue: add put_nowait() method
   - ToolCallAgent.step() uses put_nowait() to push chunks into the
     queue, but ThreadSafeOutputQueue only had an async put().  This
     caused AttributeError at runtime when streaming was active.

2. BaseAgent.stream(): fix while-loop exit condition
   - Old: `while not (done or empty)` → exited immediately when the
     queue was initially empty, even though the producer hadn't
     started yet.
   - New: `while not (done and empty)` → continues until the task
     is complete AND all queued items have been consumed.

3. SpoonReactSkill.run() / SpoonReactAI.run(): accept `timeout` kwarg
   - BaseAgent._run_and_signal_done() passes timeout to run(), but
     SpoonReactSkill and SpoonReactAI overrode run() without the
     timeout parameter, causing TypeError.
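The loop-condition fix (item 2) can be demonstrated with a plain producer/consumer pair; the names below are illustrative, but the boolean logic is exactly the one described above:

```python
import queue
import threading
import time


def produce(q: queue.Queue, done: threading.Event) -> None:
    for i in range(3):
        time.sleep(0.01)  # the producer starts slightly late
        q.put(f"chunk-{i}")
    done.set()


def consume(q: queue.Queue, done: threading.Event) -> list:
    chunks = []
    # Old condition `not (done or empty)` would exit on the very first
    # check, because the queue starts out empty. The corrected condition
    # loops until the producer is done AND the queue is fully drained.
    while not (done.is_set() and q.empty()):
        try:
            chunks.append(q.get(timeout=0.05))
        except queue.Empty:
            continue
    return chunks


q: queue.Queue = queue.Queue()
done = threading.Event()
t = threading.Thread(target=produce, args=(q, done))
t.start()
result = consume(q, done)
t.join()
```

Note the ordering guarantee: if `done` is set while items remain, `q.empty()` is still false, so the loop runs until every chunk has been consumed.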

Includes tests for all three fixes.
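The timeout fix (item 3) amounts to keeping the override's signature compatible with the caller. A sketch under assumed names (the class bodies are illustrative, only the signatures mirror the PR description):

```python
import asyncio
from typing import Optional


class BaseAgent:
    """Illustrative base class; only the method names follow the PR."""

    async def run(self, request: Optional[str] = None,
                  timeout: Optional[float] = None) -> str:
        return f"handled {request!r} with timeout={timeout}"

    async def _run_and_signal_done(self, request: Optional[str],
                                   timeout: Optional[float] = None) -> str:
        # Always forwards timeout; an override of run() lacking the
        # parameter would raise TypeError right here.
        return await self.run(request, timeout=timeout)


class SpoonReactSkill(BaseAgent):
    async def run(self, request: Optional[str] = None,
                  timeout: Optional[float] = None) -> str:
        # The fix: accept the kwarg and pass it along.
        return await super().run(request, timeout=timeout)


result = asyncio.run(SpoonReactSkill()._run_and_signal_done("hi", timeout=30.0))
```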

Co-authored-by: Cursor <cursoragent@cursor.com>
…ep()

When an LLM response has no tool calls and a finish_reason, step()
returned early without pushing content to output_queue.  This caused
streaming consumers to receive zero chunks for simple text responses.

Also removes debug print() statements added during investigation.
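The shape of that fix, sketched with hypothetical names (the real `step()` is more involved; this only shows the early-return path now pushing content first):

```python
import queue


def step(response: dict, output_queue: queue.Queue) -> str:
    """Hypothetical sketch of the early-return fix in step()."""
    content = response.get("content", "")
    tool_calls = response.get("tool_calls") or []
    if not tool_calls and response.get("finish_reason"):
        # Fix: emit the text chunk before finishing, so streaming
        # consumers see plain text responses too.
        if content:
            output_queue.put_nowait(content)
        return "finished"
    # ... tool-call handling elided ...
    return "continue"


q: queue.Queue = queue.Queue()
state = step({"content": "hello", "finish_reason": "stop"}, q)
```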

Co-authored-by: Cursor <cursoragent@cursor.com>
…er issues

XSpoonAi#5 - MCPTool.expand_server_tools() discovers all real server tools and
     creates individual MCPTool instances per tool (e.g. read_file,
     write_file instead of single "filesystem" proxy)
XSpoonAi#8 - SkillScript now accepts input_schema; ScriptTool derives structured
     parameters from it and serializes kwargs to JSON for stdin
XSpoonAi#9 - SkillLoader uses utf-8-sig encoding (strips BOM), normalizes CRLF
     line endings, and deduplicates discovery via resolved paths
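The loader changes in XSpoonAi#9 can be sketched as two small helpers (function names are illustrative, not the real SkillLoader API):

```python
from pathlib import Path


def load_skill_text(path: Path) -> str:
    # utf-8-sig transparently strips a leading BOM if one is present.
    text = path.read_text(encoding="utf-8-sig")
    # Normalize Windows line endings so downstream parsing is uniform.
    return text.replace("\r\n", "\n")


def dedupe_paths(paths: list) -> list:
    # Resolving collapses symlinks and relative forms, so the same file
    # discovered via two spellings is only loaded once.
    seen = set()
    unique = []
    for p in paths:
        resolved = Path(p).resolve()
        if resolved not in seen:
            seen.add(resolved)
            unique.append(resolved)
    return unique
```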

Co-authored-by: Cursor <cursoragent@cursor.com>
@moonpyt moonpyt merged commit 8b8a49a into XSpoonAi:main Feb 25, 2026
1 check passed