TUI Renders Truncated LLM Response Despite Full Output in Log File #5006

@taqtiqa-mark

Description

Environment

  • OpenCode version: v1.0.128
  • Operating System: Linux (Debian Testing)
  • Installation: Containerized setup via curl script
  • Mode: TUI (no serve-attach)

Steps to Reproduce

  1. Launch OpenCode in TUI mode.
  2. Submit a prompt to the LLM that elicits a detailed response, such as one involving script analysis (e.g., identifying issues in a shell script like vault0.sh).
  3. Examine the response displayed in the TUI interface.
  4. Review the associated log file for the complete session output.
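The comparison in steps 3 and 4 can be sketched as a quick shell check. This is a minimal sketch, not documented OpenCode behavior: the log path is hypothetical, and the "service=session.summary body=..." line format is taken from the transcript below. If the body extracted from the log extends past the point where the TUI stopped rendering, the truncation is a display-side bug.

```shell
# Hypothetical log location; adjust to your installation.
LOG=/tmp/opencode-session.log

# Stand-in for the real session log, using one line in the
# format seen in the transcript below:
printf '%s\n' 'INFO 2025-12-03T02:45:00 +1ms service=session.summary body=Root cause: In vault0.sh, the stop function unnecessarily prints a usage message when given the valid argument "vault0".' > "$LOG"

# Extract the full summary body from the log and print it, so it can be
# compared by eye against what the TUI rendered:
SUMMARY=$(grep 'service=session.summary' "$LOG" | sed 's/.*body=//')
printf 'log summary: %s\n' "$SUMMARY"
```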

Expected Behavior

The TUI should display the entire LLM-generated response, matching the content captured in the log file's session summary.

Actual Behavior

The TUI displays a truncated response that ends prematurely (e.g., stopping at "prints the usage message when" instead of continuing to the end), while the log file preserves the complete response. The screenshot illustrates the truncation in the TUI, and the log transcript below confirms the presence of the unabridged output:

INFO 2025-12-03T02:45:00 +0ms service=bus type=session.status publishing
INFO 2025-12-03T02:45:00 +1ms service=session.prompt step=2 sessionID=ses_51e0cfecfffe7hCucYDcjdh1m3 loop
INFO 2025-12-03T02:45:00 +11ms service=bus type=message.updated publishing
INFO 2025-12-03T02:45:00 +1ms service=session.summary body=Root cause: In vault0.sh, the `stop` function unnecessarily prints a usage message when given the valid argument "vault0" (lines 184-187), even though it proceeds to handle the stop operation. This is likely a copy-paste error, as a similar (but incorrect) printf exists in the `start` function.
Proposed prompt to fix: "In vault0.sh, edit the stop function to remove the unnecessary printf that prints the usage message when the argument is 'vault0'." body
INFO 2025-12-03T02:45:00 +0ms service=bus type=message.updated publishing
INFO 2025-12-03T02:45:00 +1ms service=bus type=session.updated publishing
INFO 2025-12-03T02:45:00 +0ms service=bus type=session.diff publishing
INFO 2025-12-03T02:45:00 +1ms service=session.prompt sessionID=ses_51e0cfecfffe7hCucYDcjdh1m3 exiting loop
INFO 2025-12-03T02:45:00 +0ms service=session.compaction pruning
INFO 2025-12-03T02:45:00 +12ms service=session.prompt sessionID=ses_51e0cfecfffe7hCucYDcjdh1m3 cancel
INFO 2025-12-03T02:45:00 +1ms service=bus type=session.status publishing
INFO 2025-12-03T02:45:00 +0ms service=bus type=session.idle publishing
INFO 2025-12-03T02:45:00 +5ms service=session.compaction pruned=0 total=0 found

Impact

This discrepancy undermines the reliability of the TUI for interactive sessions: users must consult log files to see complete responses, which disrupts the workflow and diminishes the tool's usability.

Suggested Resolution

Examine the TUI rendering path to ensure the full LLM response is displayed, for example by improving buffer management or keeping the rendered output synchronized with what is written to the log.

OpenCode version

v1.0.128

Steps to reproduce

As above

Screenshot and/or share link

[screenshot attached]

Operating System

Linux (Debian Testing)

Terminal

Alacritty+Zellij

Labels

bug (Something isn't working)
