
Cancel in-flight LLM stream on thread switch or new chat #251

@theantichris

Description


Running :n, loading a thread from :t, or pressing Escape in normal mode while the LLM is streaming can cause a race condition. The streaming goroutine continues sending LLMResponseMsg/LLMDoneMsg after the chat state has been reset, which can:

  • Append stale response chunks to the new/loaded conversation's chatHistory
  • Persist the assistant message under the wrong (new or empty) threadID
  • Race on model.messages (goroutine writes on line 30 of chat_stream.go)

Proposed solution

Add a per-stream cancellable context and a streaming flag to ChatModel:

  • streamCtx, streamCancel := context.WithCancel(model.ctx) created in startLLMStream()
  • streaming bool set true in startLLMStream(), false in handleLLMDoneMsg/handleLLMErrorMsg/cancel handler
  • A cancelStream() method that calls streamCancel(), clears currentResponse, and sets streaming = false

Cancel trigger points

  • Escape in normal mode while streaming → cancel stream
  • :n (newChat()) → cancel stream, then reset state
  • loadThread() (selecting a thread from :t list) → cancel stream, then load thread

:t itself does not cancel; it only opens the thread list overlay. The stream can continue underneath until a thread is actually selected.

Post-cancel cleanup

  • handleLLMResponseMsg should early-return if !model.streaming
  • Distinguish intentional cancel from errors (e.g. LLMCancelledMsg vs LLMErrorMsg) so cancelled streams don't show error messages

Split from #247.

Metadata

Status: Backlog