
fix: preserve partial LLM completion on stream abort#10549

Open
amabito wants to merge 1 commit into continuedev:main from amabito:fix/preserve-partial-on-abort

Conversation


@amabito amabito commented Feb 16, 2026

Summary

  • When a user aborts (Stop) a streaming response, PromptLog.completion was left empty
  • This change accumulates streamed content and persists partial completion on abort so logs/analytics remain accurate

Changes

  • core/llm/streamChat.ts — accumulate chunk content during streaming; on abort, set errorPromptLog.completion before returning
  • core/llm/streamChat.test.ts — 3 test cases: normal abort, immediate abort, MessagePart[] content

Details

  • Covers both legacy slash-command path and standard chat path
  • Handles string and MessagePart[] content types
  • No new dependencies
  • Existing yield/UI streaming behavior unchanged
  • ~16 lines of logic added
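The "string and MessagePart[] content types" handling can be sketched as below. The `MessagePart` shape and the `extractText` helper name are assumptions for illustration, not the PR's actual identifiers; only text parts contribute to the accumulated completion.

```typescript
// Hypothetical types/names for illustration; the real definitions
// live in the Continue codebase and may differ.
type MessagePart =
  | { type: "text"; text: string }
  | { type: "imageUrl"; imageUrl: { url: string } };

// Extract accumulable text from a streamed chunk whose content may
// be a plain string or an array of parts (only text parts count).
function extractText(content: string | MessagePart[]): string {
  if (typeof content === "string") {
    return content;
  }
  return content
    .filter(
      (p): p is Extract<MessagePart, { type: "text" }> => p.type === "text",
    )
    .map((p) => p.text)
    .join("");
}
```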

Test plan

  • streamChat.test.ts passes (3 cases)
  • Existing tests unaffected
  • Manual: abort mid-stream, verify PromptLog.completion is populated

🤖 Generated with Claude Code


Continue Tasks: ✅ 1 no changes


Summary by cubic

Preserves partial LLM completion when a user stops a streaming response, so PromptLog.completion and analytics stay accurate. Works for both legacy slash-command and standard chat paths.

  • Bug Fixes
    • Accumulates streamed chunks and persists partial text on abort (supports string and MessagePart[] content).
    • Adds Vitest unit tests for content extraction and accumulation, including mixed content.

Written for commit 87fb47a.

@amabito amabito requested a review from a team as a code owner February 16, 2026 13:09
@amabito amabito requested review from RomneyDa and removed request for a team February 16, 2026 13:09
@dosubot dosubot bot added the size:L This PR changes 100-499 lines, ignoring generated files. label Feb 16, 2026

github-actions bot commented Feb 16, 2026

All contributors have signed the CLA ✍️ ✅
Posted by the CLA Assistant Lite bot.


amabito commented Feb 16, 2026

I have read the CLA Document and I hereby sign the CLA

@cubic-dev-ai cubic-dev-ai bot left a comment

No issues found across 2 files


amabito commented Feb 16, 2026

recheck

When a user aborts an LLM streaming request, the partial completion
was lost (errorPromptLog.completion was empty ""). This adds an
accumulatedCompletion variable that tracks streamed content and
preserves it on abort, supporting both string and MessagePart[] content.
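A minimal sketch of the accumulate-and-preserve pattern the commit message describes, assuming a simplified signature; `streamWithPartialLog` is a hypothetical name and the real streamChat.ts differs. The key point is that the `finally` block runs both on normal completion and when the consumer stops early, so the partial text survives an abort.

```typescript
// Illustrative only: accumulate streamed chunks and persist the
// partial completion even if the consumer aborts mid-stream.
async function* streamWithPartialLog(
  chunks: AsyncIterable<string>,
  promptLog: { completion: string },
): AsyncGenerator<string> {
  let accumulated = "";
  try {
    for await (const chunk of chunks) {
      accumulated += chunk;
      yield chunk; // existing yield/UI streaming behavior unchanged
    }
  } finally {
    // Runs on normal end AND when the consumer breaks out early
    // (which triggers generator cleanup), so the partial text is
    // persisted instead of leaving completion as "".
    promptLog.completion = accumulated;
  }
}
```

Breaking out of a `for await` loop invokes the generator's cleanup, which is what makes the `finally` a reliable place to record the partial completion.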
@amabito amabito force-pushed the fix/preserve-partial-on-abort branch from dc647d7 to 87fb47a on February 16, 2026 13:35

