fix: preserve partial LLM completion on stream abort #10549
Open
amabito wants to merge 1 commit into continuedev:main from
Conversation
All contributors have signed the CLA ✍️ ✅

Author: I have read the CLA Document and I hereby sign the CLA

Author: recheck
When a user aborts an LLM streaming request, the partial completion was lost (errorPromptLog.completion was an empty string). This change adds an accumulatedCompletion variable that tracks streamed content and preserves it on abort, supporting both string and MessagePart[] content.
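A minimal, self-contained sketch of the pattern the PR describes (not the repo's actual code; MessagePart's shape, contentToText, and streamChatSketch are assumptions here): track streamed content in accumulatedCompletion so that, on abort, the partial text is available for errorPromptLog.completion instead of "".

```typescript
// Assumed simplified shape of a message part; the real type may differ.
type MessagePart = { type: "text"; text: string };
type ChunkContent = string | MessagePart[];

// Normalize a chunk to plain text, handling both supported content types.
function contentToText(content: ChunkContent): string {
  if (typeof content === "string") {
    return content;
  }
  return content
    .filter((part) => part.type === "text")
    .map((part) => part.text)
    .join("");
}

// Hypothetical stream wrapper: yields chunks, accumulates their text, and
// returns the accumulated partial completion whether the stream ends
// normally or the caller aborts mid-stream.
async function* streamChatSketch(
  chunks: AsyncIterable<ChunkContent>,
  signal: AbortSignal,
): AsyncGenerator<ChunkContent, string> {
  let accumulatedCompletion = "";
  for await (const chunk of chunks) {
    accumulatedCompletion += contentToText(chunk);
    yield chunk;
    if (signal.aborted) {
      // Abort path: the caller still receives the partial completion
      // (a stand-in for setting errorPromptLog.completion before returning).
      return accumulatedCompletion;
    }
  }
  return accumulatedCompletion;
}
```

The key point is that accumulation happens as chunks arrive, so the abort branch has something to report rather than discarding everything streamed so far.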
amabito force-pushed from dc647d7 to 87fb47a
Summary
- On stream abort, PromptLog.completion was left empty

Changes
- core/llm/streamChat.ts — accumulate chunk content during streaming; on abort, set errorPromptLog.completion before returning
- core/llm/streamChat.test.ts — 3 test cases: normal abort, immediate abort, MessagePart[] content

Details
- Supports both string and MessagePart[] content types

Test plan
- streamChat.test.ts passes (3 cases)

🤖 Generated with Claude Code
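The "normal abort" test case above could be sketched like this, without a test framework (a hypothetical, self-contained illustration; fakeStream and runAbortTest are not the repo's names): abort after the first chunk and assert the partial text survives.

```typescript
// Fake stream that accumulates what it has yielded so far and returns the
// partial completion if the abort signal fires mid-stream.
async function* fakeStream(signal: AbortSignal): AsyncGenerator<string, string> {
  const chunks = ["Hel", "lo, ", "world"];
  let accumulated = "";
  for (const c of chunks) {
    if (signal.aborted) break; // user stopped the response
    accumulated += c;
    yield c;
  }
  return accumulated; // partial (or full) completion
}

// Consume one chunk, abort, then drain the generator to get its return value.
async function runAbortTest(): Promise<string> {
  const controller = new AbortController();
  const gen = fakeStream(controller.signal);
  await gen.next(); // receive "Hel"
  controller.abort(); // user aborts the stream
  let result = await gen.next();
  while (!result.done) {
    result = await gen.next();
  }
  return result.value; // "Hel" — not an empty string
}
```

The immediate-abort and MessagePart[] cases follow the same pattern: the assertion is always that the returned completion reflects whatever was streamed before the abort.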
Continue Tasks: ✅ 1 no changes
Summary by cubic
Preserves partial LLM completion when a user stops a streaming response, so PromptLog.completion and analytics stay accurate. Works for both legacy slash-command and standard chat paths.
Written for commit 87fb47a. Summary will update on new commits.