fix(ai): resolve race condition in parallel tool execution #11907
Closed
delta575 wants to merge 2 commits into vercel:main from
Conversation
When multiple tools execute in parallel, the stream could close prematurely or throw "stream is not in a state that permits enqueue" errors. Two issues caused this:

1. `generateId()` returned the same value for multiple tools in a batch, causing the `outstandingToolResults` Set to only track one tool
2. Multiple `finally()` blocks calling `attemptClose()` simultaneously caused race conditions

Fix:
- Use `toolCall.toolCallId` instead of `generateId()` for unique tracking
- Add a `closed` flag to prevent re-entry in `attemptClose()`
- Guard async enqueue calls to prevent enqueueing after stream closure

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
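The third bullet (guarding async enqueue calls) can be sketched in a minimal model. The names below (`safeEnqueue`, the `enqueued` array) are hypothetical stand-ins, not the actual SDK stream internals: once the controller is closed, a late tool result must be dropped rather than enqueued, otherwise `enqueue()` throws the "stream is not in a state that permits enqueue" error.

```typescript
// Hypothetical minimal model of the enqueue guard (not the actual SDK code).
const enqueued: string[] = [];
let closed = false;

function safeEnqueue(chunk: string): void {
  if (closed) return; // stream already closed: drop the late result instead of throwing
  enqueued.push(chunk); // stands in for controller.enqueue(chunk)
}

safeEnqueue('tool-result-1');
closed = true; // stream closes while another tool is still running
safeEnqueue('tool-result-2'); // late async result: silently dropped, no throw
```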
Add tests to verify that:
- Using `toolCallId` for tracking handles parallel tools correctly (exposes a bug where `generateId` returns the same value for all tools)
- Multiple tools with different delays all complete successfully
- The stream doesn't close prematurely when a fast tool completes before a slow tool
- Many parallel tool calls (10+) don't lose results

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
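The scenarios above can be modeled with a small self-contained simulation. Everything here (`runParallelTools`, the `ToolCall` shape, the delays) is a hypothetical sketch of the bookkeeping under test, not the SDK's real implementation: each call is tracked by its unique `toolCallId`, and the "stream" may only close once the outstanding set is empty.

```typescript
// Hypothetical model: parallel tools with different delays, tracked by toolCallId.
type ToolCall = { toolCallId: string; delayMs: number };

async function runParallelTools(calls: ToolCall[]): Promise<string[]> {
  const outstanding = new Set<string>();
  const results: string[] = [];
  let closed = false;

  const attemptClose = (): void => {
    if (closed || outstanding.size > 0) return; // work still in flight
    closed = true;
  };

  await Promise.all(
    calls.map(call => {
      outstanding.add(call.toolCallId); // unique per call, unlike a broken generateId()
      return new Promise<void>(resolve => setTimeout(resolve, call.delayMs))
        .then(() => {
          if (!closed) results.push(call.toolCallId); // guard against late enqueue
        })
        .finally(() => {
          outstanding.delete(call.toolCallId);
          attemptClose();
        });
    }),
  );
  return results;
}

// A fast tool finishing first must not close the stream on the slow one.
const captured = await runParallelTools([
  { toolCallId: 'call_1', delayMs: 5 },
  { toolCallId: 'call_2', delayMs: 30 },
  { toolCallId: 'call_3', delayMs: 15 },
]);
```

With the buggy id generation (every call tracked under one repeated id), the set would empty after the first completion and the slow tool's result would be lost.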
```ts
// close the tool results controller if no more outstanding tool calls
if (canClose && outstandingToolResults.size === 0) {
  // Mark as closed BEFORE doing any work to prevent race conditions
  // where multiple finally() blocks call attemptClose() simultaneously
```
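The re-entry guard this diff excerpt describes can be sketched in isolation. The shape below is a hypothetical reduction (`closeCount` stands in for the actual `controller.close()` call): the `closed` flag is flipped before any close work, so concurrent `finally()` callbacks cannot close the controller twice.

```typescript
// Hypothetical reduction of the attemptClose() re-entry guard.
let closed = false;
let closeCount = 0;
const outstandingToolResults = new Set<string>();
const canClose = true;

function attemptClose(): void {
  if (canClose && outstandingToolResults.size === 0) {
    if (closed) return; // a racing caller already closed the stream
    closed = true;      // flip the flag BEFORE doing any close work
    closeCount++;       // stands in for controller.close()
  }
}

// Two finally() blocks racing to close: only one close may happen.
attemptClose();
attemptClose();
```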
Contributor lgrammel reviewed Jan 22, 2026
```diff
-const toolExecutionId = generateId(); // use our own id to guarantee uniqueness
+// Use toolCallId which is unique per tool call from the LLM
+// (generateId() was returning the same value for multiple tools in a batch)
+const toolExecutionId = toolCall.toolCallId;
```
Collaborator
has this fix been ai generated?
Collaborator
generateId was used intentionally. If you end up with repeated ids, that is most likely an issue with the id generator that you pass in.
Collaborator
This PR mixes 2 issues. The close state issue seems reasonable, but I have doubts regarding the id generation. Please separate the close fix into a new PR.
This was referenced Jan 22, 2026
Background

When multiple tools execute in parallel during `streamText`, the stream could close prematurely or throw "The stream is not in a state that permits enqueue" errors.

Context: We're using `@convex-dev/agent` 0.3.2, which doesn't support AI SDK v6. We wanted to use Gemini 3 Flash, but `thought_signature` is required (not optional), so we had to downgrade to Gemini 2.5, where it's optional and AI SDK v5 worked fine. When testing AI SDK v6 compatibility via get-convex/agent#208 in preparation for Gemini 3 support, we encountered this race condition: Gemini 3 Flash aggressively uses parallel tool calls, which exposed the bug.

Summary

Two issues caused this race condition:

1. Non-unique tool tracking IDs: `generateId()` was returning the same value for multiple tools in a batch, causing the `outstandingToolResults` Set to only track one tool. When that tool completed, the stream closed while others were still running.
2. Re-entry in `attemptClose`: Multiple `finally()` blocks could call `attemptClose()` simultaneously, causing race conditions when closing the stream.

Fix:
- Use `toolCall.toolCallId` instead of `generateId()` for unique tool tracking
- Add a `closed` flag to prevent re-entry in `attemptClose()`

Manual Verification

Tested with a Convex application using `@convex-dev/agent` from get-convex/agent#208 and Gemini 3 Flash with 5+ parallel tool calls. All tool results are now captured and streaming completes successfully.

Checklist
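The id-tracking half of the fix can be illustrated with a short comparison. The `stubGenerateId` below is hypothetical, standing in for an id generator that repeats values within one batch: a Set keyed on a repeated id collapses to a single entry, while keying on the LLM-provided `toolCallId` tracks every parallel call.

```typescript
// Hypothetical illustration of why a repeated id loses outstanding tool calls.
const toolCalls = [
  { toolCallId: 'call_a' },
  { toolCallId: 'call_b' },
  { toolCallId: 'call_c' },
];

const stubGenerateId = (): string => 'id-1'; // degenerate generator: same value every call

const trackedByGenerateId = new Set<string>();
const trackedByToolCallId = new Set<string>();
for (const call of toolCalls) {
  trackedByGenerateId.add(stubGenerateId()); // every add collides on 'id-1'
  trackedByToolCallId.add(call.toolCallId);  // one entry per parallel call
}
// trackedByGenerateId.size is 1: the stream thinks only one tool is outstanding
// trackedByToolCallId.size is 3: all parallel calls are tracked
```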