tweak(batch): up restrictive max batch tool from 10 to 25
#9275
base: dev
Conversation
…alues - fix: stagger batch firing - tweak: max 25 tools
Hey! Your PR title doesn't match the required format. Please update it to start with one of the allowed prefixes. See CONTRIBUTING.md for details.
The following comment was made by an LLM and may be inaccurate: No duplicate PRs found
10 to 25 - add tool execution stagger (50ms) so it looks cooler
packages/opencode/src/tool/batch.ts (Outdated)

  }
  const results = await Promise.all(toolCalls.map((call) => executeCall(call)))
  const delay = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms))
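The staggered firing the PR describes could be wired up roughly like this. This is a sketch of one plausible use of the `delay` helper from the diff, not the exact merged change; `runStaggered`, `executeCall`-style callbacks, and the call-shape are assumptions for illustration.

```typescript
// Minimal sketch of staggered batch execution, assuming the i-th tool call
// is delayed by i * stepMs before firing. The calls still run concurrently
// once started, so the added latency is (calls.length - 1) * stepMs.
const delay = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms))

async function runStaggered<T>(calls: Array<() => Promise<T>>, stepMs = 50): Promise<T[]> {
  return Promise.all(
    calls.map(async (call, i) => {
      await delay(i * stepMs) // stagger: call 0 fires immediately, call 1 after stepMs, etc.
      return call()
    }),
  )
}
```

The reviewer's suggestion to use `Bun.sleep(ms)` would replace the hand-rolled `delay` helper with Bun's built-in promise-based sleep.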
u should just Bun.sleep
done 👌
Wait... I'm seeing batch edits applied in duplicate. This needs more thorough checking to verify whether it was a model thing or due to the staggered tool firing paired with batch's tool-duplication bug. Putting this one as DRAFT for now.
Confirmed: staggering triggers edits in duplicate. This reveals an underlying issue that this PR is not addressing.
What does this PR do?
This PR ups the max number of tools batch can accept from 10 to 25. (I had originally set 10 thinking it was fine, but I now realise it was somewhat arbitrary and doesn't necessarily play well with models such as GPT 5.x.)
This PR also staggers the tool calling by 50 ms, which gives a much better visual experience of the batch tool (a UX win for roughly 1 sec of additional latency on the whole batch execution).
I initially had hopes it would fix the random tool duplication that happens when using batch, but this one is still out there...
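As a back-of-the-envelope check of the latency claim: assuming the i-th call is delayed by i × 50 ms (the exact staggering scheme is an assumption, not stated in the PR), the added latency before the last tool fires is:

```typescript
// Worst-case extra latency (in ms) before the last of N staggered calls
// starts, assuming call i waits i * stepMs before firing.
const addedLatencyMs = (batchSize: number, stepMs = 50): number => (batchSize - 1) * stepMs

console.log(addedLatencyMs(25)) // → 1200
```

Under this assumption, a full batch of 25 adds about 1.2 s before the last tool fires, in the same ballpark as the "roughly 1 sec" figure above.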
How did you verify your code works?
Used it, works as expected.