
Fix finish_reason stop when tracking usage#2641

Merged
dgageot merged 1 commit into docker:main from rumpl:fix-openai-usage-event on May 6, 2026

Conversation

Member

@rumpl rumpl commented May 5, 2026

The server now correctly sends usage at the end of the stream, the same way OpenAI's API does.

@rumpl rumpl requested a review from a team as a code owner May 5, 2026 20:18
Member Author

rumpl commented May 5, 2026

With this change, docker-agent can connect to a docker-agent that serves an agent.

[Screenshot: 2026-05-05 at 22:19:33]


@docker-agent docker-agent Bot left a comment

Assessment: 🟡 NEEDS ATTENTION

This PR correctly implements the happy-path for stream_options.include_usage: the usage chunk is emitted after the stop chunk and before [DONE], Choices serializes as "choices":[] (no omitempty), and the OpenAPI schema is updated. Two medium-severity interoperability concerns with the OpenAI streaming protocol were found.

Comment thread pkg/chatserver/server.go
Comment thread pkg/chatserver/server.go
@rumpl rumpl force-pushed the fix-openai-usage-event branch 2 times, most recently from 4d04f1a to ef58ef7 on May 5, 2026 at 20:36
krissetto previously approved these changes May 5, 2026
Comment thread pkg/chatserver/agent.go Outdated
@@ -251,9 +251,6 @@ func runAgentLoop(ctx context.Context, rt runtime.Runtime, sess *session.Session
// returning nil when nothing is known so we can omit the field entirely
Contributor

so was it intentional, or should we remove/adjust this part of the comment?

Member Author

I'll remove the comment

Comment thread pkg/chatserver/server.go Outdated

if req.Stream {
-	err := s.streamChatCompletion(c, rt, sess, model)
+	err := s.streamChatCompletion(c, rt, sess, model, req.StreamOptions != nil && req.StreamOptions.IncludeUsage)
Contributor

I gotta say she's a beauty :) Can we maybe make a var? It's not entirely clear what it is at a glance.

Member Author

I'll just change it so that StreamOptions is not a pointer

The server now correctly sends usage at the end of the stream the same
way openai's API does.

Signed-off-by: Djordje Lukic <djordje.lukic@docker.com>
@rumpl rumpl force-pushed the fix-openai-usage-event branch from 64ef8a4 to 5daf796 on May 5, 2026 at 21:39
@dgageot dgageot merged commit 8840c3e into docker:main May 6, 2026
5 checks passed
3 participants