
fix: pydantic-ai stream LLM tokens as UiPathConversationMessageEvent#206

Merged
cristipufu merged 2 commits into main from fix/pydantic_ai_streaming_message_events on Mar 2, 2026

Conversation

@cristipufu
Member

Summary

  • Stream LLM tokens in real-time to the Chat UI using pydantic-ai's ModelRequestNode.stream() + stream_text(delta=True) instead of waiting for the full model response
  • Emit proper UiPathConversationMessageEvent payloads (START → CHUNK(s) → END lifecycle), fixing the 'dict' object has no attribute 'message_id' error in conversational mode
  • Strong typing for node parameters (ModelRequestNode, CallToolsNode, ModelResponse) replacing Any
  • Simplify message schema (remove unused toolCalls/interrupts/citations fields)
  • Bump version to 0.0.3

Test plan

  • All 60 existing tests pass
  • 6 new e2e streaming tests with mocked LLM:
    • Message events have proper message_id attribute
    • START → CHUNK(s) → END lifecycle is correct
    • Concatenated chunks reassemble to full response text
    • content_part_id is consistent across events
    • Agents with tools emit message events for text responses
    • Tool-only turns (no text) correctly skip message events
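The invariants listed in the test plan can be expressed as a single checker. The function below is an illustrative sketch using plain dicts; the actual e2e tests operate on UiPathConversationMessageEvent objects produced by the runtime with a mocked LLM.

```python
def check_message_lifecycle(events):
    """Validate the streaming invariants from the test plan:
    one START, only CHUNKs in between, one END, a single message_id,
    a single content_part_id, and chunks that reassemble the full text."""
    if not events:
        return ""  # tool-only turn: no message events emitted, which is valid
    kinds = [e["kind"] for e in events]
    assert kinds[0] == "start" and kinds[-1] == "end"
    assert all(k == "chunk" for k in kinds[1:-1])
    assert len({e["message_id"] for e in events}) == 1
    chunks = [e for e in events if e["kind"] == "chunk"]
    assert len({c["content_part_id"] for c in chunks}) == 1
    return "".join(c["delta"] for c in chunks)


sample = [
    {"kind": "start", "message_id": "m1", "content_part_id": None, "delta": None},
    {"kind": "chunk", "message_id": "m1", "content_part_id": "c1", "delta": "Hello "},
    {"kind": "chunk", "message_id": "m1", "content_part_id": "c1", "delta": "world"},
    {"kind": "end", "message_id": "m1", "content_part_id": None, "delta": None},
]
print(check_message_lifecycle(sample))  # Hello world
```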

🤖 Generated with Claude Code

Replace raw dict payloads with proper UiPathConversationMessageEvent objects
following the START -> CHUNK(s) -> END lifecycle. Use node.stream() +
stream_text(delta=True) for real-time token streaming to the Chat UI.

- Stream individual LLM tokens via pydantic-ai's ModelRequestNode.stream()
- Emit proper message lifecycle events (start, content chunks, end)
- Add strong typing for node parameters (ModelRequestNode, CallToolsNode)
- Simplify message schema (remove unused toolCalls/interrupts/citations)
- Add 6 e2e streaming tests with mocked LLM
- Bump version to 0.0.3

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@cristipufu force-pushed the fix/pydantic_ai_streaming_message_events branch from 985f57b to 1d16ea4 on March 2, 2026 at 09:27

- Use Agent.is_call_tools_node() / Agent.is_model_request_node() for
  type narrowing instead of private module imports
- Remove all imports from pydantic_ai._agent_graph
- Compare phase against UiPathRuntimeStatePhase.STARTED enum directly
- Run ruff format on all files

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@cristipufu merged commit 5cbf811 into main on Mar 2, 2026
53 checks passed
@cristipufu deleted the fix/pydantic_ai_streaming_message_events branch on March 2, 2026 at 11:02