
Conversation


@keenranger keenranger commented Jan 13, 2026

LangGraph supports not only message streams but also custom streams, and custom mode is the only way to control what gets streamed in the LiveKit-LangGraph integration.

To support that, this PR does the following:

  • Add stream_mode parameter supporting "messages" and "custom" modes
  • Enable multi-mode streaming for LangGraph's StreamWriter output
  • Extend _to_chat_chunk() to handle dict and object inputs
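As a rough sketch of the multi-mode behavior this enables (illustrative only: LangGraph's `astream` with a list `stream_mode` yields `(mode, data)` tuples, but the function and variable names below are assumptions, not the plugin's actual code):

```python
# Toy dispatcher for multi-mode LangGraph streaming. With
# astream(..., stream_mode=["messages", "custom"]), each yielded item is a
# (mode, data) tuple: "custom" carries whatever a node wrote via StreamWriter,
# while "messages" carries a (token_chunk, metadata) pair.
def dispatch(items):
    emitted = []
    for mode, data in items:
        if mode == "custom":
            emitted.append(data)          # forward the StreamWriter payload as-is
        elif mode == "messages":
            chunk, _metadata = data       # unpack the (chunk, metadata) pair
            emitted.append(chunk)
    return emitted
```

In the real integration, each emitted item would then be normalized (per this PR, via `_to_chat_chunk()`) before being forwarded to the LiveKit session.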

Summary by CodeRabbit

  • New Features
    • Added StreamMode parameter configuration for LangGraph streaming with support for "messages" and "custom" modes
    • Implemented flexible multi-mode streaming support to handle diverse payload formats
    • Enhanced streaming input processing to accept a wider range of data input shapes

✏️ Tip: You can customize this high-level summary in your review settings.


CLAassistant commented Jan 13, 2026

CLA assistant check
All committers have signed the CLA.


@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 7b0639d783

ℹ️ About Codex in GitHub

Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".

@keenranger keenranger marked this pull request as draft January 13, 2026 15:09
@keenranger keenranger marked this pull request as ready for review January 14, 2026 02:48

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 6eb8409988



keenranger commented Jan 14, 2026

@davidzhao I saw you've reviewed the related PR #3112 before.
If you have time, could you take a look at this one as well? Thanks. :)

@keenranger keenranger changed the title Add custom stream mode support in LangChain LLMAdapter feat(langgraph): add custom stream mode support in LangChain LLMAdapter Jan 15, 2026

coderabbitai bot commented Jan 19, 2026

Note

Other AI code review bot(s) detected

CodeRabbit has detected other AI code review bot(s) in this pull request and will avoid duplicating their findings in the review comments. This may lead to a less comprehensive review.

📝 Walkthrough

Walkthrough

Adds StreamMode-based streaming support to the LangGraph integration. Introduces validation and storage of stream_mode parameter in LLMAdapter and LangGraphStream, supporting both single-mode and multi-mode streaming with different chunk emission logic based on mode type.

Changes

  • StreamMode Parameter & Validation (livekit-plugins/livekit-plugins-langchain/livekit/plugins/langchain/langgraph.py): added the StreamMode import from langgraph.types; defined the _SUPPORTED_MODES constant ({"messages", "custom"}); extended LLMAdapter.__init__ and LangGraphStream.__init__ with a stream_mode parameter, storing it and validating it against the supported modes.
  • Multi-Mode & Single-Mode Streaming Logic (livekit-plugins/livekit-plugins-langchain/livekit/plugins/langchain/langgraph.py): modified LangGraphStream._run to detect and handle multi-mode (list) vs. single-mode streaming; in multi-mode, processes (mode, data) tuples with conditional emission ("custom" emits the payload, "messages" extracts the token chunk); in single-mode, processes items based on the configured stream_mode.
  • Chunk Conversion Extension (livekit-plugins/livekit-plugins-langchain/livekit/plugins/langchain/langgraph.py): extended _to_chat_chunk to accept dict inputs with a "content" key, in addition to string and BaseMessageChunk inputs.
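The extended input handling can be sketched as follows (an assumed shape only, not the plugin's code; the actual _to_chat_chunk presumably also wraps the extracted text in a chat-chunk object before emission):

```python
# Normalize a raw streamed payload into text content: accept plain strings,
# dicts carrying a "content" key, or objects exposing a .content attribute
# (e.g. a BaseMessageChunk). Return None when no string content is present.
def extract_content(raw):
    if isinstance(raw, str):
        return raw
    if isinstance(raw, dict):
        value = raw.get("content")
    else:
        value = getattr(raw, "content", None)
    return value if isinstance(value, str) else None
```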

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Poem

🐰 Hop hop, the stream modes now align,
Custom and messages, both so fine!
Multi-mode dancing, single stride,
LangGraph flows with flexible guide,
Dict chunks welcome, content extracted with care,
Streaming rabbits are everywhere! 🌊

🚥 Pre-merge checks | ✅ 2 | ❌ 1
❌ Failed checks (1 warning)
  • Docstring Coverage (⚠️ Warning): docstring coverage is 14.29%, below the required threshold of 80.00%. Resolution: write docstrings for the functions missing them to satisfy the coverage threshold.
✅ Passed checks (2 passed)
  • Description Check (✅ Passed): check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check (✅ Passed): the title accurately describes the main feature added: custom stream mode support for the LangChain LLMAdapter in the LangGraph integration.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.

✨ Finishing touches
  • 📝 Generate docstrings

📜 Recent review details

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 04b4d26 and c9bd9b2.

📒 Files selected for processing (1)
  • livekit-plugins/livekit-plugins-langchain/livekit/plugins/langchain/langgraph.py
🧰 Additional context used
📓 Path-based instructions (1)
**/*.py

📄 CodeRabbit inference engine (AGENTS.md)

**/*.py: Format code with ruff
Run ruff linter and auto-fix issues
Run mypy type checker in strict mode
Maintain line length of 100 characters maximum
Ensure Python 3.9+ compatibility
Use Google-style docstrings

Files:

  • livekit-plugins/livekit-plugins-langchain/livekit/plugins/langchain/langgraph.py
🧠 Learnings (2)
📓 Common learnings
Learnt from: keenranger
Repo: livekit/agents PR: 4511
File: livekit-plugins/livekit-plugins-langchain/livekit/plugins/langchain/langgraph.py:46-54
Timestamp: 2026-01-19T07:59:36.851Z
Learning: In the LiveKit LangChain LangGraph integration (`livekit-plugins-langchain/livekit/plugins/langchain/langgraph.py`), passing an empty list for `stream_mode` parameter (i.e., `stream_mode=[]`) is valid and intentional behavior—it allows users to opt out of streaming modes.
Learnt from: CR
Repo: livekit/agents PR: 0
File: AGENTS.md:0-0
Timestamp: 2026-01-16T07:44:56.353Z
Learning: Implement Model Interface Pattern for STT, TTS, LLM, and Realtime models with provider-agnostic interfaces, fallback adapters for resilience, and stream adapters for different streaming patterns
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
  • GitHub Check: type-check (3.13)
  • GitHub Check: type-check (3.9)
  • GitHub Check: unit-tests
🔇 Additional comments (5)
livekit-plugins/livekit-plugins-langchain/livekit/plugins/langchain/langgraph.py (5)

22-35: Clear supported-mode declaration and typing.


46-54: Validation for unsupported stream_mode is solid.


89-90: Nice propagation of stream_mode into the stream object.


105-117: Storing stream_mode on the stream instance is straightforward.


246-253: Broader payload support in _to_chat_chunk is helpful.

✏️ Tip: You can disable this entire section by setting review_details to false in your review settings.


Comment @coderabbitai help to get the list of available commands and usage tips.


@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

🤖 Fix all issues with AI agents
In `livekit-plugins/livekit-plugins-langchain/livekit/plugins/langchain/langgraph.py`:
- Around lines 141-175: in the multi-mode loop (is_multi_mode), unexpected items that are not 2-tuples with a string mode currently fall through into the single-mode checks and are silently dropped, because self._stream_mode is a list. Fix this by adding an explicit else/guard in the async for loop to handle unexpected tuple shapes or non-string modes: when an item arrives in multi-mode but does not satisfy isinstance(item, tuple) and len(item) == 2 and isinstance(mode, str), take a diagnostic path (e.g. log a warning via the same logger/context, or send an error chunk) and continue, and ensure the single-mode handling runs only when not is_multi_mode. Update the references around is_multi_mode, _stream_mode, _to_chat_chunk, _extract_message_chunk, and _event_ch.send_nowait to implement this defensive branch.
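The defensive branch the comment asks for could look roughly like this (a sketch only; the helper name and logger setup are illustrative, not the plugin's actual code):

```python
import logging

logger = logging.getLogger(__name__)

# Guard for the multi-mode loop: only well-formed (str_mode, data) 2-tuples
# are returned for dispatch; anything else is logged and skipped rather than
# silently falling through to the single-mode handling.
def split_multi_mode_item(item):
    if not (isinstance(item, tuple) and len(item) == 2 and isinstance(item[0], str)):
        logger.warning("unexpected multi-mode stream item: %r", item)
        return None
    return item  # (mode, data)
```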
🧹 Nitpick comments (1)
livekit-plugins/livekit-plugins-langchain/livekit/plugins/langchain/langgraph.py (1)

50-55: Consider validating non-empty stream_mode.

An empty list stream_mode=[] passes validation (no unsupported modes) but would set is_multi_mode=True in _run() with no modes to match, potentially causing unexpected behavior where all items are silently dropped.

💡 Suggested validation
         modes = {stream_mode} if isinstance(stream_mode, str) else set(stream_mode)
+        if not modes:
+            raise ValueError("stream_mode must specify at least one mode.")
         unsupported = modes - _SUPPORTED_MODES
📜 Review details

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 80f2e33 and 15f7e53.

📒 Files selected for processing (2)
  • .claude/settings.json
  • livekit-plugins/livekit-plugins-langchain/livekit/plugins/langchain/langgraph.py
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
  • GitHub Check: unit-tests
  • GitHub Check: type-check (3.9)
  • GitHub Check: type-check (3.13)
🔇 Additional comments (3)
.claude/settings.json (1)

1-10: LGTM!

The Claude settings file appropriately scopes permissions to development tooling (mypy, pytest, make) and restricts web fetches to the LangChain documentation domain. This follows the principle of least privilege.

livekit-plugins/livekit-plugins-langchain/livekit/plugins/langchain/langgraph.py (2)

34-36: LGTM!

Good design: the broad StreamMode type alias enables future extensibility and type-checker compatibility, while _SUPPORTED_MODES enforces runtime constraints. This separation is clean.


247-254: LGTM!

The extended _to_chat_chunk correctly handles dict and object inputs from StreamWriter. The defensive isinstance(raw, str) checks ensure only valid string content is processed.



@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

🤖 Fix all issues with AI agents
In `livekit-plugins/livekit-plugins-langchain/livekit/plugins/langchain/langgraph.py`:
- Around lines 46-54: in the constructor (__init__), add validation to reject empty list values for the stream_mode parameter: detect when stream_mode is an empty list (so that modes becomes an empty set) and raise a ValueError with a clear message, in the same style as the existing check against _SUPPORTED_MODES. This prevents downstream logic that checks is_multi_mode (e.g. is_multi_mode = isinstance(self._stream_mode, list)) from treating an empty list as multi-mode and stalling event emission. Ensure the check runs before computing unsupported = modes - _SUPPORTED_MODES and references stream_mode/self._stream_mode and _SUPPORTED_MODES so it fails fast on empty lists.
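Sketched out, that fail-fast check might look like the following (names mirror the snippets quoted in the review; this is an assumption, not the merged code, and the PR author has noted elsewhere in this thread that stream_mode=[] is intentional, so rejecting it is a judgment call):

```python
_SUPPORTED_MODES = {"messages", "custom"}

# Fail fast on an empty stream_mode list before computing the unsupported set,
# as the comment suggests; accepts either a single mode string or a list.
def validate_stream_mode(stream_mode):
    modes = {stream_mode} if isinstance(stream_mode, str) else set(stream_mode)
    if not modes:
        raise ValueError("stream_mode must specify at least one mode.")
    unsupported = modes - _SUPPORTED_MODES
    if unsupported:
        raise ValueError(f"Unsupported stream_mode: {sorted(unsupported)}")
    return modes
```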
📜 Review details

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 15f7e53 and 04b4d26.

📒 Files selected for processing (1)
  • livekit-plugins/livekit-plugins-langchain/livekit/plugins/langchain/langgraph.py
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
  • GitHub Check: type-check (3.9)
  • GitHub Check: type-check (3.13)
  • GitHub Check: unit-tests
🔇 Additional comments (4)
livekit-plugins/livekit-plugins-langchain/livekit/plugins/langchain/langgraph.py (4)

79-90: Config propagation looks good.

Passing stream_mode through the adapter keeps stream construction consistent.


94-118: Storing stream_mode on the stream is clear.

Keeps _run() logic straightforward and avoids re-deriving configuration.


119-175: Multi vs single mode handling is clean and readable.

The branching logic is easy to follow and matches the intended behavior.


236-253: Nice normalization for dict/object payloads.

Handling "content" in dicts/objects expands compatibility with custom stream payloads.

