
pyisolate 0.10.1 #12

Merged
pollockjj merged 4 commits into Comfy-Org:main from pollockjj:develop
Apr 5, 2026

Conversation

@pollockjj
Collaborator

pyisolate 0.10.1

Bugfix release restoring degraded functionality. 4 commits:

  1. chore: docs and infrastructure cleanup — Remove Sphinx, move benchmarks, rewrite README/CLAUDE, update bootstrap to [dev,test], remove obsolete docs.yml, remove ineffective -v CI flags
  2. feat: restore working examples with integration tests — Three self-contained examples (torch_share, sealed_worker, bwrap_torch_share) each loading their own extension module. Integration tests for all three isolation modes. Fixture try/finally safety.
  3. fix: graceful RPC shutdown — socket.SHUT_RDWR before close, sealed_worker auto-inject, version injection with error handling
  4. chore: bump version to 0.10.1 — Version strings + sealed worker test uses pollockjj.github.io/wheels/ index, @pytest.mark.network marker, dynamic version assertion

Remove Sphinx build infrastructure and docs.yml workflow. Move benchmark
scripts into benchmarks/ directory. Replace stale YAML manifests with
inline ExtensionConfig in examples. Rewrite CLAUDE.md and README.md to
remove application-specific references and clean library comments.
Update bootstrap command to [dev,test]. Add language tag to code block.
Remove ineffective -v flags from CI workflow example steps.

Replace stale single-example directory with three self-contained examples
(torch_share, sealed_worker, bwrap_torch_share), each with its own
host.py and extension module. Each example loads its own extension —
not the test harness. Add integration tests (test_rpc_coverage.py)
covering RPC lifecycle for all three isolation modes. Fixture cleanup
uses try/finally for leak safety. Dead code removed.

Fix RPC shutdown sequence to use socket.SHUT_RDWR before close,
preventing lingering connections. Auto-inject sealed_worker bootstrap
path when execution_model is sealed_worker. Use _detect_pyisolate_version()
with error handling instead of raw importlib.metadata call for version
injection in sealed_worker dependencies.
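The half-close pattern this fix describes can be sketched generically. The following standalone illustration (not pyisolate's actual transport code) shows why calling `shutdown(SHUT_RDWR)` before `close()` prevents lingering readers:

```python
import socket
import threading

def close_transport(sock: socket.socket) -> None:
    """Shut down both directions before closing, so any thread blocked in
    recv() on the peer is released immediately with EOF instead of lingering."""
    try:
        sock.shutdown(socket.SHUT_RDWR)
    except OSError:
        pass  # peer already gone; shutdown on a disconnected socket raises
    sock.close()

# Demonstration: a reader blocked in recv() wakes up with EOF once the
# other end is shut down, rather than hanging until process exit.
server, client = socket.socketpair()
unblocked = threading.Event()

def reader() -> None:
    if client.recv(4096) == b"":  # EOF delivered by the peer's shutdown()
        unblocked.set()

t = threading.Thread(target=reader)
t.start()
close_transport(server)
t.join(timeout=5)
print(unblocked.is_set())  # → True
```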
Copilot AI review requested due to automatic review settings April 5, 2026 06:52
@coderabbitai

coderabbitai bot commented Apr 5, 2026

📝 Walkthrough


This PR removes Sphinx documentation infrastructure, restructures example code from multi-extension orchestration patterns to focused torch_share and sealed_worker demonstrations, improves RPC shutdown thread-safety, and bumps the package version to 0.10.1 with Python 3.10+ as the minimum supported version.

Changes

| Cohort / File(s) | Summary |
|------------------|---------|
| **Documentation Infrastructure Removal**<br>`.github/workflows/docs.yml`, `docs/Makefile`, `docs/conf.py`, `docs/api.rst`, `docs/index.rst` | Removed entire Sphinx documentation build pipeline, configuration, and generated reference pages. |
| **Documentation Content Updates**<br>`README.md`, `CLAUDE.md`, `docs/debugging.md`, `docs/edge_cases.md`, `docs/rpc_protocol.md` | Updated documentation to reflect generic (non-ComfyUI) focus, clarified JSON-RPC as primary transport, adjusted TensorKeeper retention default (30s → 5s), updated minimum Python version to 3.10+, and removed ComfyUI-specific guidance. |
| **Example Cleanup**<br>`example/extensions/extension{1,2,3}/__init__.py`, `example/extensions/extension{1,2,3}/manifest.yaml`, `example/shared/__init__.py`, `example/host.py`, `example/main.py`, `README_COMFYUI.md` | Removed multi-extension orchestration pattern (Extension1/2/3, DatabaseSingleton, ExampleExtension base classes) and ComfyUI-specific documentation. |
| **New torch_share Example**<br>`example/torch_share/extension/__init__.py`, `example/torch_share/host.py` | Added new example demonstrating PyTorch tensor sharing over IPC with TorchShareExtension and host registration via MinimalAdapter. |
| **New sealed_worker Example**<br>`example/sealed_worker/extension/__init__.py`, `example/sealed_worker/host.py` | Added new example demonstrating sealed-worker isolation mode with version introspection and torch absence detection. |
| **New bwrap Example**<br>`example/bwrap_torch_share/host.py` | Added Linux-only example combining bubblewrap sandboxing with torch tensor sharing. |
| **Workflow Updates**<br>`.github/workflows/pytorch.yml`, `.github/workflows/windows.yml` | Updated PyTorch CI workflows to invoke `example/torch_share/host.py` instead of the legacy `example/main.py -v` pattern. |
| **RPC Protocol & Transport Hardening**<br>`pyisolate/_internal/rpc_protocol.py`, `pyisolate/_internal/rpc_transports.py` | Enhanced AsyncRPC.shutdown() with explicit thread-safe sequence (enqueue stop, close transport, join threads); added socket shutdown before close in JSONSocketTransport; improved _recv_thread unblock handling during shutdown. |
| **Environment & Core Updates**<br>`pyisolate/_internal/environment.py`, `pyisolate/_internal/host.py`, `pyisolate/_internal/client.py`, `pyisolate/sealed.py`, `pyisolate/interfaces.py`, `pyisolate/__init__.py`, `pyproject.toml` | Bumped version to 0.10.1, refined sealed_worker dependency resolution to use pinned package version, updated generic/non-ComfyUI documentation and comments throughout. |
| **Test Additions**<br>`tests/integration_v2/test_rpc_coverage.py`, `tests/test_uv_sealed_integration.py` | Added comprehensive RPC roundtrip tests (torch_share, sealed_worker, bwrap variants) and extended sealed runtime tests with venv pip verification. |
| **Configuration**<br>`pyproject.toml`, `tox.ini`, `benchmarks/README.md` | Bumped version to 0.10.1, dropped Python 3.9 from test environments (now 3.10+), added benchmark runner script documentation. |

Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Host as Host Process
    participant Ext as Extension Process
    participant Transport as JSONSocketTransport
    participant RPC as AsyncRPC

    Host->>RPC: create AsyncRPC(transport)
    Host->>RPC: run_until_stopped() [spawn _send/_recv threads]

    rect rgba(100, 150, 200, 0.5)
    Note over RPC: Normal RPC Operations
    Host->>RPC: call proxy.ping()
    RPC->>Transport: send JSON-RPC request
    Transport->>Ext: transmit length-prefixed JSON
    Ext->>Transport: send JSON-RPC response
    Transport->>RPC: receive response
    RPC->>Host: return "pong"
    end

    rect rgba(200, 100, 100, 0.5)
    Note over RPC: Explicit Shutdown Sequence
    Host->>RPC: shutdown()
    RPC->>RPC: set _stopping flag
    RPC->>RPC: enqueue None into outbox
    RPC->>Transport: close() [socket.shutdown() + close]
    Transport->>Transport: unblock _send_thread
    Transport->>Transport: unblock _recv_thread
    RPC->>RPC: resolve blocking_future
    RPC->>RPC: join threads (best-effort)
    RPC->>Host: shutdown complete
    end
```
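The shutdown path in the diagram (set a stop flag, enqueue a `None` sentinel to wake the send thread, then join best-effort) can be modeled with a toy class. All names below are illustrative simplifications, not pyisolate's real `AsyncRPC`:

```python
import queue
import threading

class MiniRPC:
    """Toy model of the diagram's sentinel-based shutdown sequence."""

    def __init__(self) -> None:
        self.outbox: "queue.Queue[str | None]" = queue.Queue()
        self.sent: list[str] = []
        self._stopping = False
        self._send_thread = threading.Thread(target=self._send_loop, daemon=True)
        self._send_thread.start()

    def _send_loop(self) -> None:
        while True:
            msg = self.outbox.get()  # blocks until a message or the sentinel
            if msg is None:          # sentinel: exit the loop cleanly
                return
            self.sent.append(msg)

    def call(self, payload: str) -> None:
        self.outbox.put(payload)

    def shutdown(self, timeout: float = 5.0) -> None:
        self._stopping = True            # refuse new work
        self.outbox.put(None)            # unblock the send thread
        self._send_thread.join(timeout)  # best-effort join

rpc = MiniRPC()
rpc.call("ping")
rpc.shutdown()
print(rpc.sent, rpc._send_thread.is_alive())  # → ['ping'] False
```

Because the queue is FIFO, the sentinel is always processed after any pending messages, so no queued work is dropped during shutdown.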


@codecov

codecov bot commented Apr 5, 2026

Codecov Report

❌ Patch coverage is 84.21053% with 3 lines in your changes missing coverage. Please review.

| Files with missing lines | Patch % | Lines |
|--------------------------|---------|-------|
| pyisolate/_internal/rpc_protocol.py | 78.57% | 2 Missing and 1 partial ⚠️ |

| Flag | Coverage Δ |
|------|------------|
| unittests | 63.69% <84.21%> (+3.14%) ⬆️ |

Flags with carried forward coverage won't be shown.

| Files with missing lines | Coverage Δ |
|--------------------------|------------|
| pyisolate/__init__.py | 73.33% <100.00%> (+1.90%) ⬆️ |
| pyisolate/_internal/client.py | 80.95% <ø> (-2.66%) ⬇️ |
| pyisolate/_internal/environment.py | 68.87% <100.00%> (+2.20%) ⬆️ |
| pyisolate/_internal/host.py | 69.66% <ø> (+9.04%) ⬆️ |
| pyisolate/_internal/rpc_transports.py | 32.71% <100.00%> (-0.76%) ⬇️ |
| pyisolate/_internal/sandbox.py | 72.87% <ø> (+10.75%) ⬆️ |
| pyisolate/interfaces.py | 100.00% <ø> (ø) |
| pyisolate/sealed.py | 39.83% <ø> (ø) |
| pyisolate/_internal/rpc_protocol.py | 57.06% <78.57%> (+1.36%) ⬆️ |

... and 4 files with indirect coverage changes


Copilot AI left a comment

Pull request overview

Release-focused update for pyisolate 0.10.1 that cleans up docs/infra, restores runnable examples with broader integration testing, and improves RPC shutdown behavior to avoid hangs and noisy teardown failures.

Changes:

  • Bump version to 0.10.1 and align supported Python to 3.10+.
  • Add new self-contained examples + new RPC integration coverage tests.
  • Improve RPC/socket shutdown behavior and sealed-worker dependency injection.

Reviewed changes

Copilot reviewed 41 out of 46 changed files in this pull request and generated 3 comments.

Show a summary per file

| File | Description |
|------|-------------|
| tox.ini | Drops py3.9 from tox envlist (py3.10+ focus). |
| pyproject.toml | Version bump to 0.10.1 (and requires-python already >=3.10). |
| pyisolate/__init__.py | Updates `__version__` to 0.10.1. |
| README.md | Simplifies docs, updates requirements to Python 3.10+, clarifies features/execution models. |
| README_COMFYUI.md | Removes ComfyUI-specific README. |
| CLAUDE.md | Updates repo maintenance guidance and architecture notes for 0.10.1. |
| pyisolate/interfaces.py | Removes ComfyUI-specific wording from public protocol docs. |
| pyisolate/sealed.py | Removes ComfyUI-specific wording from sealed worker wrapper docs. |
| pyisolate/_internal/client.py | Removes ComfyUI-specific wording from bootstrap comment. |
| pyisolate/_internal/host.py | Removes ComfyUI-specific wording from Windows + GPU comments. |
| pyisolate/_internal/sandbox.py | Removes app-specific wording from sandbox comments. |
| pyisolate/_internal/environment.py | Changes sealed-worker dependency injection behavior (install pyisolate by version). |
| pyisolate/_internal/rpc_transports.py | Ensures sockets shutdown before close for cleaner teardown. |
| pyisolate/_internal/rpc_protocol.py | More robust shutdown: unblock send/recv threads and join threads; unblock run_until_stopped on recv failures. |
| tests/test_uv_sealed_integration.py | Marks as network, adds child-venv verification that pyisolate/version are installed. |
| tests/integration_v2/test_rpc_coverage.py | Adds integration tests exercising key AsyncRPC lifecycle paths across modes. |
| example/torch_share/host.py | New standalone host-coupled torch_share example runner. |
| example/torch_share/extension/__init__.py | New torch_share child extension module. |
| example/sealed_worker/host.py | New sealed_worker example runner. |
| example/sealed_worker/extension/__init__.py | New sealed_worker child extension module. |
| example/bwrap_torch_share/host.py | New Linux-only bwrap + torch_share example runner. |
| example/main.py | Removes legacy example entrypoint. |
| example/host.py | Removes legacy multi-extension example host. |
| example/shared/__init__.py | Removes legacy shared example helpers. |
| example/extensions/extension1/manifest.yaml | Removes legacy example extension1 manifest. |
| example/extensions/extension1/__init__.py | Removes legacy example extension1 implementation. |
| example/extensions/extension2/manifest.yaml | Removes legacy example extension2 manifest. |
| example/extensions/extension2/__init__.py | Removes legacy example extension2 implementation. |
| example/extensions/extension3/manifest.yaml | Removes legacy example extension3 manifest. |
| example/extensions/extension3/__init__.py | Removes legacy example extension3 implementation. |
| docs/rpc_protocol.md | Updates transport-layer documentation to match current primary transport. |
| docs/debugging.md | Updates TensorKeeper retention default to 5.0s in docs. |
| docs/edge_cases.md | Updates TensorKeeper retention default to 5.0s in docs. |
| docs/Makefile | Removes Sphinx Makefile. |
| docs/conf.py | Removes Sphinx config. |
| docs/index.rst | Removes Sphinx index. |
| docs/api.rst | Removes Sphinx autodoc page. |
| .github/workflows/pytorch.yml | Updates example run command to new torch_share host script. |
| .github/workflows/windows.yml | Updates example run command to new torch_share host script. |
| .github/workflows/docs.yml | Removes Sphinx docs build/deploy workflow. |
| benchmarks/README.md | Documents new benchmark runner scripts and instructions doc. |
| benchmarks/INSTRUCTIONS.md | Adds benchmark run instructions. |
| benchmarks/run_benchmarks_linux.sh | Adds Linux/macOS benchmark runner. |
| benchmarks/run_benchmarks_windows.bat | Adds Windows CMD benchmark runner. |
| benchmarks/run_benchmarks_windows.ps1 | Adds Windows PowerShell benchmark runner. |
| benchmarks/run_benchmarks_powershell_launcher.bat | Adds wrapper to help run PS script under execution policy constraints. |


Comment on lines 380 to 385

```diff
 safe_deps: list[str] = []
 if config.get("execution_model") == "sealed_worker":
-    safe_deps.append(str(Path(__file__).resolve().parents[2]))
+    safe_deps.append(f"pyisolate=={pyisolate_version}")
 for dep in config["dependencies"]:
     validate_dependency(dep)
     safe_deps.append(dep)
```

Copilot AI Apr 5, 2026


In sealed_worker mode this auto-injects pyisolate=={pyisolate_version}. _detect_pyisolate_version() currently falls back to "0.0.0" when the distribution metadata isn’t available (e.g., running from a source checkout via sys.path), which will make dependency installation fail because pyisolate==0.0.0 can’t be satisfied. Consider deriving the version from pyisolate.__version__ as a fallback and/or skipping injection when the version can’t be reliably determined or when the user already provided a pyisolate dependency (editable/path) in config['dependencies'].

Comment on lines +100 to +104

```python
if os.name == "nt":
    child_python = Path(ext.venv_path) / "Scripts" / "python.exe"
else:
    child_python = Path(ext.venv_path) / "bin" / "python3"
pip_show = subprocess.run(
```

Copilot AI Apr 5, 2026


The test assumes the child venv has bin/python3, but venv/uv venv doesn’t universally create a python3 shim (whereas bin/python is expected and is what the extension launcher uses). Using bin/python (or probing for python, python3, python3.X) would make this assertion more robust across platforms/distros.

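A probing approach along the lines Copilot suggests might look like this; `find_venv_python` and its `windows` parameter are illustrative names, not code from the PR:

```python
import os
from pathlib import Path

def find_venv_python(venv_path: str, windows: bool = (os.name == "nt")) -> Path:
    """Probe common interpreter names instead of assuming bin/python3 exists:
    venv and uv always create bin/python, but a python3 shim is not guaranteed."""
    if windows:
        candidates = [Path(venv_path) / "Scripts" / "python.exe"]
    else:
        bindir = Path(venv_path) / "bin"
        candidates = [bindir / "python", bindir / "python3"]
    for candidate in candidates:
        if candidate.exists():
            return candidate
    raise FileNotFoundError(f"no Python interpreter found under {venv_path}")
```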
Comment on lines 1 to 5

```diff
 [tox]
-envlist = py{39,310,311,312}, lint, type
+envlist = py{310,311,312}, lint, type
 isolated_build = True

 [testenv]
```

Copilot AI Apr 5, 2026


[testenv:docs] is still defined below, but the Sphinx config (docs/conf.py, docs/index.rst, etc.) is removed in this PR. As-is, tox -e docs will fail. Either remove the docs tox env or update it to whatever the new docs build process is (or explicitly drop docs support in tox).


Bump version strings to 0.10.1 in pyproject.toml and __init__.py.
Sealed worker integration test uses pollockjj.github.io/wheels/ index
for pyisolate resolution instead of PyPI. Add @pytest.mark.network
marker. Replace hardcoded version assertion with dynamic lookup.

@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 13

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
pyisolate/_internal/environment.py (1)

235-242: 🧹 Nitpick | 🔵 Trivial

Consider improving error messaging when version detection fails.

If importlib_metadata.version("pyisolate") fails (e.g., during development without a proper install), pyisolate_version becomes "0.0.0", and the subsequent pip install pyisolate==0.0.0 will fail with a confusing "no matching distribution" error.

This might trip up developers—a stumble that's a bit of a fumble! Consider logging a warning when falling back to "0.0.0" to clarify the root cause.

💡 Suggested improvement

```diff
 def _detect_pyisolate_version() -> str:
     try:
         return importlib_metadata.version("pyisolate")
     except Exception:
+        logger.warning(
+            "Could not detect pyisolate version from package metadata. "
+            "sealed_worker bootstrap may fail if pyisolate is not properly installed."
+        )
         return "0.0.0"
```
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@pyisolate/_internal/environment.py` around lines 235 - 242,
_detect_pyisolate_version currently swallows all exceptions and returns "0.0.0",
which leads to confusing pip errors; modify _detect_pyisolate_version to catch
the exception from importlib_metadata.version("pyisolate") and emit a clear
warning (including the exception message) before returning the fallback value
used by pyisolate_version (or choose a more explicit fallback like None),
referencing the symbols _detect_pyisolate_version, importlib_metadata.version,
and pyisolate_version so the log explains that version detection failed (e.g.,
development/install context) and shows the underlying error to aid debugging.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@CLAUDE.md`:
- Line 51: The README contains a hardcoded "~50 test functions" count that will
become stale; edit CLAUDE.md to remove or generalize that phrase (e.g., replace
"~50 test functions" with "multiple test functions", "dozens of tests", or omit
the count entirely) so the description remains accurate without requiring
updates when tests change; search for the literal "~50 test functions" in
CLAUDE.md and update the sentence accordingly.
- Line 7: The README hardcodes the pyisolate release string "v0.10.1", which
will become stale; update CLAUDE.md by removing the fixed version token and
either (a) omit the version entirely and keep "PyPI: `pyisolate`", or (b)
replace the literal "v0.10.1" with a dynamic reference such as a PyPI/latest
badge or wording like "latest" that links to the PyPI page; update the line
containing the symbol pyisolate and the string "v0.10.1" accordingly.
- Around line 109-117: Update the environment variables table to reflect the
actual, actively-used variables: add entries for PYISOLATE_HOST_SNAPSHOT,
PYISOLATE_MODULE_PATH, PYISOLATE_UV_CACHE_DIR, PYISOLATE_UDS_ADDRESS,
PYISOLATE_EXTENSION, PYISOLATE_IMPORT_TORCH, PYISOLATE_FORCE_CUDA_IPC,
PYISOLATE_PURGE_SENDER_SHM, PYISOLATE_SIGNAL_CLEANUP,
PYISOLATE_CPU_BORROWED_SHM, and PYISOLATE_CPU_TENSOR_FORCE_CLONE_ON_DESERIALIZE
with concise purpose descriptions matching their usage in the codebase (e.g.,
host snapshot, module search paths, UV cache dir, UDS address, extension
loading, torch import control, CUDA IPC forcing, SHM purge control, signal
cleanup, CPU SHM management, and CPU tensor clone-on-deserialize behavior);
remove PYISOLATE_PATH_DEBUG and PYISOLATE_ENFORCE_SANDBOX from the main table
(or move them to a clearly labeled legacy/experimental section) so the
documentation matches actual implementation.

In `@docs/rpc_protocol.md`:
- Around line 221-230: Update the Overview to state that JSONSocketTransport is
the primary/default transport and that QueueTransport is a
legacy/backward-compatibility path only (not a co-equal default); specifically
modify the Overview wording to mirror the JSONSocketTransport/QueueTransport
descriptions so QueueTransport is explicitly labeled "legacy" and not presented
as an alternative default, referencing the transport names JSONSocketTransport
and QueueTransport and ensuring the overview language aligns with the detailed
sections.

In `@example/bwrap_torch_share/host.py`:
- Around line 70-75: The temp dir created by tempfile.mkdtemp (assigned to tmp)
is never removed, causing resource leaks; update the finally block to remove tmp
and its contents (including venv_root and shared_tmp) using shutil.rmtree(tmp,
ignore_errors=True) after restoring any modified environment variables (TMPDIR)
to ensure cleanup; reference the tempfile.mkdtemp call that sets tmp, the
venv_root and shared_tmp paths, and the environment key "TMPDIR" so you remove
the entire temporary tree and avoid leaving orphaned directories.

In `@example/sealed_worker/host.py`:
- Around line 56-58: The temp dir created by
tempfile.mkdtemp(prefix="sealed_worker_example_") (variable tmp) is never
removed; update host.py to import shutil at the top, ensure tmp is defined in
the outer scope (e.g., tmp = None before the try) and in the existing finally
block call shutil.rmtree(tmp, ignore_errors=True) (or conditional
shutil.rmtree(tmp) only if tmp is truthy) to remove the temp directory that
contains venv_root; reference the tmp and venv_root variables so cleanup runs
regardless of earlier failures.

In `@example/torch_share/host.py`:
- Around line 60-67: The temp workspace created with tmp (and derived venv_root
and shared_tmp) is not removed, leaving stale directories; update the code that
creates tmp/venvs/ipc_shared to ensure cleanup in a finally block by removing
tmp (e.g., shutil.rmtree(tmp, ignore_errors=True) or equivalent) after use, and
apply the same cleanup for the similar allocation at the later block (lines
around the second tmp/venv/shared_tmp creation) so both temporary workspaces are
reliably deleted even on exceptions.

In `@README.md`:
- Around line 92-94: The README shows enabling CUDA IPC via share_cuda_ipc=True
but the host initialization logic (in pyisolate/_internal/host.py, inside the
host init routine) reads config.get("n", False), causing the flag to be ignored;
update the host init to read the same config key used by the README (e.g.,
"share_cuda_ipc" or map the README name to the existing internal key) and use
that boolean to enable CUDA IPC, or alternatively change the README to document
the existing key; modify the config access in the host init function and any
related conditionals that enable CUDA IPC so the runtime honors the documented
share_cuda_ipc flag.
- Line 199: The transport feature line in README.md is too Linux-specific;
update the line that currently reads "Bidirectional JSON-RPC over Unix Domain
Sockets" to mention the Windows TCP fallback so it matches the transport docs
(e.g., reference the phrase "Bidirectional JSON-RPC over Unix Domain Sockets"
and change it to include "or TCP on Windows" / "UDS or TCP on Windows"). Locate
and edit that exact feature line in README.md so the wording aligns with the
transport documentation.
- Around line 126-132: The README currently documents an execution model named
"host-coupled" which doesn't match the allowed values in ExtensionConfig; update
the text to use the actual accepted value "subprocess" (the config field is
execution_model on ExtensionConfig) and ensure the two bullets read "subprocess"
(default): Child process shares the host's torch runtime... and "sealed_worker":
Fully isolated child..., so examples and docs match the accepted API strings.

In `@tests/integration_v2/test_rpc_coverage.py`:
- Around line 18-23: The helper _run currently creates and closes an event loop
manually; replace its body to call asyncio.run(coro) so the function delegates
lifecycle management to asyncio.run (Python 3.7+)—update the _run function to
return asyncio.run(coro) and remove manual loop creation/close while keeping the
same function name to preserve call sites.
- Line 77: The dependency entry uses str(pyisolate_root) which is inconsistent
with the editable install format used elsewhere (e.g., f"-e {pyisolate_root}");
update the dependencies list in the test (the dependencies=[...] usage
referencing pyisolate_root) to use the same format as examples by replacing
str(pyisolate_root) with f"-e {pyisolate_root}" (or alternatively add a short
comment explaining why a non-editable install is intentionally used if you
choose to keep str(pyisolate_root)).

In `@tests/test_uv_sealed_integration.py`:
- Around line 90-117: The test should first assert that the pip_show subprocess
succeeded by checking pip_show.returncode and, on failure, raise/pytest.fail
with pip_show.stderr to make failures explicit; then parse pip_show.stdout for
the exact "Version: <value>" line (e.g. find the line starting with "Version:"
or use a regex to capture the version token) and compare that captured version
string for equality with expected_version instead of using a substring
containment check; update references in the test to use pip_show,
expected_version, and child_python when locating the change.


ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: ASSERTIVE

Plan: Pro

Run ID: d7331ae2-7774-42b5-af09-19dae608a9e1

📥 Commits

Reviewing files that changed from the base of the PR and between 51bb318 and 51eb0e6.

📒 Files selected for processing (46)
  • .github/workflows/docs.yml
  • .github/workflows/pytorch.yml
  • .github/workflows/windows.yml
  • CLAUDE.md
  • README.md
  • README_COMFYUI.md
  • benchmarks/INSTRUCTIONS.md
  • benchmarks/README.md
  • benchmarks/run_benchmarks_linux.sh
  • benchmarks/run_benchmarks_powershell_launcher.bat
  • benchmarks/run_benchmarks_windows.bat
  • benchmarks/run_benchmarks_windows.ps1
  • docs/Makefile
  • docs/api.rst
  • docs/conf.py
  • docs/debugging.md
  • docs/edge_cases.md
  • docs/index.rst
  • docs/rpc_protocol.md
  • example/bwrap_torch_share/host.py
  • example/extensions/extension1/__init__.py
  • example/extensions/extension1/manifest.yaml
  • example/extensions/extension2/__init__.py
  • example/extensions/extension2/manifest.yaml
  • example/extensions/extension3/__init__.py
  • example/extensions/extension3/manifest.yaml
  • example/host.py
  • example/main.py
  • example/sealed_worker/extension/__init__.py
  • example/sealed_worker/host.py
  • example/shared/__init__.py
  • example/torch_share/extension/__init__.py
  • example/torch_share/host.py
  • pyisolate/__init__.py
  • pyisolate/_internal/client.py
  • pyisolate/_internal/environment.py
  • pyisolate/_internal/host.py
  • pyisolate/_internal/rpc_protocol.py
  • pyisolate/_internal/rpc_transports.py
  • pyisolate/_internal/sandbox.py
  • pyisolate/interfaces.py
  • pyisolate/sealed.py
  • pyproject.toml
  • tests/integration_v2/test_rpc_coverage.py
  • tests/test_uv_sealed_integration.py
  • tox.ini
💤 Files with no reviewable changes (15)
  • docs/api.rst
  • example/extensions/extension1/manifest.yaml
  • example/extensions/extension2/manifest.yaml
  • example/extensions/extension3/manifest.yaml
  • docs/index.rst
  • docs/conf.py
  • docs/Makefile
  • README_COMFYUI.md
  • example/main.py
  • example/extensions/extension2/init.py
  • example/extensions/extension3/init.py
  • example/extensions/extension1/init.py
  • example/shared/init.py
  • example/host.py
  • .github/workflows/docs.yml

## Identity

```diff
-**pyisolate** is a Python library for running extensions across multiple isolated virtual environments with RPC communication. It solves dependency conflicts by isolating extensions in separate venvs while maintaining seamless host-extension communication through AsyncRPC.
+**pyisolate** is a Python library (PyPI: `pyisolate`, v0.10.1) for running extensions in isolated virtual environments with seamless inter-process communication. It provides dependency isolation, zero-copy tensor transfer, and bubblewrap sandboxing for GPU-heavy workloads.
```

⚠️ Potential issue | 🟡 Minor

Hardcoded version will need updating with each release.

The version string "v0.10.1" is hardcoded and will become stale after the next release. Consider whether this maintenance burden is acceptable or if the version should be referenced dynamically or omitted.


| Directory | Purpose |
|-----------|---------|
| `tests/` | Unit and integration tests (pytest, ~50 test functions) |
| `example/` | Working 3-extension demo showing dependency isolation (numpy 1.x vs 2.x) |


⚠️ Potential issue | 🟡 Minor

Hardcoded test count will drift, to wit!

The "~50 test functions" is a hardcoded count that will become stale as tests are added or removed. Consider whether this number adds value or if it should be omitted to reduce maintenance burden.


Comment on lines 109 to +117

```diff
 ### RPC Patterns
 - Extensions can call host methods recursively (host→extension→host)
 - Shared singletons work transparently via RPC proxying
 - Context tracking prevents circular calls
+| Variable | Purpose |
+|----------|---------|
+| `PYISOLATE_CHILD` | `"1"` in child processes |
+| `PYISOLATE_DEBUG_RPC` | `"1"` for verbose RPC logging |
+| `PYISOLATE_TRACE_FILE` | Path for structured perf trace JSONL output |
+| `PYISOLATE_ENABLE_CUDA_IPC` | `"1"` to enable CUDA IPC tensor transport |
+| `PYISOLATE_PATH_DEBUG` | `"1"` for sys.path logging during child init |
+| `PYISOLATE_ENFORCE_SANDBOX` | Force bwrap sandboxing |
```


⚠️ Potential issue | 🔴 Critical

🧩 Analysis chain

🏁 Script executed:

```shell
#!/bin/bash
# Description: Search for environment variable references in the codebase

for var in PYISOLATE_CHILD PYISOLATE_DEBUG_RPC PYISOLATE_TRACE_FILE PYISOLATE_ENABLE_CUDA_IPC PYISOLATE_PATH_DEBUG PYISOLATE_ENFORCE_SANDBOX; do
  echo "=== Checking $var ==="
  rg -n "$var" pyisolate/ --type py | head -5
done

# Also search for any PYISOLATE_ variables not in the list
echo "=== Searching for undocumented PYISOLATE_ variables ==="
rg -n 'PYISOLATE_[A-Z_]+' pyisolate/ --type py -o | sort -u
```

Repository: Comfy-Org/pyisolate

Length of output: 4908


Update the environment variables table with missing variables and remove unused entries.

The documented variables are incomplete and contain inaccuracies. Two variables (PYISOLATE_PATH_DEBUG and PYISOLATE_ENFORCE_SANDBOX) are documented but not actually used in the codebase, while 11+ environment variables are actively referenced but missing from the table. This misdirects developers: the table as written is far from airtight.

Add these actively-used variables to the table:

  • PYISOLATE_HOST_SNAPSHOT — Host Python snapshot for initialization
  • PYISOLATE_MODULE_PATH — Module search paths for child process
  • PYISOLATE_UV_CACHE_DIR — UV package manager cache directory
  • PYISOLATE_UDS_ADDRESS — Unix domain socket address for IPC
  • PYISOLATE_EXTENSION — Extension modules to load in child
  • PYISOLATE_IMPORT_TORCH — Control torch import behavior
  • PYISOLATE_FORCE_CUDA_IPC — Force CUDA IPC (distinct from enable flag)
  • PYISOLATE_PURGE_SENDER_SHM — Shared memory cleanup control
  • PYISOLATE_SIGNAL_CLEANUP — Signal-based cleanup flag
  • PYISOLATE_CPU_BORROWED_SHM — CPU tensor shared memory management
  • PYISOLATE_CPU_TENSOR_FORCE_CLONE_ON_DESERIALIZE — CPU tensor deserialization behavior

Remove PYISOLATE_PATH_DEBUG and PYISOLATE_ENFORCE_SANDBOX if they are not production features or move them to a separate "legacy/experimental" section if applicable.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@CLAUDE.md` around lines 109 - 117, Update the environment variables table to
reflect the actual, actively-used variables: add entries for
PYISOLATE_HOST_SNAPSHOT, PYISOLATE_MODULE_PATH, PYISOLATE_UV_CACHE_DIR,
PYISOLATE_UDS_ADDRESS, PYISOLATE_EXTENSION, PYISOLATE_IMPORT_TORCH,
PYISOLATE_FORCE_CUDA_IPC, PYISOLATE_PURGE_SENDER_SHM, PYISOLATE_SIGNAL_CLEANUP,
PYISOLATE_CPU_BORROWED_SHM, and PYISOLATE_CPU_TENSOR_FORCE_CLONE_ON_DESERIALIZE
with concise purpose descriptions matching their usage in the codebase (e.g.,
host snapshot, module search paths, UV cache dir, UDS address, extension
loading, torch import control, CUDA IPC forcing, SHM purge control, signal
cleanup, CPU SHM management, and CPU tensor clone-on-deserialize behavior);
remove PYISOLATE_PATH_DEBUG and PYISOLATE_ENFORCE_SANDBOX from the main table
(or move them to a clearly labeled legacy/experimental section) so the
documentation matches actual implementation.

Comment on lines +221 to 230
The primary transport is `JSONSocketTransport` — length-prefixed JSON over Unix Domain Sockets (or TCP on Windows). All isolation modes use this transport.

### QueueTransport
### JSONSocketTransport (Primary)

Uses `multiprocessing.Queue` for communication. Used when subprocess isolation is via `multiprocessing.Process`.
Uses length-prefixed JSON-RPC over raw sockets. No pickle. This is the standard transport for all Linux isolation modes (sandboxed and non-sandboxed) and Windows TCP mode.

### UDSTransport
### QueueTransport (Legacy)

Uses Unix Domain Sockets for communication. Used when subprocess isolation is via `bubblewrap` sandbox.
Uses `multiprocessing.Queue` for communication. This is a legacy backward-compatibility path and is not used in current isolation modes.


⚠️ Potential issue | 🟡 Minor

Align the Overview wording with this new transport stance.

This section correctly defines JSONSocketTransport as primary, but Line 7 still presents queues as a co-equal default path. Please update the Overview to mark queues as legacy too, so the docs don’t “queue” mixed signals.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@docs/rpc_protocol.md` around lines 221 - 230, Update the Overview to state
that JSONSocketTransport is the primary/default transport and that
QueueTransport is a legacy/backward-compatibility path only (not a co-equal
default); specifically modify the Overview wording to mirror the
JSONSocketTransport/QueueTransport descriptions so QueueTransport is explicitly
labeled "legacy" and not presented as an alternative default, referencing the
transport names JSONSocketTransport and QueueTransport and ensuring the overview
language aligns with the detailed sections.
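
The length-prefixed JSON framing described above can be sketched as follows. The 4-byte big-endian length header is an assumption for illustration; pyisolate's actual wire layout may differ.

```python
import json
import socket
import struct

# Minimal sketch of length-prefixed JSON framing in the spirit of
# JSONSocketTransport: a 4-byte big-endian length header followed by
# a UTF-8 JSON payload. No pickle anywhere on the wire.

def send_msg(sock: socket.socket, obj) -> None:
    payload = json.dumps(obj).encode("utf-8")
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_exact(sock: socket.socket, n: int) -> bytes:
    # recv() may return short reads; loop until the full frame arrives
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed mid-frame")
        buf += chunk
    return buf

def recv_msg(sock: socket.socket):
    (length,) = struct.unpack(">I", recv_exact(sock, 4))
    return json.loads(recv_exact(sock, length).decode("utf-8"))

if __name__ == "__main__":
    a, b = socket.socketpair()  # stands in for the UDS connection
    send_msg(a, {"method": "ping", "id": 1})
    print(recv_msg(b))  # {'method': 'ping', 'id': 1}
```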

Comment on lines +70 to +75
tmp = tempfile.mkdtemp(prefix="bwrap_torch_share_")
venv_root = os.path.join(tmp, "venvs")
os.makedirs(venv_root, exist_ok=True)
shared_tmp = os.path.join(tmp, "ipc_shared")
os.makedirs(shared_tmp, exist_ok=True)
os.environ["TMPDIR"] = shared_tmp

⚠️ Potential issue | 🟡 Minor

Temporary directory is never cleaned up — potential resource leak.

The tempfile.mkdtemp() call creates a directory at Line 70, but it is never removed in the finally block. Each run leaves an orphaned directory tree in the system temp location.

🧹 Proposed fix to clean up temp directory
+import shutil
+
 async def main() -> int:
     # ... existing code ...
     finally:
         if ext is not None:
             with contextlib.suppress(Exception):
                 ext.stop()
         AdapterRegistry.unregister()
+        with contextlib.suppress(Exception):
+            shutil.rmtree(tmp)

Add the import at the top and the cleanup in the finally block.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@example/bwrap_torch_share/host.py` around lines 70 - 75, The temp dir created
by tempfile.mkdtemp (assigned to tmp) is never removed, causing resource leaks;
update the finally block to remove tmp and its contents (including venv_root and
shared_tmp) using shutil.rmtree(tmp, ignore_errors=True) after restoring any
modified environment variables (TMPDIR) to ensure cleanup; reference the
tempfile.mkdtemp call that sets tmp, the venv_root and shared_tmp paths, and the
environment key "TMPDIR" so you remove the entire temporary tree and avoid
leaving orphaned directories.
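
As an alternative to an explicit shutil.rmtree, tempfile.TemporaryDirectory ties removal to scope exit. A minimal sketch mirroring the example's variable names (the TMPDIR save/restore sequencing here is illustrative):

```python
import os
import tempfile

# Sketch: TemporaryDirectory removes the whole tree (venvs, ipc_shared)
# on exit, so no manual cleanup in a finally block is needed.
old_tmpdir = os.environ.get("TMPDIR")
with tempfile.TemporaryDirectory(prefix="bwrap_torch_share_") as tmp:
    venv_root = os.path.join(tmp, "venvs")
    os.makedirs(venv_root, exist_ok=True)
    shared_tmp = os.path.join(tmp, "ipc_shared")
    os.makedirs(shared_tmp, exist_ok=True)
    os.environ["TMPDIR"] = shared_tmp
    # ... start and use the extension here ...

# The tree is gone after the with-block; restore the original TMPDIR.
if old_tmpdir is None:
    os.environ.pop("TMPDIR", None)
else:
    os.environ["TMPDIR"] = old_tmpdir
print(os.path.exists(tmp))  # False
```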

Comment on lines +126 to 132
### Execution Models

A complete pyisolate application requires a special `main.py` entry point to handle virtual environment activation:
pyisolate provides two execution models:

```python
# main.py
if __name__ == "__main__":
    # When running as the main script, import and run your host application
    from host import main
    main()
else:
    # When imported by extension processes, ensure venv is properly activated
    import os
    import site
    import sys

    if os.name == "nt":  # Windows-specific venv activation
        venv = os.environ.get("VIRTUAL_ENV", "")
        if venv != "":
            sys.path.insert(0, os.path.join(venv, "Lib", "site-packages"))
            site.addsitedir(os.path.join(venv, "Lib", "site-packages"))
```
- **`host-coupled`** (default): Child process shares the host's torch runtime and can use zero-copy tensor transfer via `/dev/shm` and CUDA IPC.
- **`sealed_worker`**: Fully isolated child with its own interpreter. No host `sys.path` reconstruction, JSON-RPC tensor transport only.


⚠️ Potential issue | 🟠 Major

Execution model name likely mismatches accepted API value.

This section uses host-coupled, while ExtensionConfig docs indicate execution_model values are "subprocess" or "sealed_worker" (pyisolate/config.py, Lines 1-80). Please use the actual accepted value here to avoid copy-paste failures.

Suggested README fix
-- **`host-coupled`** (default): Child process shares the host's torch runtime and can use zero-copy tensor transfer via `/dev/shm` and CUDA IPC.
+- **`subprocess`** (default): Child process shares the host's torch runtime and can use zero-copy tensor transfer via `/dev/shm` and CUDA IPC.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@README.md` around lines 126 - 132, The README currently documents an
execution model named "host-coupled" which doesn't match the allowed values in
ExtensionConfig; update the text to use the actual accepted value "subprocess"
(the config field is execution_model on ExtensionConfig) and ensure the two
bullets read "subprocess" (default): Child process shares the host's torch
runtime... and "sealed_worker": Fully isolated child..., so examples and docs
match the accepted API strings.

## Use Cases
### Core
- Automatic virtual environment management
- Bidirectional JSON-RPC over Unix Domain Sockets (no pickle)

⚠️ Potential issue | 🟡 Minor

Transport feature line is too Linux-specific for current behavior.

Bidirectional JSON-RPC over Unix Domain Sockets omits Windows TCP mode, while transport docs state UDS or TCP on Windows. Small wording tweak keeps expectations in tune.

Suggested README wording tweak
-- Bidirectional JSON-RPC over Unix Domain Sockets (no pickle)
+- Bidirectional JSON-RPC over Unix Domain Sockets (or TCP on Windows), no pickle
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@README.md` at line 199, The transport feature line in README.md is too
Linux-specific; update the line that currently reads "Bidirectional JSON-RPC
over Unix Domain Sockets" to mention the Windows TCP fallback so it matches the
transport docs (e.g., reference the phrase "Bidirectional JSON-RPC over Unix
Domain Sockets" and change it to include "or TCP on Windows" / "UDS or TCP on
Windows"). Locate and edit that exact feature line in README.md so the wording
aligns with the transport documentation.

Comment on lines +18 to +23
def _run(coro):
    loop = asyncio.new_event_loop()
    try:
        return loop.run_until_complete(coro)
    finally:
        loop.close()

🧹 Nitpick | 🔵 Trivial

Consider using asyncio.run() instead of manual loop management.

The _run() helper manually creates/closes an event loop. Since Python 3.7+, asyncio.run() handles this cleanly and is the recommended approach. However, if there's a specific reason for the manual approach (e.g., compatibility with the test harness), it's fine to keep.

♻️ Simpler alternative using asyncio.run()
 def _run(coro):
-    loop = asyncio.new_event_loop()
-    try:
-        return loop.run_until_complete(coro)
-    finally:
-        loop.close()
+    return asyncio.run(coro)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tests/integration_v2/test_rpc_coverage.py` around lines 18 - 23, The helper
_run currently creates and closes an event loop manually; replace its body to
call asyncio.run(coro) so the function delegates lifecycle management to
asyncio.run (Python 3.7+)—update the _run function to return asyncio.run(coro)
and remove manual loop creation/close while keeping the same function name to
preserve call sites.

name="rpc_cov_sealed",
module_path=package_path,
isolated=True,
dependencies=[str(pyisolate_root)],

🧹 Nitpick | 🔵 Trivial

Inconsistent dependency format compared to other examples.

Line 77 uses str(pyisolate_root) (bare path), but example/sealed_worker/host.py Line 74 uses f"-e {pyisolate_root}" (editable install). This inconsistency could cause different installation behaviors.

If editable mode is intentional for the example but not required for tests, this is fine — but worth documenting or aligning.

♻️ Consider using consistent format
         config = ExtensionConfig(
             name="rpc_cov_sealed",
             module_path=package_path,
             isolated=True,
-            dependencies=[str(pyisolate_root)],
+            dependencies=[f"-e {pyisolate_root}"],
             apis=[],
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tests/integration_v2/test_rpc_coverage.py` at line 77, The dependency entry
uses str(pyisolate_root) which is inconsistent with the editable install format
used elsewhere (e.g., f"-e {pyisolate_root}"); update the dependencies list in
the test (the dependencies=[...] usage referencing pyisolate_root) to use the
same format as examples by replacing str(pyisolate_root) with f"-e
{pyisolate_root}" (or alternatively add a short comment explaining why a
non-editable install is intentionally used if you choose to keep
str(pyisolate_root)).

Comment on lines +90 to +117
wheel_index = "https://pollockjj.github.io/wheels/"
monkeypatch.setenv("UV_EXTRA_INDEX_URL", wheel_index)
print(f"UV_EXTRA_INDEX_URL={wheel_index}")
try:
    ext.ensure_process_started()
except RuntimeError as exc:
    if "bubblewrap" in str(exc).lower():
        pytest.skip(f"bwrap unavailable on this platform: {exc}")
    raise
# Verify pyisolate was installed in the child venv from the published index
if os.name == "nt":
    child_python = Path(ext.venv_path) / "Scripts" / "python.exe"
else:
    child_python = Path(ext.venv_path) / "bin" / "python3"
pip_show = subprocess.run(
    [str(child_python), "-m", "pip", "show", "pyisolate"],
    capture_output=True,
    text=True,
    check=False,
)
print(f"child venv pyisolate install:\n{pip_show.stdout}")
assert "pyisolate" in pip_show.stdout, "pyisolate not installed in child venv"
from importlib.metadata import version as _meta_version

expected_version = _meta_version("pyisolate")
assert expected_version in pip_show.stdout, (
    f"pyisolate version mismatch: expected {expected_version} in child venv"
)

🧹 Nitpick | 🔵 Trivial

Version verification logic looks good, but consider edge cases.

The verification that pyisolate is installed in the child venv from the published index is a sensible addition. However, the substring check on Line 115 (expected_version in pip_show.stdout) could match partial versions (e.g., 0.10.1 would match 0.10.10). This is unlikely to be an issue in practice, but a more precise check would be safer.

Also, the pip_show.returncode isn't explicitly verified before checking stdout — if pip fails entirely, you'd get a potentially confusing assertion failure rather than a clear "pip show failed" message.

🔍 Proposed refinement for more precise version matching
             pip_show = subprocess.run(
                 [str(child_python), "-m", "pip", "show", "pyisolate"],
                 capture_output=True,
                 text=True,
                 check=False,
             )
             print(f"child venv pyisolate install:\n{pip_show.stdout}")
+            assert pip_show.returncode == 0, f"pip show failed: {pip_show.stderr}"
             assert "pyisolate" in pip_show.stdout, "pyisolate not installed in child venv"
             from importlib.metadata import version as _meta_version

             expected_version = _meta_version("pyisolate")
-            assert expected_version in pip_show.stdout, (
-                f"pyisolate version mismatch: expected {expected_version} in child venv"
-            )
+            assert f"Version: {expected_version}" in pip_show.stdout, (
+                f"pyisolate version mismatch: expected {expected_version} in child venv"
+            )
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tests/test_uv_sealed_integration.py` around lines 90 - 117, The test should
first assert that the pip_show subprocess succeeded by checking
pip_show.returncode and, on failure, raise/pytest.fail with pip_show.stderr to
make failures explicit; then parse pip_show.stdout for the exact "Version:
<value>" line (e.g. find the line starting with "Version:" or use a regex to
capture the version token) and compare that captured version string for equality
with expected_version instead of using a substring containment check; update
references in the test to use pip_show, expected_version, and child_python when
locating the change.
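
The exact-match check the prompt describes can be sketched like this. The `sample_stdout` value is fabricated `pip show` output for illustration, chosen to demonstrate the substring pitfall:

```python
import re

# Parse the exact "Version:" line from `pip show` output, avoiding the
# substring trap where 0.10.1 matches inside 0.10.10.
sample_stdout = "Name: pyisolate\nVersion: 0.10.10\nSummary: ...\n"

def pip_show_version(stdout: str) -> str:
    match = re.search(r"^Version:\s*(\S+)\s*$", stdout, flags=re.MULTILINE)
    if match is None:
        raise AssertionError("no Version: line in pip show output")
    return match.group(1)

installed = pip_show_version(sample_stdout)
print(installed)                      # 0.10.10
print("0.10.1" in sample_stdout)     # True  -- the substring trap
print(installed == "0.10.1")         # False -- exact compare is safe
```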

@pollockjj pollockjj merged commit 1082e78 into Comfy-Org:main Apr 5, 2026
43 checks passed