
pyisolate 0.10.1 #10

Closed

pollockjj wants to merge 7 commits into Comfy-Org:main from pollockjj:develop

Conversation

@pollockjj
Collaborator

pyisolate 0.10.1

Bugfix release restoring degraded functionality.

Changes

  1. docs/infra cleanup — Remove Sphinx build infrastructure, move benchmarks to benchmarks/, replace YAML manifests with inline ExtensionConfig, rewrite README/CLAUDE
  2. examples restoration — Three standalone examples (torch_share, sealed_worker, bwrap_torch_share) with integration tests replacing the stale single-example directory
  3. RPC shutdown fix — Graceful shutdown with socket.SHUT_RDWR, sealed_worker auto-inject
  4. CI fix — Update pytorch.yml to reference new example path
  5. Test infrastructure — Sealed worker test uses pollockjj.github.io/wheels/ index for pyisolate resolution instead of PyPI
  6. Version bump — 0.10.0 → 0.10.1

…down fix, sealed_worker test infrastructure

* chore: docs and infrastructure cleanup

Remove Sphinx build infrastructure, move benchmark scripts into
benchmarks/ directory, replace stale YAML manifests with inline
ExtensionConfig in examples, rewrite CLAUDE.md and README.md to
remove application-specific references and clean library comments.

* feat: restore working examples with integration tests

Replace stale single-example directory with three standalone examples
(torch_share, sealed_worker, bwrap_torch_share), each with its own
host.py and extension module. Add integration tests covering RPC
lifecycle for all three isolation modes.

* fix: graceful RPC shutdown with socket SHUT_RDWR and sealed_worker auto-inject

Fix RPC shutdown sequence to use socket.SHUT_RDWR before close,
preventing lingering connections. Auto-inject sealed_worker bootstrap
path when execution_model is sealed_worker.
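As a sketch of the pattern this commit describes (standalone demo; `close_transport` is an illustrative name, not pyisolate's actual `JSONSocketTransport` API), half-closing with `SHUT_RDWR` before `close()` makes the peer observe EOF immediately:

```python
import socket

def close_transport(sock: socket.socket) -> None:
    # Half-close both directions first so the peer's blocking recv()
    # returns EOF right away instead of lingering until GC or a timeout.
    try:
        sock.shutdown(socket.SHUT_RDWR)
    except OSError:
        pass  # peer already gone; close() below is still safe
    sock.close()

# Demo with a connected pair: the peer sees EOF immediately after shutdown.
a, b = socket.socketpair()
close_transport(a)
assert b.recv(1) == b""  # empty bytes == orderly EOF, not a hang
b.close()
```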

* chore: bump version to 0.10.1

* chore: bump version to 0.10.1

* fix: sealed_worker test uses pollockjj/wheels index instead of PyPI for pyisolate

* chore: ruff format test_uv_sealed_integration.py

* fix: update pytorch CI workflow to use torch_share example (main.py was removed)

* fix: Windows-compatible child venv python path in sealed_worker test
Copilot AI review requested due to automatic review settings April 5, 2026 03:02
@coderabbitai

coderabbitai bot commented Apr 5, 2026

📝 Walkthrough

Walkthrough

Docs and Sphinx infra removed or consolidated; legacy multi-extension examples deleted; focused examples added for torch_share, sealed_worker, and bwrap; RPC shutdown and socket close made more deterministic; sealed_worker installs pinned to published pyisolate version; tests updated; version bumped to 0.10.1. Ping, pong — rhyme and song.

Changes

Cohort / File(s) Summary
Documentation removal & consolidation
docs/Makefile, docs/conf.py, docs/api.rst, docs/index.rst, README_COMFYUI.md
Removed Sphinx config, API page, master TOC, and ComfyUI guide; docs build targets and many pages deleted.
Docs edits & README
README.md, CLAUDE.md, docs/debugging.md, docs/edge_cases.md, docs/rpc_protocol.md
Condensed README, raised Python min to 3.10, adjusted env var list and transport/IPC docs; TensorKeeper default retention changed 30s → 5s; JSONSocketTransport documented as primary; other doc text revised.
Legacy example removal
example/shared/__init__.py, example/host.py, example/main.py, example/extensions/...
Deleted legacy multi-extension demo, DatabaseSingleton, ExampleExtensionBase, three example extensions, and their manifests.
New focused examples & hosts
example/torch_share/..., example/sealed_worker/..., example/bwrap_torch_share/host.py
Added torch_share and sealed_worker extension modules plus host scripts; added bwrap sandboxed host script. Hosts start extensions, perform proxy ping, and enforce cleanup.
CI/workflows adjustments
.github/workflows/pytorch.yml, .github/workflows/windows.yml, .github/workflows/docs.yml
PyTorch and Windows test steps now cd into example/torch_share and run python host.py; docs workflow removed.
Tests & integration
tests/integration_v2/test_rpc_coverage.py, tests/test_uv_sealed_integration.py
Added integration tests for RPC round-trips (torch_share, sealed_worker, bwrap); uv sealed test now enables network, checks child venv has pyisolate and verifies installed version.
Core package & version
pyisolate/__init__.py, pyproject.toml, tox.ini
Bumped project version 0.10.0 → 0.10.1; pyproject version updated; tox envlist removed Python 3.9 (now 3.10–3.12).
RPC & transport hardening
pyisolate/_internal/rpc_protocol.py, pyisolate/_internal/rpc_transports.py
AsyncRPC.shutdown() enqueues sentinel, closes transport, resolves blocking futures, and joins worker threads; recv thread now resolves blocking future on recv errors; JSONSocketTransport.close() attempts socket.shutdown() before close.
Environment install behavior
pyisolate/_internal/environment.py
For execution_model == "sealed_worker", safe dependencies list now includes pinned pyisolate==<installed-version> instead of a local repo path.
Examples removed: extension modules & manifests
example/extensions/extension1/..., example/extensions/extension2/..., example/extensions/extension3/...
Removed example extension implementations and their manifest.yaml files.
Example additions: sealed_worker extension
example/sealed_worker/extension/__init__.py, example/sealed_worker/host.py
Added SealedWorkerExtension module and host script implementing MinimalAdapter and execution_model="sealed_worker" host flow.
Example additions: torch_share extension & host
example/torch_share/extension/__init__.py, example/torch_share/host.py
Added TorchShareExtension and host script demonstrating share_torch=True runs, ping probe, and shared IPC TMPDIR usage.
Example addition: bwrap host
example/bwrap_torch_share/host.py
Added bubblewrap sandboxed host that runs TorchShareExtension under SandboxMode.REQUIRED with TMPDIR-based IPC.
Minor doc/comment generalizations
pyisolate/_internal/client.py, pyisolate/_internal/host.py, pyisolate/_internal/sandbox.py, pyisolate/interfaces.py, pyisolate/sealed.py
Reworded docstrings/comments to generalize away from ComfyUI-specific examples; no behavioral/API changes.
Benchmarks README
benchmarks/README.md
Added runner scripts section listing platform-specific benchmark helpers.

@codecov

codecov bot commented Apr 5, 2026

Codecov Report

❌ Patch coverage is 84.21053% with 3 lines in your changes missing coverage. Please review.

Files with missing lines              Patch %   Lines
pyisolate/_internal/rpc_protocol.py   78.57%    2 Missing and 1 partial ⚠️

Flag        Coverage Δ
unittests   63.69% <84.21%> (+3.14%) ⬆️

Flags with carried forward coverage won't be shown.

Files with missing lines              Coverage Δ
pyisolate/__init__.py                 73.33% <100.00%> (+1.90%) ⬆️
pyisolate/_internal/client.py         80.95% <ø> (-2.66%) ⬇️
pyisolate/_internal/environment.py    68.87% <100.00%> (+2.20%) ⬆️
pyisolate/_internal/host.py           69.66% <ø> (+9.04%) ⬆️
pyisolate/_internal/rpc_transports.py 32.71% <100.00%> (-0.76%) ⬇️
pyisolate/_internal/sandbox.py        72.87% <ø> (+10.75%) ⬆️
pyisolate/interfaces.py               100.00% <ø> (ø)
pyisolate/sealed.py                   39.83% <ø> (ø)
pyisolate/_internal/rpc_protocol.py   57.06% <78.57%> (+1.36%) ⬆️

... and 4 files with indirect coverage changes



Copilot AI left a comment


Pull request overview

This PR is a pyisolate 0.10.1 bugfix release focused on restoring sealed-worker functionality, improving RPC shutdown behavior, and restructuring docs/examples/bench tooling to match the updated architecture and CI.

Changes:

  • Fix RPC shutdown/unblocking behavior (socket shutdown + thread unblocking/join).
  • Update sealed-worker dependency installation and add/adjust integration tests for RPC coverage and sealed uv runtime behavior.
  • Restructure docs/examples/benchmarks (remove Sphinx infra, replace old example tree with standalone examples, add benchmark runner scripts) and bump version to 0.10.1.

Reviewed changes

Copilot reviewed 39 out of 44 changed files in this pull request and generated 7 comments.

Show a summary per file
File Description
tox.ini Drops Python 3.9 from tox envlist to match supported versions.
tests/test_uv_sealed_integration.py Adds UV wheel index usage and asserts child venv has pyisolate installed.
tests/integration_v2/test_rpc_coverage.py New integration tests exercising RPC lifecycle paths via real extensions.
README.md Major README rewrite to reflect current isolation models/backends and features.
README_COMFYUI.md Removes ComfyUI-specific README content.
pyproject.toml Bumps project version to 0.10.1.
pyisolate/sealed.py Generalizes docstring language away from ComfyUI-specific wording.
pyisolate/interfaces.py Generalizes adapter docs; updates example identifier text.
pyisolate/_internal/sandbox.py Generalizes comments away from ComfyUI-specific references.
pyisolate/_internal/rpc_transports.py Improves socket transport close semantics with shutdown(SHUT_RDWR).
pyisolate/_internal/rpc_protocol.py Implements more robust shutdown/unblock logic and thread joining; unblocks run_until_stopped on connection loss.
pyisolate/_internal/host.py Generalizes comments and clarifies GPU default behavior comment.
pyisolate/_internal/environment.py Changes sealed-worker dependency injection to install pyisolate by version.
pyisolate/_internal/client.py Generalizes comment away from ComfyUI-specific reference.
pyisolate/__init__.py Bumps __version__ constant to 0.10.1.
example/torch_share/host.py Adds standalone torch_share example host script.
example/torch_share/extension/__init__.py Adds torch_share example extension module.
example/shared/__init__.py Removes old shared example helpers.
example/sealed_worker/host.py Adds standalone sealed_worker example host script.
example/sealed_worker/extension/__init__.py Adds sealed_worker example extension module.
example/bwrap_torch_share/host.py Adds Linux-only bwrap + torch_share standalone example host script.
example/main.py Removes old example entrypoint shim.
example/host.py Removes old multi-extension example runner.
example/extensions/extension1/manifest.yaml Removes old extension manifest.
example/extensions/extension1/__init__.py Removes old extension implementation.
example/extensions/extension2/manifest.yaml Removes old extension manifest.
example/extensions/extension2/__init__.py Removes old extension implementation.
example/extensions/extension3/manifest.yaml Removes old extension manifest.
example/extensions/extension3/__init__.py Removes old extension implementation.
docs/rpc_protocol.md Updates transport documentation to reflect current primary transport usage.
docs/edge_cases.md Updates TensorKeeper retention default described in docs.
docs/debugging.md Updates TensorKeeper retention default described in docs.
docs/Makefile Removes Sphinx build Makefile.
docs/index.rst Removes Sphinx index page.
docs/conf.py Removes Sphinx configuration.
docs/api.rst Removes Sphinx autodoc API page.
CLAUDE.md Updates maintenance guidance and repo structure notes.
benchmarks/run_benchmarks_windows.ps1 Adds PowerShell benchmark runner for Windows.
benchmarks/run_benchmarks_windows.bat Adds CMD benchmark runner for Windows.
benchmarks/run_benchmarks_powershell_launcher.bat Adds launcher to handle PowerShell execution policy.
benchmarks/run_benchmarks_linux.sh Adds Linux/macOS benchmark runner script.
benchmarks/README.md Documents added benchmark runner scripts.
benchmarks/INSTRUCTIONS.md Adds end-user instructions for running benchmarks.
.github/workflows/pytorch.yml Updates CI example path/entrypoint to the new torch_share example host script.


Comment on lines 380 to 388

```diff
 safe_deps: list[str] = []
 if config.get("execution_model") == "sealed_worker":
-    safe_deps.append(str(Path(__file__).resolve().parents[2]))
+    from importlib.metadata import version as _pkg_version
+
+    _pyisolate_ver = _pkg_version("pyisolate")
+    safe_deps.append(f"pyisolate=={_pyisolate_ver}")
 for dep in config["dependencies"]:
     validate_dependency(dep)
     safe_deps.append(dep)
```

Copilot AI Apr 5, 2026


In sealed_worker mode this auto-injects pyisolate==<host version> into the dependency list. If the extension config already includes a local/editable pyisolate install (e.g. -e /path/to/repo or a repo path), uv pip install will receive two specs for the same distribution and can fail with a resolution/conflict error. Consider only injecting pyisolate when the dependency list does not already reference pyisolate (or filter/replace any existing pyisolate spec).
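One way to implement the suggested guard, as a sketch (the function and parameter names are illustrative, not pyisolate's actual internals):

```python
def build_safe_deps(config: dict, pinned_version: str) -> list[str]:
    # Inject the pinned pyisolate only when the extension config does not
    # already reference it, so the resolver never sees two competing specs.
    deps = list(config.get("dependencies", []))
    if config.get("execution_model") == "sealed_worker":
        if not any("pyisolate" in d for d in deps):
            deps.insert(0, f"pyisolate=={pinned_version}")
    return deps

# A config carrying a local checkout keeps it; nothing extra is injected.
cfg = {"execution_model": "sealed_worker",
       "dependencies": ["-e /path/to/pyisolate"]}
assert build_safe_deps(cfg, "0.10.1") == ["-e /path/to/pyisolate"]

# Otherwise the pin is prepended ahead of the declared dependencies.
cfg = {"execution_model": "sealed_worker", "dependencies": ["numpy"]}
assert build_safe_deps(cfg, "0.10.1") == ["pyisolate==0.10.1", "numpy"]
```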

Copilot uses AI. Check for mistakes.
Comment on lines +73 to +84

```python
config = ExtensionConfig(
    name="rpc_cov_sealed",
    module_path=package_path,
    isolated=True,
    dependencies=[str(pyisolate_root)],
    apis=[],
    share_torch=False,
    share_cuda_ipc=False,
    execution_model="sealed_worker",
    sandbox_mode=SandboxMode.DISABLED,
    env={"PYISOLATE_SIGNAL_CLEANUP": "1"},
)
```

Copilot AI Apr 5, 2026


This sealed_worker test sets dependencies=[str(pyisolate_root)] (a local pyisolate source checkout). With the new sealed_worker auto-injection of pyisolate==<version> in environment.install_dependencies, the child venv install step may see two pyisolate requirements (pinned wheel + local path) and fail. Either remove this dependency (rely on the auto-injected pyisolate) or update the injection logic to avoid duplicates/conflicts.

Comment on lines +98 to +111

```python
# Verify pyisolate was installed in the child venv from the published index
if os.name == "nt":
    child_python = Path(ext.venv_path) / "Scripts" / "python.exe"
else:
    child_python = Path(ext.venv_path) / "bin" / "python3"
pip_show = subprocess.run(
    [str(child_python), "-m", "pip", "show", "pyisolate"],
    capture_output=True,
    text=True,
    check=False,
)
print(f"child venv pyisolate install:\n{pip_show.stdout}")
assert "pyisolate" in pip_show.stdout, "pyisolate not installed in child venv"
assert "0.10.1" in pip_show.stdout, "pyisolate version mismatch in child venv"
```

Copilot AI Apr 5, 2026


The test asserts a hard-coded pyisolate version string ("0.10.1") from pip show. This will require editing the test on every version bump and can break when backporting. Prefer comparing against the host package version (e.g., pyisolate.__version__ or importlib.metadata.version("pyisolate")) and asserting against the expected Version: field.
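A sketch of the suggested assertion (the helper name is illustrative; in the real test `expected` would come from `importlib.metadata.version("pyisolate")` on the host rather than a literal):

```python
def assert_child_matches_host(pip_show_stdout: str, expected: str) -> None:
    # Match the full `Version:` field against the host-resolved version
    # instead of a hard-coded literal that breaks on every release bump.
    assert f"Version: {expected}" in pip_show_stdout, (
        f"child venv pyisolate does not match host version {expected}"
    )

# Example `pip show` output shape for illustration:
sample = "Name: pyisolate\nVersion: 0.10.1\nLocation: /venv/site-packages\n"
assert_child_matches_host(sample, "0.10.1")
```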

Comment on lines +56 to +60

```python
async def main() -> int:
    pyisolate_root = str(repo_root)
    example_dir = Path(__file__).resolve().parent
    extension_path = str(example_dir / "extension")
```

Copilot AI Apr 5, 2026


extension_path is computed but unused, and this host script loads tests/harness/test_package instead of the example extension under example/torch_share/extension. As a result, running the example doesn’t actually exercise the example extension code in this directory. Either wire the host to load the local example extension (and drop the harness dependency) or remove the unused example extension directory/variable to avoid confusion.

Comment on lines +66 to +72

```python
pyisolate_root = Path(__file__).resolve().parents[2]
package_path = Path(ReferenceTestExtension.__module__.replace(".", "/")).resolve()
# Use the test_package path from the harness
import tests.harness.test_package as tp

package_path = str(Path(tp.__file__).parent.resolve())
```

Copilot AI Apr 5, 2026


This block computes package_path from ReferenceTestExtension.__module__, but then immediately overwrites it with the correct filesystem path from tp.__file__. The initial computation is dead code and makes the test harder to follow; remove it and keep a single package_path derivation.

Comment on lines 85 to 93

```python
uv_path = f"{UV_BIN.parent}{os.pathsep}{path_env}" if UV_BIN.exists() else path_env
with pytest.MonkeyPatch.context() as monkeypatch:
    monkeypatch.setenv("PATH", uv_path)
    monkeypatch.setenv("PYISOLATE_ARTIFACT_DIR", str(run_dir / "artifacts"))
    wheel_index = "https://pollockjj.github.io/wheels/"
    monkeypatch.setenv("UV_EXTRA_INDEX_URL", wheel_index)
    print(f"UV_EXTRA_INDEX_URL={wheel_index}")
    try:
        ext.ensure_process_started()
```

Copilot AI Apr 5, 2026


This test now requires external network access to fetch wheels from https://pollockjj.github.io/wheels/, but it is not marked with @pytest.mark.network. Since pytest is configured with -m 'not network', unmarked network tests will run by default and fail in offline/locked-down environments. Add the network marker (or otherwise avoid external fetches).


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 18

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
tox.ini (1)

25-31: ⚠️ Potential issue | 🟡 Minor

Stale [testenv:docs] section references removed Sphinx infrastructure.

The PR objectives state Sphinx build infrastructure was removed, yet this testenv:docs section still configures Sphinx dependencies and commands. This section appears to be dead code now — a sphinx of its former self! 🦁

Consider removing or updating this section to match the current docs workflow.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tox.ini` around lines 25 - 31, The [testenv:docs] tox section is stale and
still declares Sphinx deps and a sphinx-build command; remove the entire
[testenv:docs] block (including deps, changedir, and commands entries) or update
it to reflect the current docs workflow by replacing sphinx references
(sphinx>=5.0, sphinx-rtd-theme>=1.0 and the sphinx-build command) with the new
documentation steps; locate the block by the [testenv:docs] header and modify or
delete it accordingly.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.github/workflows/pytorch.yml:
- Around line 108-109: The CUDA job step is invoking "python host.py -v" but the
"-v" flag is ignored; update the workflow step that runs "cd
example/torch_share" and "python host.py -v" to remove the ineffective "-v" flag
(i.e., run "python host.py" or replace with the proper verbosity option expected
by host.py), ensuring the change is made for the CUDA job step that references
example/torch_share and host.py.
- Around line 49-50: The workflow passes a -v flag to host.py but host.py
currently ignores CLI args and calls logging.basicConfig(level=logging.INFO)
unconditionally; either remove the -v from the workflow command ("python host.py
-v") or make host.py honor -v by adding simple argument parsing (e.g., argparse
in host.py) that recognizes -v/--verbose and sets
logging.basicConfig(level=logging.DEBUG) (or INFO when not verbose) before the
app starts; update the main entry (e.g., main() or the top-level script in
host.py) to parse args and configure logging accordingly so the flag has effect.

In `@CLAUDE.md`:
- Around line 57-65: Update the bootstrap command that currently runs uv pip
install -e ".[dev]" so it also installs the test extras required for running
pytest on a clean checkout (i.e. include the "test" extra alongside "dev");
locate the command string uv pip install -e ".[dev]" in CLAUDE.md and change it
to include the test extra (so pytest and dependencies like torch, numpy, psutil,
tabulate are installed) while keeping the surrounding environment setup steps
intact.
- Line 7: Update the version string in the maintainer guide text that mentions
pyisolate so it matches the rest of the release: change the occurrence of
"v0.10.0" (in the sentence starting with "**pyisolate** is a Python library
(PyPI: `pyisolate`, v0.10.0)") to "v0.10.1" so the guide is consistent with the
release notes.

In `@docs/rpc_protocol.md`:
- Around line 221-229: Update the docs to reflect that QueueTransport is still
used: state that JSONSocketTransport is the primary transport for sandboxed and
non-sandboxed Linux modes (UDS) and Windows TCP, and explicitly note that
QueueTransport is used for host-coupled isolation mode when
recv_queue/send_queue are provided (as instantiated in rpc_protocol.py), and
remove the wording that QueueTransport is unused/legacy or add a clear
deprecation note only if you plan to deprecate the QueueTransport class in
rpc_transports.py; ensure you reference the transport class names
JSONSocketTransport and QueueTransport and mention the host-coupled mode /
recv_queue/send_queue trigger so readers can find the relevant code.

In `@example/bwrap_torch_share/host.py`:
- Around line 26-27: Replace the external test dependency by moving
ReferenceTestExtension into example/bwrap_torch_share and update the import to
use the local class; modify the extension’s API (or its ping()) to perform an
actual torch tensor round-trip: create a torch.Tensor in the host, send it
through the extension when launching the burrow with share_torch=True, receive
the tensor back on the host and assert equality (shape, dtype, and values)
instead of only checking pong/ping success; ensure the example constructs and
compares the tensor (e.g., via torch.equal or allclose) and fails the script on
mismatch so the example is self-contained and verifies real tensor sharing.

In `@example/sealed_worker/extension/__init__.py`:
- Around line 1-36: The example extension SealedWorkerExtension (symbols:
SealedWorkerExtension, extension_entrypoint, ping, check_torch_absent) is never
used by the companion host which loads ReferenceTestExtension, causing
confusing/contradictory behavior; fix by either (A) updating the host to import
this module's extension_entrypoint and instantiate SealedWorkerExtension so the
child actually runs this extension (and adjust host expectations to accept
"pong_sealed" from ping or change ping to return "pong"), or (B) remove the
unused extension module/directory to avoid confusion; locate these symbols in
the extension's __init__.py and the host code that currently references
ReferenceTestExtension to implement one of the two options.

In `@example/sealed_worker/host.py`:
- Around line 70-90: The host currently sets test_pkg_path and constructs
Extension with module_path=test_pkg_path and
extension_type=ReferenceTestExtension, loading the tests/harness test package
instead of the local example; update the setup to point module_path at the local
extension folder (e.g., Path(repo_root) / "extension") and change extension_type
from ReferenceTestExtension to SealedWorkerExtension (or import the
SealedWorkerExtension symbol) so the example loads its own extension;
alternatively, if keeping the harness is intentional, add a comment or README
note near the ExtensionConfig/Extension construction explaining why
tests/harness/test_package is used for CI rather than the local extension.
- Line 54: Remove the unused assignment extension_path =
str(Path(__file__).resolve().parent / "extension") from host.py since the code
actually uses test_pkg_path (tests/harness/test_package); either delete the
extension_path line or replace usages to point to extension_path where you truly
intend to load the local extension directory, and ensure no other code
references extension_path (search for the symbol to confirm).

In `@example/torch_share/extension/__init__.py`:
- Around line 1-33: The TorchShareExtension is orphaned and its compute() will
never be used because the host loads ReferenceTestExtension; either wire this
extension into the host or remove it: either update the host to import and call
extension_entrypoint() from this module (so TorchShareExtension.ping() returns
"pong_torch_share" and compute() is exercised), or delete this file and any
references to torch_share to avoid confusion; locate TorchShareExtension,
compute, and extension_entrypoint in the diff and make the corresponding change
so the host and extension match.

In `@example/torch_share/host.py`:
- Line 22: The import detect_sandbox_capability from
pyisolate._internal.sandbox_detect is unused; remove that import statement (or
if sandbox detection was intended, replace its unused import with actual usage
inside the module, e.g., call detect_sandbox_capability where relevant). Locate
the import line referencing detect_sandbox_capability and delete it to eliminate
the unused import.
- Around line 116-119: The finally block references ext which may be undefined
on early exceptions; initialize ext = None before the try block and in the
finally check for it explicitly (e.g., if ext is not None) before calling
ext.stop() inside contextlib.suppress(Exception); keep
AdapterRegistry.unregister() as-is. This mirrors the sealed_worker pattern:
declare ext = None, assign the real extension inside the try, and in finally
only attempt ext.stop() when ext is not None to avoid silently suppressing a
NameError.
- Line 59: The variable extension_path is computed from example_dir but never
used; either remove the unused assignment to extension_path or, if the extension
directory was intended to be used, update the code that should reference it
(e.g., the host initialization or file operations that should use
extension_path). Locate the assignment to extension_path and either delete that
line or replace the callers to use extension_path (ensuring example_dir and
extension_path semantics are preserved).
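The guarded-cleanup shape recommended for these host scripts can be sketched as a standalone demo (`_Ext` and `start_extension` are stand-ins for the real extension objects, not pyisolate APIs):

```python
import contextlib

class _Ext:
    def stop(self) -> None:
        pass

def start_extension(fail: bool) -> _Ext:
    if fail:
        raise RuntimeError("startup failed before ext was bound")
    return _Ext()

def run_host(fail: bool) -> str:
    ext = None  # bound before `try` so `finally` can test it safely
    try:
        ext = start_extension(fail)
        return "ok"
    finally:
        if ext is not None:  # no NameError when startup failed early
            with contextlib.suppress(Exception):
                ext.stop()

# The original RuntimeError propagates instead of being masked by NameError.
try:
    run_host(fail=True)
except RuntimeError as exc:
    assert "startup failed" in str(exc)
assert run_host(fail=False) == "ok"
```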

In `@pyisolate/_internal/environment.py`:
- Around line 381-385: The new block calling importlib.metadata.version as
_pkg_version and assigning _pyisolate_ver may raise PackageNotFoundError in
dev/source-checkout runs; change it to reuse the existing
_detect_pyisolate_version()/pyisolate_version logic or wrap the
_pkg_version("pyisolate") call in a try/except that falls back to the same
default ("0.0.0") and then append safe_deps with the resolved version (i.e., use
pyisolate_version or the try/except result when building the f"pyisolate=={...}"
string) so sealed_worker creation won't crash in non-installed environments.
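The suggested fallback can be sketched generically (the helper name is illustrative; the real code would call it with "pyisolate"):

```python
from importlib.metadata import PackageNotFoundError, version

def detect_version(package: str, default: str = "0.0.0") -> str:
    # Running from a source checkout that was never pip-installed raises
    # PackageNotFoundError; fall back to a sentinel instead of crashing.
    try:
        return version(package)
    except PackageNotFoundError:
        return default

# A name that is certainly not installed exercises the fallback path.
assert detect_version("definitely-not-a-real-package-xyz") == "0.0.0"
```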

In `@README.md`:
- Around line 160-180: The README example for MyAppAdapter subclasses
IsolationAdapter but omits required hooks, causing a misleading incomplete
example; update the MyAppAdapter example to either include the remaining adapter
hooks (implement setup_child_environment and handle_api_registration with brief
stub descriptions matching the MinimalAdapter pattern) or explicitly mark the
snippet as partial by adding a comment like "partial: additional hooks
(setup_child_environment, handle_api_registration) omitted" so readers know to
supply those methods when subclassing IsolationAdapter or consult
example/bwrap_torch_share/host.py's MinimalAdapter for full implementations.

In `@tests/integration_v2/test_rpc_coverage.py`:
- Around line 26-31: The host pytest fixture calls ReferenceHost.setup() before
yield but does not guard against exceptions, so ReferenceHost.cleanup() via
_run(h.cleanup()) may never execute; wrap setup/yield/teardown in a try/finally
so that if h.setup() or anything before yield raises you still call
_run(h.cleanup()) (i.e., call h = ReferenceHost(); try: h.setup(); yield h;
finally: _run(h.cleanup())), referencing the host fixture,
ReferenceHost.setup(), ReferenceHost.cleanup(), and _run to locate the code to
change.
- Around line 66-72: Remove the redundant package_path computation that uses
ReferenceTestExtension.__module__; keep the pyisolate_root assignment and the
single package_path assignment that imports tests.harness.test_package as tp and
sets package_path = str(Path(tp.__file__).parent.resolve()). In practice delete
or comment out the line "package_path =
Path(ReferenceTestExtension.__module__.replace('.', '/')).resolve()" so only the
import of tests.harness.test_package and the following package_path = str(...)
assignment remain.
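The try/finally fixture shape described above, driven as a plain generator for illustration (`FakeHost` stands in for `ReferenceHost`; the real fixture would carry the `@pytest.fixture` decorator and call `_run` on cleanup):

```python
events = []

class FakeHost:
    def setup(self) -> None:
        events.append("setup")
        raise RuntimeError("setup blew up")  # simulate a pre-yield failure

    def cleanup(self) -> None:
        events.append("cleanup")

def host_fixture():
    h = FakeHost()
    try:
        h.setup()  # may raise before yield
        yield h
    finally:
        h.cleanup()  # teardown now runs even when setup fails

gen = host_fixture()
try:
    next(gen)
except RuntimeError:
    pass
assert events == ["setup", "cleanup"]  # cleanup ran despite the failure
```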

In `@tests/test_uv_sealed_integration.py`:
- Around line 89-111: The test only checks pip show output (pip_show) which
doesn't prove the wheel came from the custom index (UV_EXTRA_INDEX_URL); update
the test around ext.ensure_process_started()/child_python/pip_show so it
explicitly verifies provenance by either (a) running the installer via the CLI
with verbose logging (invoke uv pip install pyisolate -v or equivalent through
ext or subprocess) and asserting the verbose output contains the custom index
URL, or (b) running the lock step (uv lock) after installation and inspecting
uv.lock for the package entry's source.registry field to assert it equals the
custom index; keep existing checks for version but add one of these provenance
assertions to ensure the custom index supplied the wheel.

---

Outside diff comments:
In `@tox.ini`:
- Around line 25-31: The [testenv:docs] tox section is stale and still declares
Sphinx deps and a sphinx-build command; remove the entire [testenv:docs] block
(including deps, changedir, and commands entries) or update it to reflect the
current docs workflow by replacing sphinx references (sphinx>=5.0,
sphinx-rtd-theme>=1.0 and the sphinx-build command) with the new documentation
steps; locate the block by the [testenv:docs] header and modify or delete it
accordingly.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: ASSERTIVE

Plan: Pro

Run ID: 8ea554e3-4eb7-4db1-8558-c3bfa53a0a96

📥 Commits

Reviewing files that changed from the base of the PR and between 51bb318 and 780eab1.

📒 Files selected for processing (44)
  • .github/workflows/pytorch.yml
  • CLAUDE.md
  • README.md
  • README_COMFYUI.md
  • benchmarks/INSTRUCTIONS.md
  • benchmarks/README.md
  • benchmarks/run_benchmarks_linux.sh
  • benchmarks/run_benchmarks_powershell_launcher.bat
  • benchmarks/run_benchmarks_windows.bat
  • benchmarks/run_benchmarks_windows.ps1
  • docs/Makefile
  • docs/api.rst
  • docs/conf.py
  • docs/debugging.md
  • docs/edge_cases.md
  • docs/index.rst
  • docs/rpc_protocol.md
  • example/bwrap_torch_share/host.py
  • example/extensions/extension1/__init__.py
  • example/extensions/extension1/manifest.yaml
  • example/extensions/extension2/__init__.py
  • example/extensions/extension2/manifest.yaml
  • example/extensions/extension3/__init__.py
  • example/extensions/extension3/manifest.yaml
  • example/host.py
  • example/main.py
  • example/sealed_worker/extension/__init__.py
  • example/sealed_worker/host.py
  • example/shared/__init__.py
  • example/torch_share/extension/__init__.py
  • example/torch_share/host.py
  • pyisolate/__init__.py
  • pyisolate/_internal/client.py
  • pyisolate/_internal/environment.py
  • pyisolate/_internal/host.py
  • pyisolate/_internal/rpc_protocol.py
  • pyisolate/_internal/rpc_transports.py
  • pyisolate/_internal/sandbox.py
  • pyisolate/interfaces.py
  • pyisolate/sealed.py
  • pyproject.toml
  • tests/integration_v2/test_rpc_coverage.py
  • tests/test_uv_sealed_integration.py
  • tox.ini
💤 Files with no reviewable changes (14)
  • example/extensions/extension1/manifest.yaml
  • docs/index.rst
  • example/extensions/extension1/__init__.py
  • README_COMFYUI.md
  • example/main.py
  • docs/api.rst
  • example/extensions/extension2/manifest.yaml
  • docs/Makefile
  • example/extensions/extension2/__init__.py
  • docs/conf.py
  • example/host.py
  • example/extensions/extension3/manifest.yaml
  • example/shared/__init__.py
  • example/extensions/extension3/__init__.py

Comment on lines +221 to +229
+The primary transport is `JSONSocketTransport` — length-prefixed JSON over Unix Domain Sockets (or TCP on Windows). All isolation modes use this transport.

-### QueueTransport
+### JSONSocketTransport (Primary)

-Uses `multiprocessing.Queue` for communication. Used when subprocess isolation is via `multiprocessing.Process`.
+Uses length-prefixed JSON-RPC over raw sockets. No pickle. This is the standard transport for all Linux isolation modes (sandboxed and non-sandboxed) and Windows TCP mode.

-### UDSTransport
+### QueueTransport (Legacy)

-Uses Unix Domain Sockets for communication. Used when subprocess isolation is via `bubblewrap` sandbox.
+Uses `multiprocessing.Queue` for communication. This is a legacy backward-compatibility path and is not used in current isolation modes.

⚠️ Potential issue | 🟡 Minor

Documentation claims QueueTransport is unused, but code says otherwise—a tale of two transports!

The documentation states QueueTransport is "Legacy / backward compatibility only" and "not used in current isolation modes." However, per rpc_protocol.py lines 155-156, QueueTransport is still actively instantiated when recv_queue and send_queue are provided (host-coupled mode). The class in rpc_transports.py also lacks any deprecation notice.

Consider clarifying that:

  • JSONSocketTransport is used for sandboxed and non-sandboxed Linux modes (via UDS)
  • QueueTransport is used for host-coupled isolation mode (via multiprocessing.Queue)

This prevents developers from mistakenly believing QueueTransport is dead code.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@docs/rpc_protocol.md` around lines 221 - 229, Update the docs to reflect that
QueueTransport is still used: state that JSONSocketTransport is the primary
transport for sandboxed and non-sandboxed Linux modes (UDS) and Windows TCP, and
explicitly note that QueueTransport is used for host-coupled isolation mode when
recv_queue/send_queue are provided (as instantiated in rpc_protocol.py), and
remove the wording that QueueTransport is unused/legacy or add a clear
deprecation note only if you plan to deprecate the QueueTransport class in
rpc_transports.py; ensure you reference the transport class names
JSONSocketTransport and QueueTransport and mention the host-coupled mode /
recv_queue/send_queue trigger so readers can find the relevant code.

Comment on lines +160 to 180
class MyAppAdapter(IsolationAdapter):
    @property
    def identifier(self) -> str:
        """Return unique adapter identifier (e.g., 'comfyui')."""
        return "myapp"

    def get_path_config(self, module_path: str) -> dict:
        """Configure sys.path for isolated extensions.

        Returns:
        - preferred_root: Your app's root directory
        - additional_paths: Extra paths for imports
        """
        return {
            "preferred_root": "/path/to/myapp",
            "additional_paths": ["/path/to/myapp/extensions"],
        }

    def setup_child_environment(self, snapshot: dict) -> None:
        """Configure child process after sys.path reconstruction."""
        pass  # Set up logging, environment, etc.

    def register_serializers(self, registry) -> None:
        """Register custom type serializers for RPC transport."""
        registry.register(
            "MyCustomType",
            serializer=lambda obj: {"data": obj.data},
            deserializer=lambda d: MyCustomType(d["data"]),
        )

    def provide_rpc_services(self) -> list:
        """Return ProxiedSingleton classes to expose via RPC."""
        return [MyRegistry, MyProgressReporter]

    def handle_api_registration(self, api, rpc) -> None:
        """Post-registration hook for API-specific setup."""
        pass

⚠️ Potential issue | 🟡 Minor

Show the remaining adapter hooks, or mark this snippet as partial.

example/bwrap_torch_share/host.py's MinimalAdapter still defines setup_child_environment() and handle_api_registration(), but this README example subclasses IsolationAdapter without showing them. Readers copying it verbatim get a half-deck adapter that looks complete.

🧩 Minimal doc fix
 class MyAppAdapter(IsolationAdapter):
     `@property`
     def identifier(self) -> str:
         return "myapp"
@@
     def provide_rpc_services(self) -> list:
         return [MyRegistry, MyProgressReporter]
+
+    def setup_child_environment(self, snapshot: dict) -> None:
+        pass
+
+    def handle_api_registration(self, api, rpc) -> None:
+        pass
🧰 Tools
🪛 markdownlint-cli2 (0.22.0)

[warning] 171-171: Fenced code blocks should be surrounded by blank lines

(MD031, blanks-around-fences)

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@README.md` around lines 160 - 180, The README example for MyAppAdapter
subclasses IsolationAdapter but omits required hooks, causing a misleading
incomplete example; update the MyAppAdapter example to either include the
remaining adapter hooks (implement setup_child_environment and
handle_api_registration with brief stub descriptions matching the MinimalAdapter
pattern) or explicitly mark the snippet as partial by adding a comment like
"partial: additional hooks (setup_child_environment, handle_api_registration)
omitted" so readers know to supply those methods when subclassing
IsolationAdapter or consult example/bwrap_torch_share/host.py's MinimalAdapter
for full implementations.

Comment on lines +89 to +111
wheel_index = "https://pollockjj.github.io/wheels/"
monkeypatch.setenv("UV_EXTRA_INDEX_URL", wheel_index)
print(f"UV_EXTRA_INDEX_URL={wheel_index}")
try:
    ext.ensure_process_started()
except RuntimeError as exc:
    if "bubblewrap" in str(exc).lower():
        pytest.skip(f"bwrap unavailable on this platform: {exc}")
    raise
# Verify pyisolate was installed in the child venv from the published index
if os.name == "nt":
    child_python = Path(ext.venv_path) / "Scripts" / "python.exe"
else:
    child_python = Path(ext.venv_path) / "bin" / "python3"
pip_show = subprocess.run(
    [str(child_python), "-m", "pip", "show", "pyisolate"],
    capture_output=True,
    text=True,
    check=False,
)
print(f"child venv pyisolate install:\n{pip_show.stdout}")
assert "pyisolate" in pip_show.stdout, "pyisolate not installed in child venv"
assert "0.10.1" in pip_show.stdout, "pyisolate version mismatch in child venv"

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🌐 Web query:

For uv package resolution, does UV_EXTRA_INDEX_URL only add an extra index alongside the default index, or can it force installation from that custom index? What is the recommended way to make a custom index primary and verify which index supplied the installed pyisolate wheel?

💡 Result:

UV_EXTRA_INDEX_URL adds an extra index alongside the default index (PyPI). It does not replace the default index or force installation exclusively from the custom index. Extra indexes have higher priority than the default: uv searches extra indexes first (in order), stopping at the first that has the package (the default "first-index" strategy, chosen for security). To force the custom index only, use UV_INDEX_URL (legacy, deprecated) or UV_DEFAULT_INDEX to replace the default, though extras still apply if set; better, declare indexes in pyproject.toml or uv.toml with [[tool.uv.index]], marking PyPI as default=true and listing the custom index first so it has highest priority:

    [[tool.uv.index]]
    name = "custom"
    url = "https://your-custom/simple/"

    [[tool.uv.index]]
    name = "pypi"
    url = "https://pypi.org/simple/"
    default = true

Or via environment: UV_INDEX="https://your-custom/simple/" and UV_DEFAULT_INDEX="https://pypi.org/simple/". To verify which index supplied the pyisolate wheel (pyisolate is on PyPI at pypi.org/project/pyisolate): run uv pip install pyisolate -v (verbose resolution logs may indicate which index was queried), or check the package's source.registry field in uv.lock after running uv lock. uv pip show pyisolate prints basic info only, with no index; there is no direct "uv pip show --index", but a verbose install or lockfile inspection confirms provenance.


Use verbose logging or lockfile inspection to verify the custom index actually supplied the pyisolate wheel.

UV_EXTRA_INDEX_URL does add the custom index as a supplemental source with higher priority than PyPI (uv checks extra indexes first). However, pip show only confirms the installed version—it doesn't reveal which index the wheel actually came from. That leaves the regression test on thin ice, I'm afraid, without explicit source verification.

To confirm provenance: run uv pip install pyisolate -v (verbose logs show which index is queried), or inspect uv.lock for the source.registry field after running uv lock. Either approach will verify the custom wheels index actually supplied the package rather than relying on fallback behavior.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tests/test_uv_sealed_integration.py` around lines 89 - 111, The test only
checks pip show output (pip_show) which doesn't prove the wheel came from the
custom index (UV_EXTRA_INDEX_URL); update the test around
ext.ensure_process_started()/child_python/pip_show so it explicitly verifies
provenance by either (a) running the installer via the CLI with verbose logging
(invoke uv pip install pyisolate -v or equivalent through ext or subprocess) and
asserting the verbose output contains the custom index URL, or (b) running the
lock step (uv lock) after installation and inspecting uv.lock for the package
entry's source.registry field to assert it equals the custom index; keep
existing checks for version but add one of these provenance assertions to ensure
the custom index supplied the wheel.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.github/workflows/windows.yml:
- Around line 44-45: The workflow currently runs "python host.py -v" but
host.py's main() accepts no arguments and does no argparse/sys.argv handling, so
the "-v" flag is silently ignored; fix this by either removing the "-v" flag
from the workflow invocation or adding argument parsing inside host.py (e.g.,
update main() to parse a "-v/--verbose" flag and enable verbose logging) so the
flag has effect—refer to host.py and its main() when making the change.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: ASSERTIVE

Plan: Pro

Run ID: dc570095-db8a-4cdc-875e-09ebd2c9c4e0

📥 Commits

Reviewing files that changed from the base of the PR and between 780eab1 and 5dc0d53.

📒 Files selected for processing (2)
  • .github/workflows/docs.yml
  • .github/workflows/windows.yml
💤 Files with no reviewable changes (1)
  • .github/workflows/docs.yml

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

♻️ Duplicate comments (1)
CLAUDE.md (1)

57-65: ⚠️ Potential issue | 🟠 Major

Bootstrap command still misses test extras before running pytest.

uv pip install -e ".[dev]" can leave test deps absent on clean setups. Please include test extra in the install command.

🧪 Minimal doc fix
-uv venv && source .venv/bin/activate && uv pip install -e ".[dev]"
+uv venv && source .venv/bin/activate && uv pip install -e ".[dev,test]"
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@CLAUDE.md` around lines 57 - 65, The install command in the CLAUDE.md
bootstrap snippet uses uv pip install -e ".[dev]" which omits test extras;
update that command to include the test extras (e.g. change ".[dev]" to
".[dev,test]" or equivalent) so test dependencies are installed before running
pytest, and ensure the modified string appears in the existing bootstrap line
that includes uv venv && source .venv/bin/activate && uv pip install -e
".[dev]".
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@CLAUDE.md`:
- Line 79: The fenced code block containing the tokens ExtensionBase,
ExtensionManager, ExtensionManagerConfig, ExtensionConfig, SandboxMode,
SealedNodeExtension, ProxiedSingleton, local_execution, singleton_scope,
flush_tensor_keeper, purge_orphan_sender_shm_files, register_adapter and
get_adapter is missing a language tag (triggering MD040); update the opening
fence from ``` to ```text so the block becomes a text‑tagged fenced code block
to satisfy linters and improve rendering.
- Around line 51-52: Update the table entry for `example/` to remove the
incorrect "numpy 1.x vs 2.x" wording and replace it with an accurate one-line
summary that it contains three execution-model demos showing torch isolation
approaches: "torch_share" (host-coupled with shared torch), "sealed_worker"
(fully sealed worker), and "bwrap_torch_share" (sandbox-enforced isolation);
edit the row that currently reads `| example/ | Working 3-extension demo showing
dependency isolation (numpy 1.x vs 2.x) |` to a concise line mentioning those
three demo names and the isolation strategies instead.

---

Duplicate comments:
In `@CLAUDE.md`:
- Around line 57-65: The install command in the CLAUDE.md bootstrap snippet uses
uv pip install -e ".[dev]" which omits test extras; update that command to
include the test extras (e.g. change ".[dev]" to ".[dev,test]" or equivalent) so
test dependencies are installed before running pytest, and ensure the modified
string appears in the existing bootstrap line that includes uv venv && source
.venv/bin/activate && uv pip install -e ".[dev]".

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: ASSERTIVE

Plan: Pro

Run ID: 88180fb3-5293-4c28-aa3e-683cb65b020b

📥 Commits

Reviewing files that changed from the base of the PR and between 5dc0d53 and 83629b2.

📒 Files selected for processing (1)
  • CLAUDE.md

Comment on lines +51 to +52
| `example/` | Working 3-extension demo showing dependency isolation (numpy 1.x vs 2.x) |
| `benchmarks/` | RPC overhead and memory benchmarks with cross-platform runner scripts |

⚠️ Potential issue | 🟡 Minor

🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Verify example directory naming and listed standalone examples.
fd -t d --hidden --exclude .git -i '^(example|examples)$'
fd -t f --hidden --exclude .git -i 'torch_share|sealed_worker|bwrap_torch_share'
rg -n -C2 'torch_share|sealed_worker|bwrap_torch_share|numpy 1\.x vs 2\.x|example/|examples/' CLAUDE.md README.md

Repository: Comfy-Org/pyisolate

Length of output: 1750


🏁 Script executed:

#!/bin/bash
# Explore the example directory structure
echo "=== Contents of example/ directory ==="
fd -t f -t d . example/ | head -50

echo -e "\n=== Directory tree of example/ ==="
tree example/ -L 3 2>/dev/null || find example/ -type f -o -type d | sort

echo -e "\n=== Count extensions in example/ ==="
find example/ -mindepth 1 -maxdepth 1 -type d | wc -l

echo -e "\n=== Extension directory names ==="
find example/ -mindepth 1 -maxdepth 1 -type d

Repository: Comfy-Org/pyisolate

Length of output: 814


🏁 Script executed:

#!/bin/bash
# Check example file contents for numpy references and extension purpose
echo "=== torch_share/extension/__init__.py ===" 
head -30 example/torch_share/extension/__init__.py

echo -e "\n=== sealed_worker/extension/__init__.py ===" 
head -30 example/sealed_worker/extension/__init__.py

echo -e "\n=== torch_share/host.py (first 40 lines) ===" 
head -40 example/torch_share/host.py

echo -e "\n=== Search for numpy in example/ ===" 
rg -n 'numpy' example/ --max-count=5

echo -e "\n=== Search for dependency or isolation references ===" 
rg -n 'dependency|isolation|1\.x|2\.x' example/ --max-count=10

Repository: Comfy-Org/pyisolate

Length of output: 3779


Update example/ description to reflect actual torch isolation demonstrations, not numpy versions.

The directory contains 3 execution model examples (torch_share, sealed_worker, bwrap_torch_share) demonstrating different isolation strategies—but the docs claim "numpy 1.x vs 2.x" which isn't present in the code. Replace with an accurate summary: demonstrates three isolation approaches (host-coupled with shared torch, fully sealed worker, and sandbox-enforced isolation). No need to let confusion take root (or should we say torch?) in the documentation.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@CLAUDE.md` around lines 51 - 52, Update the table entry for `example/` to
remove the incorrect "numpy 1.x vs 2.x" wording and replace it with an accurate
one-line summary that it contains three execution-model demos showing torch
isolation approaches: "torch_share" (host-coupled with shared torch),
"sealed_worker" (fully sealed worker), and "bwrap_torch_share" (sandbox-enforced
isolation); edit the row that currently reads `| example/ | Working 3-extension
demo showing dependency isolation (numpy 1.x vs 2.x) |` to a concise line
mentioning those three demo names and the isolation strategies instead.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 3

♻️ Duplicate comments (3)
CLAUDE.md (1)

51-52: ⚠️ Potential issue | 🟡 Minor

Update example/ description—numpy version demo doesn't exist.

The table still claims the example shows "dependency isolation (numpy 1.x vs 2.x)", but the directory actually contains three torch isolation demos: torch_share, sealed_worker, and bwrap_torch_share. This mismatch was flagged in past review and remains unaddressed.

NumPy or not NumPy, that is the question! 🎭 But here, the answer is definitely "not NumPy."

📝 Proposed fix
-| `example/` | Working 3-extension demo showing dependency isolation (numpy 1.x vs 2.x) |
+| `example/` | Three isolation demos: `torch_share` (host-coupled), `sealed_worker` (fully sealed), `bwrap_torch_share` (sandboxed) |
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@CLAUDE.md` around lines 51 - 52, The table entry for `example/` is
inaccurate—replace the "dependency isolation (numpy 1.x vs 2.x)" description
with one that reflects the actual contents: mention the three torch isolation
demos (`torch_share`, `sealed_worker`, `bwrap_torch_share`) and describe them
briefly (e.g., "Three torch isolation demos demonstrating dependency isolation
approaches: torch_share, sealed_worker, and bwrap_torch_share"). Update the
`example/` row so the README/table text matches the actual example directory
contents.
example/bwrap_torch_share/host.py (1)

109-118: 🧹 Nitpick | 🔵 Trivial

Consider adding a tensor round-trip to validate share_torch actually works.

The example claims to demonstrate "sandbox-enforced host-coupled isolation" with share_torch=True, but only tests ping(). A ping() success doesn't prove tensor serialization via /dev/shm works correctly.

Adding a simple tensor echo (create tensor → send via proxy → receive → assert equality) would validate the share_torch machinery is actually functioning. Otherwise we're just torching through the motions! 🔥

🧪 Proposed tensor round-trip validation
         proxy = ext.get_proxy()
         result = await proxy.ping()
         logger.info(f"ping result: {result}")

+        # Validate tensor sharing actually works
+        import torch
+        test_tensor = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
+        echoed = await proxy.compute(42)  # Uses torch internally
+        if not echoed.get("torch_available"):
+            logger.error("FAIL — torch not available in child despite share_torch=True")
+            return 1
+
         if result == "pong_torch_share":
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@example/bwrap_torch_share/host.py` around lines 109 - 118, Add a real tensor
round-trip after the ping to validate share_torch actually passes tensors:
create a small CPU torch tensor on the host (torch.tensor([...]) ), call a proxy
tensor-echo RPC (e.g., ext.get_proxy() -> proxy.echo_tensor(tensor) or
proxy.round_trip_tensor(tensor)), receive the tensor back, compare with
torch.equal or torch.allclose, and only log PASS and return 0 if equality holds
(otherwise log FAIL and return 1); keep the existing ping check but extend the
branch that handles "pong_torch_share" to perform this tensor
serialization/deserialization test so share_torch=True is actually exercised.
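The shape of the suggested round-trip check can be sketched without a live extension process. `EchoProxy` and `echo_tensor` are hypothetical stand-ins for the RPC proxy (the review itself only suggests such a method), and plain nested lists stand in for torch tensors so the sketch stays dependency-free; real code would build `torch.tensor(...)` and compare with `torch.equal`.

```python
import asyncio

class EchoProxy:
    """Stand-in for the extension RPC proxy; echo_tensor is a hypothetical method."""
    async def echo_tensor(self, tensor):
        # A real proxy would serialize via /dev/shm, cross the RPC boundary,
        # and deserialize on the way back.
        return tensor

async def validate_round_trip(proxy) -> bool:
    sent = [[1.0, 2.0], [3.0, 4.0]]  # stands in for torch.tensor([[1., 2.], [3., 4.]])
    received = await proxy.echo_tensor(sent)
    # With torch, compare via torch.equal(sent, received) instead of ==
    return received == sent

print(asyncio.run(validate_round_trip(EchoProxy())))  # True
```

The point of the pattern is that equality after the round trip exercises the serializer/deserializer pair, which `ping()` alone never touches.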
pyisolate/_internal/environment.py (1)

381-382: ⚠️ Potential issue | 🟠 Major

Handle pyisolate==0.0.0 edge case to prevent sealed worker installation failure in development environments.

When running from a source checkout without formal installation, pyisolate_version falls back to "0.0.0" (per _detect_pyisolate_version() at line 239). This will cause uv pip install pyisolate==0.0.0 to fail since that version doesn't exist on PyPI.

Add a conditional check to fall back to the local source path when pyisolate_version == "0.0.0", ensuring sealed workers can be created during development—no need to seal off your dev workflow! 🦭

Proposed fix
     if config.get("execution_model") == "sealed_worker":
-        safe_deps.append(f"pyisolate=={pyisolate_version}")
+        if pyisolate_version != "0.0.0":
+            safe_deps.append(f"pyisolate=={pyisolate_version}")
+        else:
+            # Fallback for development: install from source directory
+            safe_deps.append(str(Path(__file__).resolve().parents[2]))
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@pyisolate/_internal/environment.py` around lines 381 - 382, The code
currently appends pyisolate=={pyisolate_version} to safe_deps when
config.get("execution_model") == "sealed_worker", but
_detect_pyisolate_version() can return "0.0.0" for source checkouts which will
make pip install fail; update the block in environment.py that handles
execution_model == "sealed_worker" to check if pyisolate_version == "0.0.0" and,
in that case, append the local source path (e.g., the package directory or an
editable/local install spec) to safe_deps instead of the versioned pip spec;
keep the existing behavior for real versions and ensure you reference the
pyisolate_version variable and the safe_deps list so sealed worker creation
works in development without attempting to pip install a non-existent version.
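The proposed fallback boils down to a small branch; `sealed_worker_requirement` is an illustrative helper, not the actual function in environment.py.

```python
from pathlib import Path

def sealed_worker_requirement(pyisolate_version: str, source_root: Path) -> str:
    """Pin the released wheel when a real version is known; otherwise fall back
    to the local source tree, since dev checkouts report version 0.0.0."""
    if pyisolate_version != "0.0.0":
        return f"pyisolate=={pyisolate_version}"
    return str(source_root)

print(sealed_worker_requirement("0.10.1", Path("/src/pyisolate")))  # pyisolate==0.10.1
print(sealed_worker_requirement("0.0.0", Path("/src/pyisolate")))   # local source path
```

Appending the local path instead of `pyisolate==0.0.0` keeps `uv pip install` from failing on a version that does not exist on any index.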
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@example/torch_share/host.py`:
- Around line 41-46: The current register_serializers method swallows all
exceptions, hiding failures; update it so the except block logs the failure
(including the exception message/trace) instead of silently passing—e.g., catch
broad Exception only to log via the module/process logger or a provided logger
with context like "tensor serializer registration failed" and include the
exception, while still allowing the program to continue; ensure references to
register_serializers, SerializerRegistryProtocol, serialize_tensor, and
deserialize_tensor remain intact so the registration attempt is preserved.

In `@tests/integration_v2/test_rpc_coverage.py`:
- Around line 86-93: The Extension instance is started before it's registered
for teardown, risking leaks if startup fails; move the registration so that
after creating ext (Extension(...)) you immediately append it to host.extensions
(reference: Extension constructor and host.extensions list) and only then call
ext.ensure_process_started(); this ensures the fixture will always track and
cleanup the extension even if ensure_process_started() raises or fails.

In `@tests/test_uv_sealed_integration.py`:
- Around line 99-116: The test currently verifies installation and version via
child_python and pip_show but doesn't prove the package came from
UV_EXTRA_INDEX_URL; update the test to assert provenance by invoking a verbose
install through the child venv (e.g., run [str(child_python), "-m", "uv", "pip",
"install", "pyisolate", "-v", "--dry-run"] with env including
"UV_EXTRA_INDEX_URL": wheel_index) and assert wheel_index appears in the
captured stdout/stderr, or alternatively open and inspect the produced uv.lock
(after the real install) to verify the source URL; keep the existing version
checks (expected_version) but add this extra provenance assertion referencing
child_python, wheel_index, and expected_version.

---

Duplicate comments:
In `@CLAUDE.md`:
- Around line 51-52: The table entry for `example/` is inaccurate—replace the
"dependency isolation (numpy 1.x vs 2.x)" description with one that reflects the
actual contents: mention the three torch isolation demos (`torch_share`,
`sealed_worker`, `bwrap_torch_share`) and describe them briefly (e.g., "Three
torch isolation demos demonstrating dependency isolation approaches:
torch_share, sealed_worker, and bwrap_torch_share"). Update the `example/` row
so the README/table text matches the actual example directory contents.

In `@example/bwrap_torch_share/host.py`:
- Around line 109-118: Add a real tensor round-trip after the ping to validate
share_torch actually passes tensors: create a small CPU torch tensor on the host
(torch.tensor([...]) ), call a proxy tensor-echo RPC (e.g., ext.get_proxy() ->
proxy.echo_tensor(tensor) or proxy.round_trip_tensor(tensor)), receive the
tensor back, compare with torch.equal or torch.allclose, and only log PASS and
return 0 if equality holds (otherwise log FAIL and return 1); keep the existing
ping check but extend the branch that handles "pong_torch_share" to perform this
tensor serialization/deserialization test so share_torch=True is actually
exercised.

In `@pyisolate/_internal/environment.py`:
- Around line 381-382: The code currently appends pyisolate=={pyisolate_version}
to safe_deps when config.get("execution_model") == "sealed_worker", but
_detect_pyisolate_version() can return "0.0.0" for source checkouts which will
make pip install fail; update the block in environment.py that handles
execution_model == "sealed_worker" to check if pyisolate_version == "0.0.0" and,
in that case, append the local source path (e.g., the package directory or an
editable/local install spec) to safe_deps instead of the versioned pip spec;
keep the existing behavior for real versions and ensure you reference the
pyisolate_version variable and the safe_deps list so sealed worker creation
works in development without attempting to pip install a non-existent version.

ℹ️ Review info
⚙️ Run configuration

Configuration used: Organization UI

Review profile: ASSERTIVE

Plan: Pro

Run ID: 97b1655c-a438-4c5f-9a0d-5425bf514035

📥 Commits

Reviewing files that changed from the base of the PR and between 83629b2 and b01ed96.

📒 Files selected for processing (9)
  • .github/workflows/pytorch.yml
  • .github/workflows/windows.yml
  • CLAUDE.md
  • example/bwrap_torch_share/host.py
  • example/sealed_worker/host.py
  • example/torch_share/host.py
  • pyisolate/_internal/environment.py
  • tests/integration_v2/test_rpc_coverage.py
  • tests/test_uv_sealed_integration.py

Comment on lines +41 to +46
def register_serializers(self, registry: SerializerRegistryProtocol) -> None:
    try:
        from pyisolate._internal.tensor_serializer import deserialize_tensor, serialize_tensor
        registry.register("torch.Tensor", serialize_tensor, deserialize_tensor)
    except Exception:
        pass

🧹 Nitpick | 🔵 Trivial

Silent exception swallowing may hide registration failures.

The bare except Exception: pass silently ignores all errors during tensor serializer registration. If torch is available but registration fails for a legitimate reason (e.g., API mismatch), this would be invisible.

Consider at minimum logging a debug message when registration fails—don't let errors vanish into thin air! ✨

🔊 Proposed fix to log registration failures
     def register_serializers(self, registry: SerializerRegistryProtocol) -> None:
         try:
             from pyisolate._internal.tensor_serializer import deserialize_tensor, serialize_tensor
             registry.register("torch.Tensor", serialize_tensor, deserialize_tensor)
-        except Exception:
-            pass
+        except Exception as exc:
+            logger.debug("Tensor serializer registration skipped: %s", exc)
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@example/torch_share/host.py` around lines 41 - 46, The current
register_serializers method swallows all exceptions, hiding failures; update it
so the except block logs the failure (including the exception message/trace)
instead of silently passing—e.g., catch broad Exception only to log via the
module/process logger or a provided logger with context like "tensor serializer
registration failed" and include the exception, while still allowing the program
to continue; ensure references to register_serializers,
SerializerRegistryProtocol, serialize_tensor, and deserialize_tensor remain
intact so the registration attempt is preserved.
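A runnable sketch of the logged variant, mirroring the proposed fix. `DummyRegistry` is illustrative; the real method receives a SerializerRegistryProtocol, and the import is expected to fail wherever pyisolate's internals are unavailable.

```python
import logging

logger = logging.getLogger("example.torch_share.host")

class DummyRegistry:
    """Illustrative registry standing in for SerializerRegistryProtocol."""
    def register(self, name, serializer, deserializer):
        print(f"registered serializer for {name}")

def register_serializers(registry) -> None:
    try:
        # Fails when torch / pyisolate internals are not importable.
        from pyisolate._internal.tensor_serializer import deserialize_tensor, serialize_tensor
        registry.register("torch.Tensor", serialize_tensor, deserialize_tensor)
    except Exception as exc:
        # Log instead of swallowing, so legitimate failures stay visible.
        logger.debug("tensor serializer registration skipped: %s", exc)

register_serializers(DummyRegistry())  # completes without raising either way
```

The program still degrades gracefully when torch is absent, but an API mismatch now leaves a trace in debug logs instead of vanishing.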

Comment on lines +86 to +93
ext = Extension(
    module_path=package_path,
    extension_type=ReferenceTestExtension,
    config=config,
    venv_root_path=str(host.venv_root),
)
ext.ensure_process_started()
host.extensions.append(ext)

⚠️ Potential issue | 🟠 Major

Register extension for teardown before starting the process.

On line 92, `ensure_process_started()` runs before the extension is added to `host.extensions`. If startup fails after a partial spawn, line 93 is never reached and fixture cleanup won't track this extension; flaky tests may follow (a little leak, a lot of bleak).

Suggested fix
         ext = Extension(
             module_path=package_path,
             extension_type=ReferenceTestExtension,
             config=config,
             venv_root_path=str(host.venv_root),
         )
-        ext.ensure_process_started()
-        host.extensions.append(ext)
+        host.extensions.append(ext)
+        ext.ensure_process_started()
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tests/integration_v2/test_rpc_coverage.py` around lines 86 - 93, The
Extension instance is started before it's registered for teardown, risking leaks
if startup fails; move the registration so that after creating ext
(Extension(...)) you immediately append it to host.extensions (reference:
Extension constructor and host.extensions list) and only then call
ext.ensure_process_started(); this ensures the fixture will always track and
cleanup the extension even if ensure_process_started() raises or fails.
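
The ordering argument is easy to demonstrate with stand-in classes. `FakeExtension` and `FakeHost` here are hypothetical substitutes for the real `Extension` and fixture host, used only to show why track-before-start guarantees cleanup:

```python
class FakeExtension:
    """Stand-in whose startup can be forced to fail."""

    def __init__(self, fail=False):
        self.fail = fail
        self.stopped = False

    def ensure_process_started(self):
        if self.fail:
            raise RuntimeError("spawn failed after partial setup")

    def stop(self):
        self.stopped = True


class FakeHost:
    def __init__(self):
        self.extensions = []

    def teardown(self):
        # Fixture-style cleanup: stop everything that was registered.
        for ext in self.extensions:
            ext.stop()


host = FakeHost()
ext = FakeExtension(fail=True)
host.extensions.append(ext)        # register first ...
try:
    ext.ensure_process_started()   # ... then start; failure leaves it tracked
except RuntimeError:
    pass
host.teardown()
assert ext.stopped  # cleanup still reached the extension
```

With the original order (start, then append), the `RuntimeError` would skip the append and `teardown()` would never see the half-spawned extension.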

Comment on lines +99 to +116
# Verify pyisolate was installed in the child venv from the published index
if os.name == "nt":
    child_python = Path(ext.venv_path) / "Scripts" / "python.exe"
else:
    child_python = Path(ext.venv_path) / "bin" / "python3"
pip_show = subprocess.run(
    [str(child_python), "-m", "pip", "show", "pyisolate"],
    capture_output=True,
    text=True,
    check=False,
)
print(f"child venv pyisolate install:\n{pip_show.stdout}")
assert "pyisolate" in pip_show.stdout, "pyisolate not installed in child venv"
from importlib.metadata import version as _meta_version
expected_version = _meta_version("pyisolate")
assert expected_version in pip_show.stdout, (
    f"pyisolate version mismatch: expected {expected_version} in child venv"
)

🧹 Nitpick | 🔵 Trivial

Verification checks version match but not index provenance.

The test confirms pyisolate is installed and the version matches, which is a reasonable smoke test. However, as noted in a past review, `pip show` doesn't prove the wheel came from `UV_EXTRA_INDEX_URL` rather than a PyPI fallback.

For a more robust provenance check, consider running `uv pip install -v` and grepping for the index URL in the verbose output, or inspecting `uv.lock` after installation. That said, for a regression test this level of verification may be sufficient; no need to go down the rabbit hole! 🐰

🔍 Optional: Add verbose install logging for provenance verification
# Alternative approach: Run install with verbose logging to verify index source
install_result = subprocess.run(
    [str(child_python), "-m", "uv", "pip", "install", "pyisolate", "-v", "--dry-run"],
    capture_output=True,
    text=True,
    check=False,
    env={**os.environ, "UV_EXTRA_INDEX_URL": wheel_index},
)
assert wheel_index in install_result.stdout or wheel_index in install_result.stderr
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tests/test_uv_sealed_integration.py` around lines 99 - 116, The test
currently verifies installation and version via child_python and pip_show but
doesn't prove the package came from UV_EXTRA_INDEX_URL; update the test to
assert provenance by invoking a verbose install through the child venv (e.g.,
run [str(child_python), "-m", "uv", "pip", "install", "pyisolate", "-v",
"--dry-run"] with env including "UV_EXTRA_INDEX_URL": wheel_index) and assert
wheel_index appears in the captured stdout/stderr, or alternatively open and
inspect the produced uv.lock (after the real install) to verify the source URL;
keep the existing version checks (expected_version) but add this extra
provenance assertion referencing child_python, wheel_index, and
expected_version.
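
Another provenance option, if grepping verbose output proves brittle: pip (since 22.2) can emit a JSON installation report via `--report`, which records the resolved download URL per package. A sketch, where `child_python` and `wheel_index` are assumed to come from the surrounding test:

```python
import json
import subprocess
import tempfile
from pathlib import Path


def urls_from_report(report_data):
    """Pull each package's resolved download URL out of a pip --report JSON document."""
    return [
        item.get("download_info", {}).get("url", "")
        for item in report_data.get("install", [])
    ]


def resolved_from_index(python_exe, package, index_url):
    """Dry-run resolve `package` in the given interpreter's environment and
    check whether the chosen wheel's URL comes from `index_url`.
    Requires pip >= 22.2 in that environment."""
    with tempfile.TemporaryDirectory() as tmp:
        report = Path(tmp) / "report.json"
        subprocess.run(
            [python_exe, "-m", "pip", "install", package,
             "--dry-run", "--report", str(report),
             "--extra-index-url", index_url],
            capture_output=True, text=True, check=False,
        )
        if not report.exists():
            return False
        urls = urls_from_report(json.loads(report.read_text()))
        return any(u.startswith(index_url) for u in urls)
```

In the test this would become something like `assert resolved_from_index(str(child_python), "pyisolate", wheel_index)`, asserting the wheel genuinely resolves from the published index rather than PyPI.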

@pollockjj pollockjj closed this Apr 5, 2026