The orchestrator dispatches sub-agent calls to one of three CLIs based on the `SUBAGENT_CLI` environment variable. When unset, it defaults to `claude` (byte-identical backwards compatibility with v0.9.2.0).
| Goal | `SUBAGENT_CLI` | Why |
|---|---|---|
| Default — Anthropic / LiteLLM gateway | `claude` (or unset) | Native Claude Code; cost reporting; max-turns enforced |
| OpenAI Codex (gpt-5-codex etc.) | `codex` | First-party OpenAI tooling; `--ephemeral` runs; no built-in cost reporting |
| OpenRouter / qwen / DeepSeek / OSS / Bedrock-via-LiteLLM | `opencode` | 75+ provider router; per-step cost when the provider reports it |
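The boot-time resolution (unset → `claude`, unknown value → refuse to start) can be sketched in Python; the function and constant names here are hypothetical, not the orchestrator's actual code:

```python
# Illustrative sketch of SUBAGENT_CLI resolution at orchestrator boot.
# Names are assumptions; only the three accepted values come from the doc.
import os

ACCEPTED = ("claude", "codex", "opencode")

def resolve_subagent_cli(env=None):
    """Unset resolves to 'claude'; any unknown value is a fatal boot error."""
    env = os.environ if env is None else env
    value = env.get("SUBAGENT_CLI", "claude")
    if value not in ACCEPTED:
        raise SystemExit(f"[FATAL] SUBAGENT_CLI={value!r} is not one of {ACCEPTED}")
    return value

print(resolve_subagent_cli({}))                            # claude
print(resolve_subagent_cli({"SUBAGENT_CLI": "opencode"}))  # opencode
```

Note that a typo such as `SUBAGENT_CLI=cline` aborts boot rather than silently falling back.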
- Pull the latest `open-computer-use` image (codex + opencode are pre-installed alongside claude).
- Set `SUBAGENT_CLI=<value>` in your `.env`.
- Set the per-CLI auth env vars (see the per-CLI sections below).
- Restart the orchestrator: `docker compose up -d --force-recreate computer-use-server`.
- Verify the runtime took effect:

  ```
  docker compose logs computer-use-server | grep "Sub-agent runtime"
  # Expected: [MCP] Sub-agent runtime: <value>
  ```

- Spawn a sandbox (any chat or `/health` poke) and verify the CLI is on PATH:

  ```
  docker exec <sandbox-container> <cli> --version
  # Expected: a non-empty version string
  ```
This is the default path. If you previously set `SUBAGENT_CLI=codex` or `=opencode` and want to revert, either delete the line from `.env` or set `SUBAGENT_CLI=claude` explicitly — both resolve identically.

For Anthropic / LiteLLM gateway configuration, see `docs/claude-code-gateway.md`.
Add to `.env`:

```
SUBAGENT_CLI=codex
OPENAI_API_KEY=sk-...
# Optional gateway (Azure OpenAI, LiteLLM proxy, etc.):
OPENAI_BASE_URL=https://your-litellm-proxy/v1
# Optional per-CLI default model:
CODEX_SUB_AGENT_DEFAULT_MODEL=gpt-5-codex
```

Restart per the common steps. The container's `~/.codex/config.toml` is rendered conditionally:

- with `OPENAI_BASE_URL` set → contains a `[model_providers.custom]` block pointing at your gateway;
- without it → empty file (Codex uses defaults).
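The conditional rendering can be sketched as follows; this is an illustrative Python stand-in for the entrypoint (which is a shell script), and any TOML key beyond `[model_providers.custom]` is an assumption:

```python
# Sketch of the conditional ~/.codex/config.toml rendering described above.
# The "base_url" key inside the block is an assumption for illustration.
def render_codex_config(env):
    base_url = env.get("OPENAI_BASE_URL")
    if not base_url:
        return ""  # empty file -> Codex falls back to its defaults
    return (
        "[model_providers.custom]\n"
        f'base_url = "{base_url}"\n'
    )

print(render_codex_config({"OPENAI_BASE_URL": "https://your-litellm-proxy/v1"}))
```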
Verify:

```
docker exec <sandbox> codex --version
# Expect: codex-cli 0.125.0
docker exec <sandbox> cat ~/.codex/config.toml
```

This is the headline recipe — it runs sub-agents against a frontier OSS coding model with no Anthropic dependency.
Add to `.env`:

```
SUBAGENT_CLI=opencode
OPENROUTER_API_KEY=sk-or-v1-...
OPENCODE_SUB_AGENT_DEFAULT_MODEL=openrouter/qwen/qwen-3-coder
```

Restart:

```
docker compose up -d --force-recreate computer-use-server
```

Verify the orchestrator picked up the runtime:

```
docker compose logs computer-use-server | grep "Sub-agent runtime"
# Expected: [MCP] Sub-agent runtime: opencode
```

Spawn any sandbox and verify the OpenCode config is rendered without leaking the key:

```
docker exec <sandbox> cat /tmp/opencode.json
# Expected: provider.openrouter.options.apiKey is "{env:OPENROUTER_API_KEY}" — NOT a literal sk-or-v1-... value
# OpenCode 1.14.x schema: top-level key is "provider" (singular), apiKey nested under "options".
```

The `{env:VAR}` syntax means OpenCode resolves the key at runtime from the container env. The file on disk contains zero plaintext secrets — the sandbox volume can be mounted, copied, or shared without leaking your OpenRouter key.
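A minimal sketch of that secret-free rendering, assuming the OpenCode 1.14.x-style schema (only `provider`, `options`, and `apiKey` come from this doc; everything else is illustrative):

```python
# Sketch of rendering /tmp/opencode.json without plaintext secrets.
import json

def render_opencode_config():
    config = {
        "provider": {
            "openrouter": {
                "options": {
                    # Literal placeholder: OpenCode substitutes the env var at
                    # runtime, so no plaintext key ever touches disk.
                    "apiKey": "{env:OPENROUTER_API_KEY}",
                }
            }
        }
    }
    return json.dumps(config, indent=2)

rendered = render_opencode_config()
assert "sk-or-v1" not in rendered               # no literal secret on disk
assert "{env:OPENROUTER_API_KEY}" in rendered   # placeholder survives as-is
print(rendered)
```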
Trigger a sub-agent call from the chat (or via the MCP `sub_agent` tool). Expected response shape:

```
**Sub-Agent Completed** (success)

<the qwen3-coder reply>

**Cost:** unavailable | **Duration:** 12.3s | **Turns:** unavailable
```

`Cost: unavailable` is expected for opencode runs — see the next section.
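The footer line could be produced by a formatter along these lines (a hypothetical helper, shown only to make the `unavailable` fallback concrete):

```python
# Hypothetical footer formatter: claude supplies USD cost and turn counts,
# codex/opencode leave them as None and fall back to "unavailable".
def format_footer(cost_usd=None, duration_s=0.0, turns=None):
    cost = f"${cost_usd:.4f}" if cost_usd is not None else "unavailable"
    t = str(turns) if turns is not None else "unavailable"
    return f"**Cost:** {cost} | **Duration:** {duration_s:.1f}s | **Turns:** {t}"

print(format_footer(duration_s=12.3))
# **Cost:** unavailable | **Duration:** 12.3s | **Turns:** unavailable
```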
| Aspect | claude | codex | opencode |
|---|---|---|---|
| Cost reporting | reported as USD | unavailable | depends on provider (some report per-step cost) |
| `max_turns` enforcement | enforced (CLI flag) | not enforced — `SUB_AGENT_TIMEOUT` is the backstop | not enforced — `SUB_AGENT_TIMEOUT` is the backstop |
| `resume_session_id` | supported | ignored with stderr warning (`--ephemeral` is stateless) | ignored with stderr warning |
| Model alias `sonnet` / `opus` / `haiku` | resolves to Claude IDs | hard-fail with actionable error message | resolves to `anthropic/claude-X-X` provider/model |
| Direct provider/model strings (e.g. `openrouter/qwen/qwen-3-coder`) | pass-through | pass-through | pass-through |
| `~/.claude/projects/*.jsonl` live log streaming | yes | no | no |
| Image install | always (pre-installed) | always (pre-installed) | always (pre-installed) |
If you set a Claude alias (`sonnet`/`opus`/`haiku`) while `SUBAGENT_CLI=codex`, the orchestrator hard-fails with a clear error rather than silently 400-ing against OpenAI:

```
Model alias 'sonnet' is Claude-only; SUBAGENT_CLI=codex requires a GPT model id
(e.g. 'gpt-5-codex') or set CODEX_SUB_AGENT_DEFAULT_MODEL.
```
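A sketch of the per-runtime alias handling from the comparison table; the opencode mapping and all function names are illustrative assumptions, and the error text mirrors the message above:

```python
# Illustrative per-runtime model alias handling (not the orchestrator's code).
ALIASES = ("sonnet", "opus", "haiku")

def resolve_model(model, runtime):
    if model not in ALIASES:
        return model  # direct provider/model strings pass through on every CLI
    if runtime == "codex":
        raise ValueError(
            f"Model alias '{model}' is Claude-only; SUBAGENT_CLI=codex requires "
            "a GPT model id (e.g. 'gpt-5-codex') or set CODEX_SUB_AGENT_DEFAULT_MODEL."
        )
    if runtime == "opencode":
        return f"anthropic/claude-{model}"  # illustrative mapping only
    return model  # claude resolves aliases natively

print(resolve_model("openrouter/qwen/qwen-3-coder", "codex"))  # pass-through
```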
`SUBAGENT_CLI` also makes the in-browser ttyd terminal auto-launch the chosen CLI. To get a plain bash prompt instead:

```
# Per-session (in a new terminal tab):
NO_AUTOSTART=1 bash

# OR persistently for this container (next ttyd session):
touch /tmp/.no_autostart
```

The hint also appears in the entrypoint banner when you start the container.
The orchestrator injects only the active CLI's auth env vars into the sandbox container. Concretely:

- `SUBAGENT_CLI=claude` → only `ANTHROPIC_AUTH_TOKEN` and `ANTHROPIC_BASE_URL` reach the sandbox; `OPENAI_API_KEY` and `OPENROUTER_API_KEY` are stripped even if set on the host.
- `SUBAGENT_CLI=codex` → only `OPENAI_*` and `AZURE_OPENAI_*` keys reach the sandbox; `ANTHROPIC_AUTH_TOKEN` and `OPENROUTER_API_KEY` are stripped.
- `SUBAGENT_CLI=opencode` → only `OPENROUTER_API_KEY`, `OPENAI_API_KEY`, and `ANTHROPIC_API_KEY` reach the sandbox; `ANTHROPIC_AUTH_TOKEN` (the legacy Claude key) is stripped.

This prevents an operator's leftover `OPENAI_API_KEY` (from a previous Codex experiment) from silently routing OpenCode traffic through OpenAI when they meant OpenRouter.
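The allow-list behavior can be sketched as follows (the per-runtime lists come from the bullets above; the function itself is a hypothetical stand-in):

```python
# Illustrative sketch of per-runtime env injection into the sandbox.
from fnmatch import fnmatch

ALLOWED = {
    "claude":   ["ANTHROPIC_AUTH_TOKEN", "ANTHROPIC_BASE_URL"],
    "codex":    ["OPENAI_*", "AZURE_OPENAI_*"],
    "opencode": ["OPENROUTER_API_KEY", "OPENAI_API_KEY", "ANTHROPIC_API_KEY"],
}

def sandbox_env(runtime, host_env):
    """Keep only the active CLI's keys; everything else is stripped."""
    patterns = ALLOWED[runtime]
    return {k: v for k, v in host_env.items()
            if any(fnmatch(k, p) for p in patterns)}

host = {"OPENAI_API_KEY": "sk-...", "OPENROUTER_API_KEY": "sk-or-v1-...",
        "ANTHROPIC_AUTH_TOKEN": "tok"}
print(sorted(sandbox_env("claude", host)))  # ['ANTHROPIC_AUTH_TOKEN']
```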
The image entrypoint renders minimal viable configs only — what works for the common case. For Azure routing, approval modes, MCP federation, custom OpenAI-compatible gateways behind nginx, opencode personas, and operator-supplied overrides via the `OPENCODE_CONFIG_EXTRA` / `CODEX_CONFIG_EXTRA` env hooks, see `docs/cli-config-templates.md`.
- **Banner shows the wrong CLI** — `SUBAGENT_CLI` is read once at orchestrator boot, not per-request. Restart: `docker compose restart computer-use-server`.
- **`SUBAGENT_CLI=cline` (typo) → orchestrator refuses to start** — this is intentional. Fix the typo; check `docker compose logs computer-use-server` for the FATAL line listing the three accepted values.
- **OpenCode falls back to a default provider** — verify `/tmp/opencode.json` exists in the sandbox; if not, the container needs `--force-recreate` to re-render it via the entrypoint heredoc.
- **Cost reads `$0.0000` for codex/opencode** — this is a bug; the expected display is `cost: unavailable`. File an issue with the result blob attached.
- **Sub-agent for codex/opencode runs forever** — `max_turns` is Claude-only; the backstop for the other two is `SUB_AGENT_TIMEOUT` (default 3600s). Lower it in `.env` if you need a tighter cap: `SUB_AGENT_TIMEOUT=1800`.
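The `SUB_AGENT_TIMEOUT` backstop amounts to a hard wall-clock cap on the CLI subprocess. A minimal sketch, with a `sleep` standing in for the real CLI invocation:

```python
# Illustrative SUB_AGENT_TIMEOUT backstop for codex/opencode runs, where
# max_turns is not enforced. The real orchestrator invokes the chosen CLI;
# here a sleep stands in for it.
import subprocess

def run_with_backstop(cmd, env=None):
    timeout = int((env or {}).get("SUB_AGENT_TIMEOUT", "3600"))
    try:
        return subprocess.run(cmd, timeout=timeout, capture_output=True)
    except subprocess.TimeoutExpired:
        return None  # run exceeded the backstop and was killed

# A long-running "sub-agent" against a 1s cap is killed:
result = run_with_backstop(["sleep", "10"], {"SUB_AGENT_TIMEOUT": "1"})
print(result)  # None
```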
- OpenAI Codex CLI documentation — `codex exec` flags, JSONL event schema
- sst/opencode documentation — `opencode run`, `{env:VAR}` config substitution, providers list
- OpenRouter qwen3-coder model page
- Issue #40 / PR #41 — the community discussion that informed Phase 3 (Claude Code gateway compatibility), the foundation this milestone builds on.