
"The requested model 'gpt-5.3-codex' does not exist." #539

Description

What version of Code is running?

code 0.6.59

Which model were you using?

gpt-5.3-codex

What platform is your computer?

Linux 6.12.67 aarch64 unknown

What terminal emulator and version are you using (if applicable)?

Ghostty

What steps can reproduce the bug?

I re-authenticated both codex and coder via ChatGPT, but I get the following error when using gpt-5.3-codex:

ERROR: [transport] failed to start stream: unexpected status 400 Bad Request: {
  "error": {
    "message": "The requested model 'gpt-5.3-codex' does not exist.",
    "type": "invalid_request_error",
    "param": "model",
    "code": "model_not_found"
  }
}

To troubleshoot, I tried setting CODEX_HOME to ~/.codex instead of ~/.code, but the same error occurred.

To reproduce:

which coder
coder --version
CODEX_HOME=~/.codex coder login status
CODEX_HOME=~/.codex coder exec --skip-git-repo-check -c model="gpt-5.2-codex" "ping"
CODEX_HOME=~/.codex codex exec --skip-git-repo-check -c model="gpt-5.3-codex" "ping"
CODEX_HOME=~/.codex coder exec --skip-git-repo-check -c model="gpt-5.3-codex" "ping"

Output:

/run/current-system/sw/bin/coder
code 0.6.59
Logged in using ChatGPT

The next outputs show that gpt-5.3-codex works with codex, but not with code:

[2026-02-06T22:25:44] OpenAI Codex v0.0.0 (research preview)
[2026-02-06T22:25:44] binary: /nix/store/n7c7f1dh4483ax8pwd3y3c4b8kd0ws0v-coder-0.6.59/bin/coder
--------
workdir: /home/user
model: gpt-5.2-codex
provider: openai
approval: never
sandbox: workspace-write [workdir, /tmp, $TMPDIR] (network access enabled)
reasoning effort: medium
reasoning summaries: auto
--------
[2026-02-06T22:25:44] User instructions:
ping

[2026-02-06T22:25:45] thinking

**Noting need to respond to ping**
[2026-02-06T22:25:45] codex

pong
[2026-02-06T22:25:45] tokens used: 6,487
OpenAI Codex v0.98.0 (research preview)
--------
workdir: /home/user
model: gpt-5.3-codex
provider: openai
approval: never
sandbox: workspace-write [workdir, /tmp, $TMPDIR] (network access enabled)
reasoning effort: medium
reasoning summaries: auto
session id: 019c3510-acc2-7842-abae-ee7603a6f80f
--------
user
ping
mcp startup: no servers

thinking
**Responding with pong**
codex
pong
tokens used
90
pong
[2026-02-06T22:27:16] OpenAI Codex v0.0.0 (research preview)
[2026-02-06T22:27:16] binary: /nix/store/n7c7f1dh4483ax8pwd3y3c4b8kd0ws0v-coder-0.6.59/bin/coder
--------
workdir: /home/user
model: gpt-5.3-codex
provider: openai
approval: never
sandbox: workspace-write [workdir, /tmp, $TMPDIR] (network access enabled)
reasoning effort: medium
reasoning summaries: auto
--------
[2026-02-06T22:27:16] User instructions:
ping
[2026-02-06T22:27:17] ERROR: [transport] failed to start stream: unexpected status 400 Bad Request: {
  "error": {
    "message": "The requested model 'gpt-5.3-codex' does not exist.",
    "type": "invalid_request_error",
    "param": "model",
    "code": "model_not_found"
  }
}

What is the expected behavior?

  • "pong" response

What do you see instead?

The requested model 'gpt-5.3-codex' does not exist.

Additional information

  • I'm currently using a ChatGPT business account.
  • I don't have an API key, and API fallback is OFF in the code settings.
  • GPT's hypothesis: code is still routing through the OpenAI API provider even with ChatGPT login, and gpt-5.3-codex is not exposed on the API yet.
  • The environment has no OPENAI_API_KEY set, no OPENAI_BASE_URL, and no configured model-provider overrides (rough checks sketched below).
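A rough sketch of how I checked the last two points. The config.toml locations under CODEX_HOME and the model_provider/base_url key names are my assumptions about where an override would live, and the curl check would need an API key (which I don't have) to confirm whether gpt-5.3-codex is exposed on the plain API:

# Look for environment-level provider overrides
env | grep -E '^OPENAI_(API_KEY|BASE_URL)=' || echo "no OpenAI env overrides"

# Look for provider overrides in either config location (paths and key names assumed)
grep -nE 'model_provider|base_url' ~/.code/config.toml ~/.codex/config.toml 2>/dev/null \
  || echo "no provider overrides in config"

# With an API key, this would show whether the model is listed on the API at all
curl -s https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY" | grep -i 'gpt-5.3-codex'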

Thanks for your help and the awesome work! Let me know if there are any other steps I can take to help debug.
