Scope check
Due diligence
What problem does this solve?
The OpenAI model `gpt-5.2-codex` is listed in the model registry. These codex models are tuned for programming tasks and can be useful to chat with for programming-related use cases.
However, when I try to use it with RubyLLM, I get an error:
```ruby
chat = RubyLLM.chat(model: "gpt-5.2-codex")
# => #<RubyLLM::Chat:0x00007b6467b7c4f0
#    ...

chat.ask("Write a Python function that adds two numbers.")
# in '<main>': This is not a chat model and thus not supported in the
# v1/chat/completions endpoint. Did you mean to use v1/completions? (RubyLLM::Error)
```
(The above works fine with `model: "gpt-5.2"`. The problem is specific to `model: "gpt-5.2-codex"`.)
Proposed solution
I'm not sure what the easiest implementation in RubyLLM would be.
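As a rough illustration of one possible shape (not RubyLLM's actual internals): route codex-family models to the Responses endpoint while other models keep using chat completions. The method name `completion_endpoint` and the substring heuristic below are purely hypothetical.

```ruby
# Hypothetical sketch only: pick an OpenAI endpoint per model id.
# Neither this method nor the substring check exists in RubyLLM today.
def completion_endpoint(model_id)
  if model_id.include?("codex")
    "v1/responses"          # codex models are not served by chat/completions
  else
    "v1/chat/completions"   # default path for ordinary chat models
  end
end
```

A real implementation would presumably key off registry capability metadata rather than a name check, but the routing decision is the core of it.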
However, I do know that it is possible to have a multiturn chat experience with `gpt-5.2-codex` when using the official `openai` gem. But doing so gives up all the convenient developer affordances that RubyLLM provides, so I'd rather be able to use this model within RubyLLM! 😄
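For reference, a minimal sketch of what that multiturn flow looks like outside RubyLLM: the conversation is just an array of role/content items that grows each turn. The actual network call is commented out because it needs an API key, and the exact `responses.create` call shape is my assumption about the official gem.

```ruby
# Minimal multiturn sketch against the Responses API (call shape assumed).
# Each turn is an item appended to the `input` array.
def add_turn(history, role, text)
  history + [{ role: role, content: text }]
end

history = add_turn([], "user", "Write a Python function that adds two numbers.")

# With the official openai gem (assumed; needs OPENAI_API_KEY):
# client  = OpenAI::Client.new
# reply   = client.responses.create(model: "gpt-5.2-codex", input: history)
# history = add_turn(history, "assistant", reply.output_text)
```

This hand-rolled bookkeeping is exactly the kind of thing `RubyLLM.chat` already does for chat-completions models.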
Why this belongs in RubyLLM
These are still multiturn text conversations, which fit the `RubyLLM.chat` concept, and I suspect codex-style use cases are wanted by other RubyLLM users too.