Feature Request
Add support for passing model-specific request parameters through configuration.
Use Case
OpenAI GPT-5.1 models support additional request parameters that control reasoning behavior:
- `reasoningEffort`: `"low" | "medium" | "high"`
- `reasoningSummary`: `"auto" | "detailed"`
- `textVerbosity`: `"low" | "medium" | "high"`
- `include`: `["reasoning.encrypted_content"]`
- `store`: boolean
Currently, these cannot be configured per model mapping. The actual OpenAI models are just:

- `gpt-5.1`
- `gpt-5.1-codex`
- `gpt-5.1-codex-mini`

But with different parameters, they behave as distinct models (e.g., `gpt-5.1-low`, `gpt-5.1-medium`, `gpt-5.1-high`).
Proposed Solution
Extend model mapping configuration to support optional parameters:
```toml
[[models]]
name = "gpt-5.1-high"

[[models.mappings]]
actual_model = "gpt-5.1"
priority = 1
provider = "openai-codex"
options = { reasoningEffort = "high", reasoningSummary = "detailed", textVerbosity = "high", include = ["reasoning.encrypted_content"], store = false }

[[models]]
name = "gpt-5.1-low"

[[models.mappings]]
actual_model = "gpt-5.1"
priority = 1
provider = "openai-codex"
options = { reasoningEffort = "low", reasoningSummary = "auto", textVerbosity = "medium", include = ["reasoning.encrypted_content"], store = false }
```
Real-World Example
OpenCode uses this pattern:
```json
"gpt-5.1-codex-high": {
  "name": "GPT 5.1 Codex High (OAuth)",
  "limit": {
    "context": 272000,
    "output": 128000
  },
  "options": {
    "reasoningEffort": "high",
    "reasoningSummary": "detailed",
    "textVerbosity": "medium",
    "include": ["reasoning.encrypted_content"],
    "store": false
  }
}
```
Implementation Notes
Would require:
- Add `options: Option<HashMap<String, serde_json::Value>>` to the `ModelMapping` struct
- Extend provider request structs (e.g., `OpenAIRequest`) with optional parameter fields
- Merge model-specific options into the request body when constructing provider requests
- Update config parsing to handle the nested `options` table in TOML