From 592476e86baa07579166c2ae325546dcd8c87460 Mon Sep 17 00:00:00 2001
From: Alezander9
Date: Fri, 15 May 2026 12:10:39 -0700
Subject: [PATCH] fix(provider): prefer per-model temperature over agent
 override

`ProviderTransform.temperature(model)` encodes per-model-family temperature
values that the model's provider requires (e.g. Moonshot's kimi-k2.* returns
HTTP 400 "invalid temperature: only 1 is allowed for this model" if any other
value is sent).

The session/llm.ts call site consulted it in the wrong order:

    input.agent.temperature ?? ProviderTransform.temperature(input.model)

so the built-in title agent's hard-coded `temperature: 0.5` (agent.ts:253)
won over the kimi-required 1.0, and Moonshot rejected the request.

Swap the operands. The transform value, when present, is the model-aware
answer and should beat a generic agent default. For models that aren't in the
transform map, the function returns `undefined` and the agent's preference
still applies through `??` fallthrough.

Affects: kimi-k2.* (incl. k2.6, thinking, k2-5), gemini, glm-4.6/4.7,
minimax-m2, qwen - anywhere the transform has an entry.
---
 packages/opencode/src/session/llm.ts | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/packages/opencode/src/session/llm.ts b/packages/opencode/src/session/llm.ts
index 0cf3a2398f9b..279877948812 100644
--- a/packages/opencode/src/session/llm.ts
+++ b/packages/opencode/src/session/llm.ts
@@ -169,7 +169,7 @@ const live: Layer.Layer<
       },
       {
         temperature: input.model.capabilities.temperature
-          ? (input.agent.temperature ?? ProviderTransform.temperature(input.model))
+          ? (ProviderTransform.temperature(input.model) ?? input.agent.temperature)
           : undefined,
         topP: input.agent.topP ?? ProviderTransform.topP(input.model),
         topK: ProviderTransform.topK(input.model),