
fix(opencode): prefer per-model temperature over agent override#27797

Open
Alezander9 wants to merge 2 commits into anomalyco:dev from Alezander9:fix/strict-temperature-required-models

Conversation

@Alezander9

Issue for this PR

Closes #27796

Type of change

  • Bug fix
  • New feature
  • Refactor / code improvement
  • Documentation

What does this PR do?

packages/opencode/src/session/llm.ts:172 resolves temperature as:

input.agent.temperature ?? ProviderTransform.temperature(input.model)

The built-in title agent hard-codes temperature: 0.5 (packages/opencode/src/agent/agent.ts:253), so ?? short-circuits and the agent default always wins. For model families where the provider strictly requires a specific temperature (kimi-k2.* in particular), ProviderTransform.temperature encodes the required value, but it never gets used. As a result, Moonshot returns HTTP 400 with "invalid temperature: only 1 is allowed for this model".

The fix swaps the operands so the model-aware transform wins, with the agent preference as fallback:

ProviderTransform.temperature(input.model) ?? input.agent.temperature

For models without a transform entry (claude, openai, etc.) the function returns undefined and the agent preference still applies via ??. The only behavior change is on the families already flagged in ProviderTransform as needing specific values.
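The fallback behavior can be sketched in isolation; modelTemperature below is a hypothetical stand-in for ProviderTransform.temperature, not the real opencode API:

```typescript
// Hypothetical per-model-family map: returns the provider-required
// temperature, or undefined when the model has no strict requirement.
function modelTemperature(modelID: string): number | undefined {
  if (modelID.startsWith("kimi-k2")) return 1 // Moonshot rejects anything else
  return undefined
}

// Built-in title agent's hard-coded default (agent.ts:253).
const agentTemperature: number | undefined = 0.5

// Old order: the agent default is non-nullish, so it always wins and
// kimi-k2.* requests are sent with 0.5, which Moonshot 400s.
const oldResolved = agentTemperature ?? modelTemperature("kimi-k2.6") // → 0.5

// New order: the model-required value wins when present.
const newResolved = modelTemperature("kimi-k2.6") ?? agentTemperature // → 1

// Models without a transform entry return undefined, so the agent
// preference still applies through ?? fallthrough.
const claudeResolved = modelTemperature("claude-sonnet") ?? agentTemperature // → 0.5

console.log(oldResolved, newResolved, claudeResolved)
```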

This also aligns temperature with the adjacent topK (no agent override) and is closer in spirit to topP (still agent ?? transform, but topP has no known strict-required cases today).

How did you verify your code works?

  • bun typecheck clean.
  • Existing test/provider/transform.test.ts suite passes locally.
  • Reproduced the original 400 against Moonshot kimi-k2.6 by starting a session that triggers the title agent; with the patch the request is sent with the kimi-required temperature: 1 and the call succeeds.

A reviewer can reproduce by configuring Moonshot kimi-k2.6 and starting any new session; without the patch the title-generation call 400s, with the patch it does not.

Screenshots / recordings

N/A - no UI change.

Checklist

  • I have tested my changes locally
  • I have not included unrelated changes in this PR

Alezander9 and others added 2 commits May 15, 2026 12:10
`ProviderTransform.temperature(model)` encodes per-model-family temperature
values that the model's provider requires (e.g. Moonshot's kimi-k2.* returns
HTTP 400 "invalid temperature: only 1 is allowed for this model" if any
other value is sent). The session/llm.ts call site consulted it in the wrong
order:

  input.agent.temperature ?? ProviderTransform.temperature(input.model)

so the built-in title agent's hard-coded `temperature: 0.5` (agent.ts:253)
won over the kimi-required 1.0, and Moonshot rejected the request.

Swap the operands. The transform value, when present, is the model-aware
answer and should beat a generic agent default. For models that aren't in
the transform map, the function returns `undefined` and the agent's
preference still applies through `??` fallthrough.

Affects: kimi-k2.* (incl. k2.6, thinking, k2-5), gemini, glm-4.6/4.7,
minimax-m2, qwen - anywhere the transform has an entry.


Development

Successfully merging this pull request may close these issues.

Title-agent temperature 0.5 overrides model-required temperature, causing 400s on Moonshot kimi-k2.*
