fix(provider): prefer per-model temperature over agent override #64
Open

Alezander9 wants to merge 1 commit into

Conversation
`ProviderTransform.temperature(model)` encodes per-model-family temperature values that the model's provider requires (e.g. Moonshot's kimi-k2.* returns HTTP 400 "invalid temperature: only 1 is allowed for this model" if any other value is sent). The session/llm.ts call site consulted it in the wrong order:

`input.agent.temperature ?? ProviderTransform.temperature(input.model)`

so the built-in title agent's hard-coded `temperature: 0.5` (agent.ts:271) won over the kimi-required 1.0, and Moonshot rejected the request.

Swap the operands. The transform value, when present, is the model-aware answer and should beat a generic agent default. For models that aren't in the transform map, the function returns `undefined` and the agent's preference still applies through `??` fallthrough.

Affects: kimi-k2.* (incl. k2.6, thinking, k2-5), gemini, glm-4.6/4.7, minimax-m2, qwen (anywhere the transform has an entry).
Problem
`ProviderTransform.temperature(model)` in src/provider/transform.ts returns the temperature value a given model family requires (kimi-k2.* -> 1.0 or 0.6; gemini / glm-4.6 / glm-4.7 / minimax-m2 -> 1.0; qwen -> 0.55). Several of these are hard requirements: Moonshot rejects any other value for the kimi-k2 family with HTTP 400 "invalid temperature: only 1 is allowed for this model".

The call site in src/session/llm.ts consulted that function in the wrong order:

`input.agent.temperature ?? ProviderTransform.temperature(input.model)`

The built-in title agent (src/agent/agent.ts:271) hard-codes `temperature: 0.5`. Because `??` short-circuits on the first non-nullish operand, the agent's 0.5 always won, so a temperature of 0.5 was sent to kimi-k2.6, which then returned the 400. I observed this in production telemetry across multiple users on kimi-k2.6 (the most common case), and the same code path will trip whenever an agent sets a temperature on any of the strict-requirement model families.
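A minimal TypeScript sketch of the pre-fix behavior. The map and function names here are hypothetical stand-ins; the real values live in `src/provider/transform.ts`:

```typescript
// Hypothetical stand-in for the ProviderTransform temperature map;
// the real entries live in src/provider/transform.ts.
const TRANSFORM_TEMPERATURE: Record<string, number> = {
  "kimi-k2.6": 1.0, // Moonshot 400s on any other value
  "qwen": 0.55,
}

function transformTemperature(model: string): number | undefined {
  return TRANSFORM_TEMPERATURE[model]
}

// The built-in title agent hard-codes 0.5.
const agentTemperature: number | undefined = 0.5

// Buggy order: `??` short-circuits on the first non-nullish operand,
// so the agent's 0.5 always wins over the kimi-required 1.0.
const sent = agentTemperature ?? transformTemperature("kimi-k2.6")
// sent === 0.5, which Moonshot rejects with HTTP 400
```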
Fix
Swap the operands at the single call site:

`ProviderTransform.temperature(input.model) ?? input.agent.temperature`

The transform function exists precisely because the maintainer encoded model-aware knowledge of what each family expects; that knowledge should beat a generic agent default. For models not covered by the transform map (claude, openai, etc.), the function returns `undefined` and the agent's preference still applies through normal `??` fallthrough, so the only behavior change is on the families the maintainer has already flagged as needing specific values.

This also aligns temperature with the adjacent `topK` (no agent override) and brings it closer to `topP` (still `agent ?? transform`, but topP has no known strict-requirement cases today).
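A sketch of the post-fix precedence, again with hypothetical names, showing the `undefined` fallthrough for unmapped families:

```typescript
// Hypothetical stand-in for the transform map and the swapped call site.
const TRANSFORM_TEMPERATURE: Record<string, number> = { "kimi-k2.6": 1.0 }

function pickTemperature(model: string, agentTemp?: number): number | undefined {
  // Fixed order: the model-aware value wins when present;
  // otherwise the agent's preference applies through `??`.
  return TRANSFORM_TEMPERATURE[model] ?? agentTemp
}

pickTemperature("kimi-k2.6", 0.5) // 1.0: strict model requirement wins
pickTemperature("claude-3", 0.5)  // 0.5: no transform entry, agent wins
pickTemperature("claude-3")       // undefined: provider default applies
```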
Test plan
- `bun typecheck` clean.
- `test/provider/transform.test.ts` suite: 220 pass / 0 fail.