[bug] Fix handling of temperature argument #1353

@CalebCourier

Description

Describe the bug
Defaulting the temperature param to 0 yields the error below when calling a gpt-5 model:

litellm.UnsupportedParamsError: gpt-5 models (including gpt-5-codex) don't support temperature=0. Only temperature=1 is supported. To drop unsupported params set litellm.drop_params = True

To Reproduce
Use any gpt-5 model like below:

from guardrails import Guard

guard = Guard()

res = guard(
    model="gpt-5-nano",
    messages=[{
        "role": "user",
        "content": "Hello, world!"
    }]
)

print(res)

Expected behavior
Users should be able to use gpt-5-* models without explicitly setting temperature to 1.
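A minimal sketch of how the defaulting could be made model-aware. The helper name `default_llm_kwargs` and the prefix check are assumptions for illustration, not Guardrails' actual code:

```python
def default_llm_kwargs(model: str, **kwargs) -> dict:
    """Build kwargs for the LLM call, only defaulting temperature
    when the model actually supports it (hypothetical helper)."""
    # Per the litellm error, gpt-5 models (including gpt-5-codex)
    # only support temperature=1, so skip the temperature=0 default
    # for that family and let the provider apply its own default.
    if "temperature" not in kwargs and not model.startswith("gpt-5"):
        kwargs["temperature"] = 0
    kwargs["model"] = model
    return kwargs

# gpt-5 family: no temperature injected
print(default_llm_kwargs("gpt-5-nano"))
# other models keep the existing temperature=0 default
print(default_llm_kwargs("gpt-4o"))
```

Alternatively, as the error message itself notes, setting `litellm.drop_params = True` makes litellm drop unsupported params client-side.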

Library version:
0.6.7

Labels

bug (Something isn't working)
