Potential Breaking Change? #4588

@bchau-calliope

Description

Bug Description

I have not encountered this formatting error before, and I could not find anything related in the repo. Can someone investigate?

```
File "/app/.venv/lib/python3.11/site-packages/livekit/agents/inference/llm.py", line 341, in _run
    raise APIStatusError(
livekit.agents._exceptions.APIStatusError: provider: openai model: gpt-4.1, message: POST "https://api.openai.com/v1/chat/completions": 400 Bad Request {
    "message": "Missing required parameter: 'response_format.json_schema'.",
    "type": "invalid_request_error",
    "param": "response_format.json_schema",
    "code": "missing_required_parameter"
}.
```
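For context on what the 400 is complaining about: OpenAI's Chat Completions API requires that whenever `response_format.type` is `"json_schema"`, a sibling `response_format.json_schema` object be present. A minimal sketch of the valid vs. invalid payload shapes (the schema name and fields here are hypothetical, and `has_required_schema` is just an illustrative local check, not livekit or OpenAI code):

```python
# The response_format parameter that seems to be sent to OpenAI.
# Sending only {"type": "json_schema"} without the sibling "json_schema"
# object is what produces the 400 "missing_required_parameter" above.
invalid = {"type": "json_schema"}

# A complete structured-output request includes the schema itself.
valid = {
    "type": "json_schema",
    "json_schema": {
        "name": "reply",  # hypothetical schema name
        "schema": {
            "type": "object",
            "properties": {"text": {"type": "string"}},
            "required": ["text"],
        },
    },
}

def has_required_schema(response_format: dict) -> bool:
    """Mirror the server-side check that rejects the request."""
    if response_format.get("type") == "json_schema":
        return "json_schema" in response_format
    return True

print(has_required_schema(invalid))  # False
print(has_required_schema(valid))    # True
```

If the wrapped LLM is emitting `{"type": "json_schema"}` without the schema body, that would explain why a previously working STT -> LLM -> TTS pipeline suddenly fails with this 400.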

Expected Behavior

STT -> LLM -> TTS should work as expected.

Reproduction Steps

I think using OpenAI in the LLM node should trigger this.

Operating System

macOS Tahoe

Models Used

OpenAI GPT-4.1 for LLM

Package Versions

All at 1.3.10

Session/Room/Call IDs

No response

Proposed Solution

Additional Context

No response

Screenshots and Recordings

No response

Metadata

Assignees

No one assigned

    Labels

    bug (Something isn't working)

    Type

    No type

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests