Bug Description
I have not encountered this error before. It appears to be related to response formatting, and I could not find anything about it in the repos. Can someone investigate?
File "/app/.venv/lib/python3.11/site-packages/livekit/agents/inference/llm.py", line 341, in _run
    raise APIStatusError(
livekit.agents._exceptions.APIStatusError: provider: openai model: gpt-4.1, message: POST "https://api.openai.com/v1/chat/completions": 400 Bad Request {
    "message": "Missing required parameter: 'response_format.json_schema'.",
    "type": "invalid_request_error",
    "param": "response_format.json_schema",
    "code": "missing_required_parameter"
}.
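For context, the OpenAI Chat Completions API rejects a response_format of type "json_schema" that does not carry the nested "json_schema" object, which matches the 400 above. A minimal sketch of the rejected vs. accepted payload shapes (the schema name and fields here are illustrative, not what the LiveKit SDK actually sends):

```python
# Sketch of the request-body shapes behind this 400 (illustrative values).

# Rejected shape: type "json_schema" is declared, but the required
# nested "json_schema" object is missing -> OpenAI responds with
# "Missing required parameter: 'response_format.json_schema'."
bad_payload = {
    "model": "gpt-4.1",
    "messages": [{"role": "user", "content": "hi"}],
    "response_format": {"type": "json_schema"},
}

# Accepted shape: the nested "json_schema" object supplies a name
# and a JSON Schema describing the expected output.
good_payload = {
    "model": "gpt-4.1",
    "messages": [{"role": "user", "content": "hi"}],
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "reply",
            "schema": {
                "type": "object",
                "properties": {"text": {"type": "string"}},
                "required": ["text"],
            },
        },
    },
}

print("json_schema" in bad_payload["response_format"])   # missing key -> the 400
print("json_schema" in good_payload["response_format"])
```

If the SDK is emitting the first shape, that would explain the error regardless of user configuration.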
Expected Behavior
STT -> LLM -> TTS should work as expected.
Reproduction Steps
I think anyone using OpenAI in the LLM node should be able to reproduce this.
Operating System
macOS Tahoe
Models Used
OpenAI GPT-4.1 for LLM
Package Versions
Session/Room/Call IDs
No response
Proposed Solution
Additional Context
No response
Screenshots and Recordings
No response