
fix(llma): extract model from response for OpenAI stored prompts (#395) #23

Triggered via push: December 21, 2025 21:17
Status: Failure
Total duration: 35s
Artifacts: none

Workflow: release.yml
on: push
Job: Publish release (31s)

Annotations

1 error and 1 warning
Error (Publish release): Parameter token or opts.auth is required
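The "Parameter token or opts.auth is required" error indicates the release step's GitHub API client received no credentials. A minimal sketch of a fix in the workflow file, assuming the job runs a release tool that reads `GITHUB_TOKEN` from the environment (the step command here is a placeholder; only `permissions` and `secrets.GITHUB_TOKEN` are standard GitHub Actions features):

```yaml
# release.yml (sketch): grant write permission and expose a token to the release step.
permissions:
  contents: write          # allow the workflow to create tags/releases

jobs:
  release:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Publish release
        run: npx semantic-release   # hypothetical release command for this repo
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}   # credentials the API client reads
```

Without the `env` entry (or an explicit token input on the release action), the underlying Octokit client has neither `token` nor `opts.auth` and fails with exactly this message.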
Warning (Publish release): Could not find required-version under [tool.uv] in pyproject.toml. Falling back to latest.
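The warning says the uv setup step looked for a pinned uv version in pyproject.toml and, not finding one, installed the latest release. Pinning it there makes CI runs reproducible and silences the warning; a sketch of the fragment (the version constraint is an example, adjust to the project):

```toml
# pyproject.toml: pin which uv version the CI setup step installs.
[tool.uv]
required-version = ">=0.5.0"   # example constraint; use the version your project targets
```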