fix(llma): extract model from response for OpenAI stored prompts (#395)
* fix: extract model from response for OpenAI stored prompts
When using OpenAI stored prompts, the model is defined in the OpenAI
dashboard rather than passed in the API request. This change adds a
fallback to extract the model from the response object when not
provided in kwargs.
Fixes PostHog/posthog#42861
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
* Apply suggestion from @greptile-apps[bot]
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
* Apply suggestion from @greptile-apps[bot]
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
* test: add tests for model extraction fallback and bump to 7.4.1
- Add 8 tests covering model extraction from response for stored prompts
- Fix utils.py to add 'unknown' fallback for consistency
- Bump version to 7.4.1
- Update CHANGELOG.md
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
* style: format utils.py with ruff
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
* fix: remove 'unknown' fallback from non-streaming to match original behavior
Non-streaming originally returned None when model wasn't in kwargs.
Streaming keeps "unknown" fallback as that was the original behavior.
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
* test: add test for None model fallback in non-streaming
Verifies that non-streaming returns None (not "unknown") when model
is not available in kwargs or response, matching original behavior.
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
---------
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
Co-authored-by: greptile-apps[bot] <165735046+greptile-apps[bot]@users.noreply.github.com>
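The fallback described in the commit can be sketched as follows. This is a minimal illustration, not the actual posthog-python code: the helper name `extract_model` and its parameters are hypothetical, but the logic matches the behavior described above (prefer `kwargs`, fall back to the response's `model` attribute, and return `None` for non-streaming when neither is available).

```python
def extract_model(kwargs, response):
    """Return the model name for a generation event.

    With OpenAI stored prompts, the model is configured in the OpenAI
    dashboard and is therefore absent from the request kwargs. In that
    case we fall back to the `model` attribute on the response object.
    """
    model = kwargs.get("model")
    if model is None and response is not None:
        # OpenAI response objects carry the resolved model name.
        model = getattr(response, "model", None)
    # May still be None (non-streaming keeps the original None behavior;
    # only the streaming path uses an "unknown" fallback).
    return model
```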
CHANGELOG.md (6 additions, 0 deletions)
@@ -1,3 +1,9 @@
+# 7.4.1 - 2025-12-19
+
+fix: extract model from response for OpenAI stored prompts
+
+When using OpenAI stored prompts, the model is defined in the OpenAI dashboard rather than passed in the API request. This fix adds a fallback to extract the model from the response object when not provided in kwargs, ensuring generations show up with the correct model and enabling cost calculations.
+
 # 7.4.0 - 2025-12-16
 
 feat: Add automatic retries for feature flag requests