use elixir 1.17 #427
Merged
brainlid merged 1 commit into brainlid:main from Jan 23, 2026
Conversation
The ChatGoogleAI module uses the `get_in` macro, which is only available from Elixir 1.17 onwards. Using Elixir 1.16 throws a compile error.
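A minimal sketch of the incompatibility (hypothetical data, not the actual ChatGoogleAI code): the one-argument `get_in/1` macro was added to `Kernel` in Elixir 1.17, while the two-argument `get_in/2` function has been available for much longer.

```elixir
# Hypothetical example data; the map shape and "gemini-pro" value are
# illustrative only, not taken from ChatGoogleAI.
data = %{"model" => %{"name" => "gemini-pro"}}

# get_in/1 macro — Elixir >= 1.17 only; on 1.16 this fails to compile
# with "undefined function get_in/1":
name = get_in(data["model"]["name"])

# get_in/2 function — works on earlier Elixir versions as well:
name = get_in(data, ["model", "name"])
```

Swapping the one-argument macro form for the `get_in/2` call (or raising the required Elixir version in `mix.exs`, as this PR does) resolves the compile error on 1.16.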
761224f to 91d3317
Owner
Thanks @nbw! Appreciated!
brainlid added a commit that referenced this pull request on Jan 24, 2026
* main:
  formatting
  fix "Support reasoning_content of deepseek model" introducing UI bug for deepseek-chat model (#429)
  Add new reasoning effort values to ChatOpenAIResponses (#419)
  Add support for OpenAI reasoning/thinking events in ChatOpenAIResponses (#421)
  Add verbose_api field to ChatPerplexity and ChatMistralAI (#416)
  ChatMistralAI: Support thinking content parts (#418)
  Don't include top_p for gpt-5.2+ in ChatOpenAIResponses (#428)
  feat(GoogleChatAI): add thought_signature support for Gemini 3 function calls (#431)
  elixir 1.17 (#427)
  feat(ChatMistralAI): add support for parallel tool calls (#433)
  fix: missing error handling and fallback mechanism on server outages (#435)
  Fixes image file_id content type for ChatOpenAIResponses (#438)
  Add support for OpenAI Response API Stateful context (#425)
  Add thinking config to vertex ai (#423)
  Add req_config to ChatOpenAIResponses (#415)
brainlid added a commit that referenced this pull request on Jan 24, 2026
* main:
  Support json schema in vertex ai (#424)
  formatting
  fix "Support reasoning_content of deepseek model" introducing UI bug for deepseek-chat model (#429)
  Add new reasoning effort values to ChatOpenAIResponses (#419)
  Add support for OpenAI reasoning/thinking events in ChatOpenAIResponses (#421)
  Add verbose_api field to ChatPerplexity and ChatMistralAI (#416)
  ChatMistralAI: Support thinking content parts (#418)
  Don't include top_p for gpt-5.2+ in ChatOpenAIResponses (#428)
  feat(GoogleChatAI): add thought_signature support for Gemini 3 function calls (#431)
  elixir 1.17 (#427)
  feat(ChatMistralAI): add support for parallel tool calls (#433)
  fix: missing error handling and fallback mechanism on server outages (#435)
  Fixes image file_id content type for ChatOpenAIResponses (#438)
  Add support for OpenAI Response API Stateful context (#425)
  Add thinking config to vertex ai (#423)
  Add req_config to ChatOpenAIResponses (#415)
Context
I ran into this issue at compile time since I was using Elixir 1.16.