Description
When using FunctionInvokingChatClient with OpenRouter's Gemini 3 Preview models (google/gemini-3-flash-preview, google/gemini-3-pro-preview), tool call results fail with HTTP 400 Bad Request.
The root cause is that OpenRouter requires reasoning_details to be preserved and passed back when sending tool results to Gemini 3 reasoning models. The FunctionInvokingChatClient doesn't preserve this field when constructing the tool result message.
From OpenRouter's documentation:
Reasoning Details must be preserved when using multi-turn tool calling.
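Concretely, the assistant turn that requested the tool call comes back with a `reasoning_details` array, and OpenRouter expects that array to be sent back unmodified on the same assistant message in the follow-up request. A rough sketch of the wire shape (pseudo-JSON; field contents are illustrative, not copied from a real response):

```json
{
  "model": "google/gemini-3-flash-preview",
  "messages": [
    { "role": "user", "content": "What time is it?" },
    {
      "role": "assistant",
      "content": null,
      "tool_calls": [
        { "id": "call_1", "type": "function",
          "function": { "name": "GetCurrentTime", "arguments": "{}" } }
      ],
      "reasoning_details": [ "…echoed back exactly as received…" ]
    },
    { "role": "tool", "tool_call_id": "call_1", "content": "14:05:09" }
  ]
}
```

FunctionInvokingChatClient builds this follow-up request itself, so the application never gets a chance to copy `reasoning_details` across.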
Reproduction Steps
Save and run this single-file .NET 10 script:
#!/usr/bin/env dotnet run
#:package Microsoft.Extensions.AI@10.1.1-preview.1.25612.2
#:package Microsoft.Extensions.AI.OpenAI@10.1.1-preview.1.25612.2
#:package OpenAI@2.8.0
using System.ClientModel;
using System.ComponentModel;
using Microsoft.Extensions.AI;
using OpenAI;
var apiKey = Environment.GetEnvironmentVariable("OPENROUTER_API_KEY")
?? throw new InvalidOperationException("Set OPENROUTER_API_KEY environment variable");
var model = "google/gemini-3-flash-preview"; // FAILS with 400
// var model = "google/gemini-2.5-flash"; // WORKS
var options = new OpenAIClientOptions { Endpoint = new Uri("https://openrouter.ai/api/v1") };
var client = new OpenAI.Chat.ChatClient(model, new ApiKeyCredential(apiKey), options)
.AsIChatClient()
.AsBuilder()
.UseFunctionInvocation()
.Build();
var chatOptions = new ChatOptions { Tools = [AIFunctionFactory.Create(GetCurrentTime)] };
var messages = new List<ChatMessage> { new(ChatRole.User, "What time is it?") };
var response = await client.GetResponseAsync(messages, chatOptions);
Console.WriteLine(response.Text);
[Description("Gets the current time")]
static string GetCurrentTime() => DateTime.Now.ToString("HH:mm:ss");
Run with:
export OPENROUTER_API_KEY=your_key_here
dotnet run GeminiToolCallRepro.cs
Expected behavior
The tool call completes successfully and returns the time.
Actual behavior
ERROR: 400 - Service request failed.
Status: 400 (Bad Request)
The error occurs after:
- Initial request sent with tool definition
- Model responds with tool call request (including reasoning_details)
- FunctionInvokingChatClient executes the tool and sends the result back
- 400 error because reasoning_details wasn't preserved in the follow-up request
Configuration
- .NET 10.0 Preview
- Microsoft.Extensions.AI 10.1.1-preview.1.25612.2
- Microsoft.Extensions.AI.OpenAI 10.1.1-preview.1.25612.2
- OpenAI 2.8.0
- OpenRouter API (OpenAI-compatible endpoint)
- Models affected: google/gemini-3-flash-preview, google/gemini-3-pro-preview
Related
TextReasoningContent.ProtectedData exists for storing reasoning data, but FunctionInvokingChatClient doesn't automatically preserve it during tool call loops.
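Until the loop preserves reasoning content automatically, one possible workaround is to drop UseFunctionInvocation() and drive the tool loop by hand, appending the model's full assistant messages to the history so any reasoning content they carry stays in the conversation. A minimal sketch against the Microsoft.Extensions.AI abstractions, reusing the client, chatOptions, and GetCurrentTime from the repro above (whether the underlying OpenAI client then serializes reasoning_details back onto the wire is an assumption, not something this sketch verifies):

```csharp
// client here is built WITHOUT .UseFunctionInvocation(), so we own the loop.
var history = new List<ChatMessage> { new(ChatRole.User, "What time is it?") };

while (true)
{
    var resp = await client.GetResponseAsync(history, chatOptions);

    // Keep every message the model produced, including any reasoning content,
    // instead of letting a middleware reconstruct a bare tool-result turn.
    history.AddRange(resp.Messages);

    var calls = resp.Messages
        .SelectMany(m => m.Contents)
        .OfType<FunctionCallContent>()
        .ToList();
    if (calls.Count == 0)
    {
        Console.WriteLine(resp.Text);
        break;
    }

    foreach (var call in calls)
    {
        // Only one tool in this repro; dispatch on call.Name in real code.
        string result = GetCurrentTime();
        history.Add(new ChatMessage(ChatRole.Tool,
            [new FunctionResultContent(call.CallId, result)]));
    }
}
```

This only helps if the IChatClient returned by AsIChatClient() round-trips the reasoning content it received; if it drops reasoning_details at the serialization layer, the fix has to land there as well.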