Background and motivation
The Microsoft Agent Framework (microsoft/agent-framework) needs native support for async (long-running) tool/function calls — tools where the function dispatches work to an external system and the result arrives later (seconds to hours), rather than being available immediately.
Today, FunctionInvokingChatClient treats all tool invocations as synchronous: invoke the function, get the result, feed it back to the LLM, all within a single GetResponseAsync call. This works for fast functions but breaks down for real-world scenarios such as (a non-exhaustive list):
- Submitting an order to a fulfillment system (result arrives via webhook minutes later)
- Triggering a CI/CD pipeline (completes asynchronously)
- Requesting a human review or external approval that isn't a simple accept/reject gate
- Calling an external API with a callback pattern
The existing HITL approval pattern (ApprovalRequiredAIFunction / FunctionApprovalRequestContent / FunctionApprovalResponseContent) is architecturally very close to what's needed. It gates before the tool executes (approve/reject), then the framework executes the tool on approval. The async tool pattern is the complement: the tool does execute (dispatching the request), but the final result is not yet available — the caller must re-invoke later with the result.
Tracking issue in agent-framework: microsoft/agent-framework#4265
Additional context
This proposal originates from the Microsoft Agent Framework (microsoft/agent-framework).
The content types and AsyncAIFunction wrapper proposed here are the foundation that the agent-framework issue builds upon, just as FunctionApprovalRequestContent / ApprovalRequiredAIFunction are the foundation for the HITL approval feature today.
API Proposal
New content types (in Microsoft.Extensions.AI.Abstractions)
namespace Microsoft.Extensions.AI;
/// <summary>
/// Represents a request indicating that a function has been dispatched asynchronously
/// and the final result is not yet available. The caller should provide the result
/// later via <see cref="AsyncFunctionResponseContent"/>.
/// </summary>
[Experimental("MEAI001")]
public class AsyncFunctionRequestContent : AIContent
{
/// <summary>
/// Initializes a new instance of <see cref="AsyncFunctionRequestContent"/>.
/// </summary>
/// <param name="id">The unique identifier for this async request.</param>
/// <param name="functionCall">The function call that was dispatched.</param>
/// <param name="dispatchResult">The result returned by the dispatch invocation (e.g., a tracking/correlation ID).</param>
public AsyncFunctionRequestContent(string id, FunctionCallContent functionCall, object? dispatchResult = null);
/// <summary>Gets the unique identifier for this async function request.</summary>
public string Id { get; }
/// <summary>Gets the function call that was dispatched.</summary>
public FunctionCallContent FunctionCall { get; }
/// <summary>Gets the result of the dispatch invocation (e.g., tracking ID, correlation ID).</summary>
public object? DispatchResult { get; }
/// <summary>
/// Creates an <see cref="AsyncFunctionResponseContent"/> for this request with the provided result.
/// </summary>
/// <param name="result">The final result from the external system.</param>
/// <returns>A new <see cref="AsyncFunctionResponseContent"/> paired to this request.</returns>
public AsyncFunctionResponseContent CreateResponse(object? result);
}
/// <summary>
/// Represents the response to an <see cref="AsyncFunctionRequestContent"/>,
/// providing the final result of the asynchronously dispatched function.
/// </summary>
[Experimental("MEAI001")]
public class AsyncFunctionResponseContent : AIContent
{
/// <summary>
/// Initializes a new instance of <see cref="AsyncFunctionResponseContent"/>.
/// </summary>
/// <param name="id">The unique identifier matching the original <see cref="AsyncFunctionRequestContent.Id"/>.</param>
/// <param name="result">The final result from the external system.</param>
/// <param name="functionCall">The original function call that was dispatched.</param>
public AsyncFunctionResponseContent(string id, object? result, FunctionCallContent functionCall);
/// <summary>Gets the unique identifier matching the original request.</summary>
public string Id { get; }
/// <summary>Gets the final result from the external system.</summary>
public object? Result { get; }
/// <summary>Gets the original function call that was dispatched.</summary>
public FunctionCallContent FunctionCall { get; }
}
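To make the request/response pairing concrete, here is a minimal sketch of how a caller might use the two content types together. The identifiers and payload values ("call_1", "req_1", "TRACK-42", "Delivered") are made up for illustration:

```csharp
using System.Collections.Generic;
using Microsoft.Extensions.AI;

// A function call as it would arrive from the LLM (hypothetical values).
var call = new FunctionCallContent("call_1", "SubmitOrder",
    new Dictionary<string, object?> { ["orderId"] = "123" });

// The framework emits this after the dispatch phase; DispatchResult carries
// whatever the inner function returned (e.g., a tracking ID).
var request = new AsyncFunctionRequestContent("req_1", call, dispatchResult: "TRACK-42");

// Later, once the external system reports completion, the caller pairs the
// final result back to the original request:
AsyncFunctionResponseContent response = request.CreateResponse("Delivered");
// response.Id matches request.Id, which is how the framework correlates the pair.
```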
New wrapper type (in Microsoft.Extensions.AI)
namespace Microsoft.Extensions.AI;
/// <summary>
/// Wraps an <see cref="AIFunction"/> to indicate it is an async (long-running) function.
/// When <see cref="FunctionInvokingChatClient"/> encounters this wrapper, it will:
/// <list type="number">
/// <item>Execute the inner function (dispatch phase — e.g., sends request to external system)</item>
/// <item>Emit <see cref="AsyncFunctionRequestContent"/> with the dispatch result instead of <see cref="FunctionResultContent"/></item>
/// <item>When <see cref="AsyncFunctionResponseContent"/> is received later, invoke the optional result callback and emit <see cref="FunctionResultContent"/></item>
/// </list>
/// </summary>
[Experimental("MEAI001")]
public class AsyncAIFunction : DelegatingAIFunction
{
/// <summary>
/// Initializes a new instance of <see cref="AsyncAIFunction"/>.
/// </summary>
/// <param name="innerFunction">The function to wrap. Its invocation dispatches the async work.</param>
/// <param name="resultCallback">
/// Optional callback invoked when the async result arrives. Transforms the raw external result
/// before it is provided to the LLM as <see cref="FunctionResultContent"/>.
/// If <see langword="null"/>, the result is forwarded as-is.
/// </param>
public AsyncAIFunction(
AIFunction innerFunction,
Func<object?, CancellationToken, ValueTask<object?>>? resultCallback = null);
/// <summary>Gets the optional callback for transforming async results.</summary>
public Func<object?, CancellationToken, ValueTask<object?>>? ResultCallback { get; }
}
Changes to FunctionInvokingChatClient (in Microsoft.Extensions.AI)
// In FunctionInvokingChatClient's function invocation loop:
// When processing a tool call from the LLM response:
+// If the tool is an AsyncAIFunction:
+// 1. Invoke the inner function (dispatch)
+// 2. Instead of adding FunctionResultContent, add AsyncFunctionRequestContent
+// with the dispatch result and return it to the caller
+//
+// When processing incoming messages containing AsyncFunctionResponseContent:
+// 1. Match by Id to the original AsyncFunctionRequestContent
+// 2. If AsyncAIFunction.ResultCallback is not null, invoke it with the result
+// 3. Add FunctionResultContent with the (optionally transformed) result
+// 4. Continue the LLM conversation loop
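The dispatch branch described above could be sketched roughly as follows. This is illustrative pseudocode, not the actual FunctionInvokingChatClient internals; names such as responseContents and callContent are assumptions:

```csharp
// Illustrative sketch of the proposed branch in the invocation loop.
if (tool is AsyncAIFunction asyncFunction)
{
    // 1. Dispatch phase: run the inner function now. Its return value
    //    (e.g., a tracking/correlation ID) becomes the DispatchResult.
    object? dispatchResult = await asyncFunction.InvokeAsync(
        new AIFunctionArguments(callContent.Arguments), cancellationToken);

    // 2. Surface an async request instead of a final FunctionResultContent;
    //    the caller re-invokes later with an AsyncFunctionResponseContent.
    responseContents.Add(new AsyncFunctionRequestContent(
        Guid.NewGuid().ToString("N"), callContent, dispatchResult));
}
```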
API Usage
Defining an async tool and using it with an agent
using System.ComponentModel;
using System.Text.Json;
using Microsoft.Extensions.AI;
// 1. Define a function that dispatches async work
[Description("Submits an order to the fulfillment system")]
static async Task<string> SubmitOrder(string orderId, string[] items)
{
// Sends request to external system, returns tracking ID immediately
var trackingId = await fulfillmentClient.SubmitAsync(orderId, items);
return trackingId; // This becomes the DispatchResult
}
// 2. Wrap with AsyncAIFunction (just like ApprovalRequiredAIFunction wraps for approval)
// Optional callback transforms the raw external result before feeding to LLM
var asyncTool = new AsyncAIFunction(
AIFunctionFactory.Create(SubmitOrder),
resultCallback: async (rawResult, ct) =>
{
// Transform the external system's result into something meaningful for the LLM
var orderStatus = JsonSerializer.Deserialize<OrderStatus>(rawResult?.ToString() ?? "{}");
return $"Order confirmed. Status: {orderStatus?.Status}, ETA: {orderStatus?.EstimatedDelivery}";
});
// 3. Create a chat client with function invocation enabled; the async tool is
//    supplied per call via ChatOptions.Tools (e.g., new ChatOptions { Tools = [asyncTool] })
IChatClient chatClient = new ChatClientBuilder(innerClient)
.UseFunctionInvocation()
.Build();
Caller-side: detect async requests, collect results, re-invoke
This mirrors the FunctionApprovalRequestContent loop pattern exactly:
// STEP 1: Send messages to the chat client
List<ChatMessage> messages = [new(ChatRole.User, "Submit order #123 with items A, B, C")];
ChatResponse response = await chatClient.GetResponseAsync(messages);
// STEP 2: Detect async function requests in the response
// (same pattern as detecting FunctionApprovalRequestContent)
List<AsyncFunctionRequestContent> asyncRequests = response.Messages
.SelectMany(m => m.Contents)
.OfType<AsyncFunctionRequestContent>()
.ToList();
// STEP 3: Loop until all async calls are resolved
// (mirrors the approval while-loop from Agent_Step04_UsingFunctionToolsWithApprovals)
while (asyncRequests.Count > 0)
{
List<ChatMessage> asyncResponses = new();
foreach (var asyncRequest in asyncRequests)
{
Console.WriteLine($"Async tool dispatched: {asyncRequest.FunctionCall.Name}");
Console.WriteLine($" Tracking ID: {asyncRequest.DispatchResult}");
// Wait for external system to provide the result
// (poll, webhook, queue, event, etc. — this is application-specific)
string externalResult = await WaitForExternalResult(asyncRequest.DispatchResult);
// Create response — same pattern as approvalRequest.CreateResponse(approved)
asyncResponses.Add(new ChatMessage(ChatRole.Tool,
[asyncRequest.CreateResponse(externalResult)]));
}
// STEP 4: Send responses back (same as sending approval responses)
messages.AddRange(response.Messages);
messages.AddRange(asyncResponses);
response = await chatClient.GetResponseAsync(messages);
// Check for more async requests (LLM may chain multiple async tools)
asyncRequests = response.Messages
.SelectMany(m => m.Contents)
.OfType<AsyncFunctionRequestContent>()
.ToList();
}
Console.WriteLine($"Final response: {response}");
Streaming variant
List<AIContent> pendingAsyncRequests = [];
do
{
pendingAsyncRequests.Clear();
List<ChatResponseUpdate> updates = [];
await foreach (var update in chatClient.GetStreamingResponseAsync(messages))
{
updates.Add(update);
foreach (AIContent content in update.Contents)
{
switch (content)
{
case AsyncFunctionRequestContent asyncRequest:
// Collect for processing after stream completes
pendingAsyncRequests.Add(asyncRequest);
Console.WriteLine($"[Async] {asyncRequest.FunctionCall.Name} dispatched → {asyncRequest.DispatchResult}");
break;
case TextContent textContent:
Console.Write(textContent.Text);
break;
}
}
}
// After streaming completes, add the streamed assistant messages to the
// history, then wait for the external results
messages.AddRange(updates.ToChatResponse().Messages);
for (int i = 0; i < pendingAsyncRequests.Count; i++)
{
var asyncRequest = (AsyncFunctionRequestContent)pendingAsyncRequests[i];
string result = await WaitForExternalResult(asyncRequest.DispatchResult);
messages.Add(new ChatMessage(ChatRole.Tool, [asyncRequest.CreateResponse(result)]));
}
}
while (pendingAsyncRequests.Count > 0);
Side-by-side comparison with existing HITL approval pattern
| Aspect | HITL Approval (exists today) | Async Tool (proposed) |
|---|---|---|
| Wrapper | new ApprovalRequiredAIFunction(innerFunc) | new AsyncAIFunction(innerFunc, callback?) |
| On LLM tool request | Function does NOT execute | Function DOES execute (dispatch) |
| Content in response | FunctionApprovalRequestContent | AsyncFunctionRequestContent |
| Caller creates response | request.CreateResponse(approved: true) | request.CreateResponse(resultPayload) |
| Response content | FunctionApprovalResponseContent | AsyncFunctionResponseContent |
| Framework on response | Executes or skips the tool | Invokes callback, creates FunctionResultContent |
| Caller loop | Parse requests → user decides → re-invoke | Parse requests → get results → re-invoke |
Without the callback (simple forwarding)
// When no transformation is needed, omit the callback — result forwards as-is
var asyncTool = new AsyncAIFunction(AIFunctionFactory.Create(SubmitOrder));
Alternative Designs
No response
Risks
No response