Conversation

@SergeyMenshykh SergeyMenshykh commented Sep 25, 2025

This PR adds a model for supporting background responses (long-running operations), updates the chat completion model to use it, and integrates this functionality into OpenAIResponsesChatClient.

Background responses rely on a continuation token. When background responses are enabled and supported by the chat client, the token is exposed on the ChatResponse and ChatResponseUpdate instances returned by the GetResponseAsync and GetStreamingResponseAsync methods, respectively.

The continuation token carries all the details needed to poll for a background response using the non-streaming GetResponseAsync method. It also allows resuming a streamed background response with the GetStreamingResponseAsync method if the stream is interrupted. In both cases, the continuation token obtained from the initial chat result, or from the last streamed update received before the interruption, should be supplied as input to the follow-up call.

When a background response has completed, failed, or cannot proceed further (for example, when user input is required), the continuation token returned by either method will be null, signaling to consumers that processing is complete and there is nothing to poll or resume. The result returned by either method can then be used for further processing.

Non-streaming API:

var chatOptions = new ChatOptions
{
    BackgroundResponsesOptions = new() { Allow = true }
};

// Get initial response with continuation token
var response = await chatClient.GetResponseAsync("What's the biggest animal?", chatOptions);

// Poll until the final response is received.
// (A real application would typically wait between polls, e.g. with Task.Delay.)
while (response.ContinuationToken is not null)
{
    response = await chatClient.GetResponseAsync(response.ContinuationToken, chatOptions);
}

Console.WriteLine(response);
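Because the token is self-contained, it can in principle also be persisted and polling resumed later, even from a different process. A minimal sketch, assuming hypothetical ToBytes/FromBytes serialization members on ResumptionToken (the PR describes byte serialization for the token, but the exact member names used here are illustrative):

```csharp
// Persist the token between polls. ToBytes/FromBytes are illustrative names
// standing in for the token's byte-serialization surface.
byte[] saved = response.ContinuationToken!.ToBytes().ToArray();

// ...later, possibly in another process, with a freshly constructed client...
var token = ResumptionToken.FromBytes(saved);
var resumed = await chatClient.GetResponseAsync(token, chatOptions);
```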

Streaming API:

ChatOptions chatOptions = new()
{
    BackgroundResponsesOptions = new BackgroundResponsesOptions { Allow = true },
};

string response = "";
ResumptionToken? continuationToken = null;

await foreach (var update in chatClient.GetStreamingResponseAsync("What is the capital of France?", chatOptions))
{
    response += update;

    // Simulate an interruption
    continuationToken = update.ContinuationToken;
    break;
}

// Resume streaming from the point captured by the continuation token
await foreach (var update in chatClient.GetStreamingResponseAsync(continuationToken!, chatOptions))
{
    response += update;
}

Console.WriteLine(response);
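The two APIs can also be combined: start in streaming mode, and if the stream ends while work is still in progress, finish via non-streaming polling. A sketch, reusing the chatClient and chatOptions from the examples above:

```csharp
ResumptionToken? token = null;

// Consume the stream, capturing the most recent continuation token along the way.
// If the stream ends while the background response is still running, the last
// update's token is non-null.
await foreach (var update in chatClient.GetStreamingResponseAsync("Summarize this document.", chatOptions))
{
    token = update.ContinuationToken;
}

// Finish via non-streaming polling.
ChatResponse? final = null;
while (token is not null)
{
    final = await chatClient.GetResponseAsync(token, chatOptions);
    token = final.ContinuationToken;
}
```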

CC: @westey-m

@github-actions github-actions bot added the area-ai Microsoft.Extensions.AI libraries label Sep 25, 2025
@SergeyMenshykh
Author

@dotnet-policy-service agree company="Microsoft"

@SergeyMenshykh SergeyMenshykh removed their assignment Sep 26, 2025
@SergeyMenshykh SergeyMenshykh marked this pull request as ready for review September 26, 2025 17:40
@SergeyMenshykh SergeyMenshykh requested a review from a team as a code owner September 26, 2025 17:40
@Copilot Copilot AI review requested due to automatic review settings September 26, 2025 17:40
Contributor

@Copilot Copilot AI left a comment


Pull Request Overview

This PR introduces support for background responses (long-running operations) in the AI chat framework. The feature enables clients to initiate operations that can be polled or resumed from interruption points using continuation tokens.

  • Adds ResumptionToken base class and BackgroundResponsesOptions for continuation token support
  • Updates ChatOptions, ChatResponse, and ChatResponseUpdate with background response properties
  • Implements background response functionality in OpenAIResponsesChatClient with polling and stream resumption

Reviewed Changes

Copilot reviewed 23 out of 23 changed files in this pull request and generated 3 comments.

Summary per file:

  • ResumptionToken.cs: Introduces the base class for continuation tokens with byte serialization
  • BackgroundResponsesOptions.cs: Defines options for enabling background responses
  • ChatOptions.cs: Adds continuation token and background response options properties
  • ChatResponse.cs: Adds a continuation token property for background response polling
  • ChatResponseUpdate.cs: Adds a continuation token property for stream resumption
  • ChatClientExtensions.cs: Provides extension methods for background response operations
  • OpenAIResponsesContinuationToken.cs: Implements the OpenAI-specific continuation token with JSON serialization
  • OpenAIResponsesChatClient.cs: Core implementation of background response support with polling and streaming
  • FunctionInvokingChatClient.cs: Updates to handle continuation tokens in function-calling scenarios


/// This property only takes effect if the API it's used with supports background responses.
/// If the API does not support background responses, this property will be ignored.
/// </remarks>
public bool? Allow { get; set; }
Member

Should it be Enabled instead of Allow?

Author

It could be, though there is a concern that Enabled may mislead consumers into thinking that setting the property is guaranteed to turn background responses on or off. In reality it may have no effect at all, because the chat client implementation may not support background responses, or the underlying SDK may not allow controlling the behavior. Allow is more permissive than Enabled: it communicates that the feature can be enabled or disabled if supported, and otherwise has no effect. The Enabled semantics, by contrast, imply that the value is always respected once set.

public static Task<ChatResponse> GetResponseAsync(
    this IChatClient client,
    ResumptionToken continuationToken,
    ChatOptions? options = null,
Member

What is the purpose of the options being accepted here?

Author

Good question. We need to supply the functions on subsequent calls if they were advertised in the initial call; otherwise, the requested tools won't be found by FunctionInvokingChatClient (FICC).
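In other words, the same ChatOptions, including its Tools, should be passed when resuming so that FunctionInvokingChatClient can resolve any tool calls the model makes. A minimal sketch, assuming a hypothetical GetWeather function:

```csharp
ChatOptions chatOptions = new()
{
    BackgroundResponsesOptions = new() { Allow = true },

    // Hypothetical tool, advertised on the initial call.
    Tools = [AIFunctionFactory.Create((string city) => "sunny", "GetWeather")],
};

var response = await chatClient.GetResponseAsync("What's the weather in Paris?", chatOptions);

while (response.ContinuationToken is not null)
{
    // Pass the same options so the advertised tools remain resolvable on resume.
    response = await chatClient.GetResponseAsync(response.ContinuationToken, chatOptions);
}
```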
