Python: added generic types to ChatOptions and ChatResponse/AgentResponse for Response Format #3305
Merged
eavanvalkenburg merged 7 commits into microsoft:main on Jan 28, 2026
Conversation
Contributor
Pull request overview
This PR enhances type safety for structured outputs by adding generic type parameters to ChatOptions, ChatResponse, and AgentResponse. When using response_format with a Pydantic model, the type system now correctly infers the return type, improving IDE autocomplete and type checking.
Changes:
- Added generic type parameters (`TResponseModel`) to `ChatOptions`, `ChatResponse`, and `AgentResponse` classes
- Updated all provider-specific `ChatOptions` implementations (OpenAI, Azure, Anthropic, Bedrock, Ollama, Foundry Local, AG-UI) to support generics
- Added overloads to `get_response()` and `run()` methods to properly infer types when `response_format` is specified
- Updated sample code to use the simpler `.value` property instead of `try_parse_value()`
- Cleaned up unused code in `OpenAIAssistantsClient` (removed duplicate `as_agent` implementation and unused imports)
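The core pattern behind these changes can be sketched as follows. This is an illustrative reduction, not the framework's actual implementation; `OutputModel` is a hypothetical stand-in for a Pydantic model (only a `model_validate_json` classmethod is mimicked):

```python
import json
from dataclasses import dataclass
from typing import Generic, Optional, TypeVar

# Hypothetical stand-in for a Pydantic model used as the response_format.
@dataclass
class OutputModel:
    answer: str

    @classmethod
    def model_validate_json(cls, raw: str) -> "OutputModel":
        return cls(**json.loads(raw))

TResponseModel = TypeVar("TResponseModel")

@dataclass
class ChatResponse(Generic[TResponseModel]):
    """Response parametrized by the structured-output model, as in the PR."""
    text: str
    response_format: Optional[type] = None

    @property
    def value(self) -> Optional[TResponseModel]:
        # Parse the raw text into the requested model, if one was set.
        if self.response_format is None:
            return None
        return self.response_format.model_validate_json(self.text)

resp: "ChatResponse[OutputModel]" = ChatResponse(
    text='{"answer": "42"}', response_format=OutputModel
)
assert resp.value is not None
print(resp.value.answer)  # -> 42
```

Because the response is generic over the model type, an IDE sees `resp.value` as `OutputModel | None` rather than an untyped result.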
Reviewed changes
Copilot reviewed 14 out of 14 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| `_types.py` | Added `TResponseModel` generic parameter to `ChatOptions`, `ChatResponse`, and `AgentResponse`; added overloads for `try_parse_value`, `from_chat_response_generator`, and `from_agent_response_generator` |
| `_clients.py` | Added overloads to `get_response()` for type inference; updated `as_agent()` signature to accept generic options |
| `_agents.py` | Added overloads to `run()` for type inference; updated to properly set `response_format` on `AgentResponse` |
| `openai/_chat_client.py` | Made `OpenAIChatOptions` generic over `TResponseModel` |
| `openai/_responses_client.py` | Made `OpenAIResponsesOptions` generic over `TResponseFormat` |
| `openai/_assistants_client.py` | Made `OpenAIAssistantsOptions` generic; removed duplicate `as_agent` override and cleaned up unused imports |
| `azure/_chat_client.py` | Made `AzureOpenAIChatOptions` generic over `TResponseModel` |
| `anthropic/_chat_client.py` | Made `AnthropicChatOptions` generic over `TResponseModel` |
| `bedrock/_chat_client.py` | Made `BedrockChatOptions` generic over `TResponseModel` |
| `ollama/_chat_client.py` | Made `OllamaChatOptions` generic over `TResponseModel` |
| `foundry_local/_foundry_local_client.py` | Made `FoundryLocalChatOptions` generic over `TResponseModel` |
| `ag_ui/_types.py` | Made `AGUIChatOptions` generic over `TResponseModel` |
| `ag_ui/_client.py` | Added cast to ensure return type compatibility |
| `openai_responses_client_with_structured_output.py` | Updated sample to use `.value` property instead of `try_parse_value()` |
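The overload shape described for `get_response()` (and similarly `run()`) can be sketched like this; the signatures below are simplified stand-ins, not the actual client API:

```python
from typing import Generic, Optional, TypeVar, overload

TModel = TypeVar("TModel")

class ChatResponse(Generic[TModel]):
    def __init__(self, value: Optional[TModel] = None) -> None:
        self.value = value

# Without response_format the checker sees ChatResponse[None];
# with it, the return type narrows to ChatResponse[TModel].
@overload
def get_response(prompt: str) -> "ChatResponse[None]": ...
@overload
def get_response(prompt: str, *, response_format: type[TModel]) -> "ChatResponse[TModel]": ...

def get_response(prompt, *, response_format=None):
    # Runtime behavior is identical; only the static type differs.
    return ChatResponse(response_format() if response_format else None)

class OutputModel:
    def __init__(self) -> None:
        self.answer = "unset"

typed = get_response("hi", response_format=OutputModel)   # ChatResponse[OutputModel]
plain = get_response("hi")                                # ChatResponse[None]
print(type(typed.value).__name__)  # -> OutputModel
```

The overloads add no runtime cost; they only let the type checker pick the narrower return type when `response_format` is supplied.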
moonbox3 (Member) reviewed Jan 22, 2026
giles17 approved these changes Jan 28, 2026
TaoChenOSU reviewed Jan 28, 2026
TaoChenOSU approved these changes Jan 28, 2026
Motivation and Context
Adds a generic setup for `ChatResponse`/`AgentResponse` and the `run` and `get_response` methods.
With this change, calls that pass `response_format` are typed correctly.
This does not work for the type inference of `AgentResponse` in the sample below, because I did not want `agent` to either always become `ChatAgent[ChatOptions[OutputModel]]` or to require adding overloads to every client and explicit options settings when creating a `ChatAgent` directly. It also works for:
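A hedged sketch of the distinction described here, with illustrative names rather than the framework's real classes: passing `response_format` per call lets the checker infer `AgentResponse[OutputModel]`, while options fixed at construction would only narrow the type if the agent itself were parametrized.

```python
from typing import Generic, Optional, TypeVar, overload

TModel = TypeVar("TModel")

class AgentResponse(Generic[TModel]):
    def __init__(self, value: Optional[TModel] = None) -> None:
        self.value = value

class ChatAgent:
    @overload
    def run(self, prompt: str) -> "AgentResponse[None]": ...
    @overload
    def run(self, prompt: str, *, response_format: type[TModel]) -> "AgentResponse[TModel]": ...
    def run(self, prompt, *, response_format=None):
        return AgentResponse(response_format() if response_format else None)

class OutputModel:
    def __init__(self) -> None:
        self.answer = "unset"

agent = ChatAgent()
# Call-time response_format: inferred as AgentResponse[OutputModel].
typed = agent.run("hi", response_format=OutputModel)
print(type(typed.value).__name__)  # -> OutputModel
# A response_format baked into construction-time options would not narrow the
# type unless the agent were declared as e.g. ChatAgent[ChatOptions[OutputModel]].
```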
Fixes #3091
Description
Contribution Checklist