Description
Calling a Prompt Agent (created in the Foundry portal) via `FoundryAgent` fails on agent-framework 1.2.0, 1.2.1, and 1.2.2 with a 400 from the Responses API:

`Missing required parameter: 'model'.`
The same code works on 1.1.1. The agent is a Prompt Agent (kind: prompt), version 2, created in the Foundry portal. `agent_version="2"` is passed explicitly.

The Responses API should derive the model from the `agent_reference` server-side, so `model` should not be required. It looks like the request being built no longer attaches `agent_reference` to `extra_body` (or attaches it incorrectly) for the Prompt Agent path. Suspect PR #5447 ("Update FoundryAgent for hosted agent sessions") introduced the regression — the stack trace goes through `agent_framework_openai._chat_client._get_response`, which doesn't know about `agent_reference` on its own.
Versions
- agent-framework: 1.2.2 (also reproduces on 1.2.1 and 1.2.0)
- agent-framework-core: 1.2.2
- agent-framework-foundry: 1.2.2
- Python: [e.g. 3.13.x]
- OS: Windows [version]
- Last known good: 1.1.1
Reproduction
```python
import os
import asyncio

from dotenv import load_dotenv, find_dotenv
from agent_framework.foundry import FoundryAgent
from azure.identity import ClientSecretCredential

load_dotenv(find_dotenv(), override=True)

endpoint = os.getenv("FOUNDRY_PROJECT_ENDPOINT")
credential = ClientSecretCredential(
    tenant_id=os.getenv("AZURE_TENANT_ID"),
    client_id=os.getenv("AZURE_CLIENT_ID"),
    client_secret=os.getenv("AZURE_CLIENT_SECRET"),
)


async def main():
    agent = FoundryAgent(
        project_endpoint=endpoint,
        agent_name="TestAgent",
        agent_version="2",
        credential=credential,
    )
    result = await agent.run("What is the capital of France?")
    print(result)


asyncio.run(main())
```