
Python: [Bug]: ResponsesHostServer drops oauth_consent_request and function_approval_request from toolbox MCP connectors #5535

@XiaofuHuang

Description

When using ResponsesHostServer with a Foundry toolbox containing an MCP connector that requires OAuth consent or tool approval, the server returns empty responses (output: []). The consent link and approval prompt are silently dropped.

Steps to Reproduce

  • Create a Foundry toolbox with an OAuth-protected MCP connector (e.g., Vercel MCP)
  • Use ResponsesHostServer with the toolbox
  • Send a streaming request — the response completes with output: []

Expected Behavior

The SSE stream should include the oauth_consent_request item carrying the consent link, or the mcp_approval_request item, so the client can act on it.
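
For illustration, here is a sketch of the output items the stream could surface. The mcp_approval_request shape loosely follows the OpenAI Responses API; the oauth_consent_request shape and all field values here are assumptions for this example, not a documented schema.

```python
# Hypothetical output items the SSE stream could carry instead of dropping
# them. Field names and values are illustrative assumptions; only the
# "type" values come from this issue's description.
oauth_consent_item = {
    "type": "oauth_consent_request",
    "server_label": "vercel",  # which MCP connector needs consent (assumed)
    "consent_link": "https://example.invalid/consent",  # placeholder URL
}

approval_item = {
    "type": "mcp_approval_request",
    "server_label": "vercel",  # assumed label
    "name": "some_tool",       # tool the model wants to call (assumed)
    "arguments": "{}",
}

# A client receiving either item can render the consent link or show the
# approval prompt, instead of seeing an empty `output` array.
expected_output = [oauth_consent_item, approval_item]
```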

Actual Behavior

Both content types are silently dropped; the only trace is a warning log:
WARNING:agent_framework_foundry_hosting._responses:Content type 'oauth_consent_request' is not supported yet. This is usually safe to ignore.

Root Cause

_to_outputs() in agent_framework_foundry_hosting has no handler for oauth_consent_request or function_approval_request content types — they fall through to the else warning branch.
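
The fall-through can be sketched with a simplified dispatcher (not the actual _to_outputs source; the handled-type set and return shape are assumptions for illustration):

```python
import logging

logger = logging.getLogger("agent_framework_foundry_hosting._responses")

# Assumed set of handled content types, for illustration only.
HANDLED_TYPES = {"text", "function_call", "function_result"}

def to_outputs_sketch(content_type: str) -> list[dict]:
    """Simplified stand-in for _to_outputs' dispatch logic."""
    if content_type in HANDLED_TYPES:
        return [{"type": content_type}]
    # oauth_consent_request and function_approval_request land here:
    # a warning is logged and no output item is produced.
    logger.warning(
        "Content type '%s' is not supported yet. "
        "This is usually safe to ignore.", content_type
    )
    return []
```

Calling `to_outputs_sketch("oauth_consent_request")` returns `[]`, which is why the response completes with `output: []`.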

Note: PR #5070 correctly fixed the streaming parser so these content types are now surfaced by the agent framework. The gap is only in the hosting layer's SSE emission.

The input history path _output_item_to_message() already handles oauth_consent_request, but that only runs when loading conversation history — not when emitting the current response.

Code Sample

import asyncio
import os

from agent_framework import Agent
from agent_framework.foundry import FoundryChatClient
from agent_framework_foundry_hosting import ResponsesHostServer
from azure.identity import DefaultAzureCredential
from dotenv import load_dotenv

# Load environment variables from .env file
load_dotenv()


def _patch_hosting_oauth_consent():
    """Patch _to_outputs to emit oauth_consent_request as an SSE event.

    PR #5070 (merged into agent-framework-foundry 1.2.0) fixed the streaming
    chunk parser so oauth_consent_request content is now surfaced from the
    Responses API. However, the ResponsesHostServer's _to_outputs function
    still drops it because there is no handler for this content type.
    This patch adds the missing branch.
    """
    import agent_framework_foundry_hosting._responses as hosting_mod

    _orig_to_outputs = hosting_mod._to_outputs

    async def _patched_to_outputs(stream, content):
        if content.type == "oauth_consent_request" and content.consent_link:
            label = (content.additional_properties or {}).get("server_label", "")
            text = (
                f"OAuth consent required for MCP server '{label}'. "
                f"Please visit:\n{content.consent_link}"
            )
            async for event in stream.aoutput_item_message(text):
                yield event
        else:
            async for event in _orig_to_outputs(stream, content):
                yield event

    hosting_mod._to_outputs = _patched_to_outputs


_patch_hosting_oauth_consent()


async def main():
    client = FoundryChatClient(
        project_endpoint=os.environ["FOUNDRY_PROJECT_ENDPOINT"],
        model=os.environ["AZURE_AI_MODEL_DEPLOYMENT_NAME"],
        credential=DefaultAzureCredential(),
    )

    # Load the named toolbox from the Foundry project. Omitting `version`
    # resolves the toolbox's current default version at runtime.
    toolbox = await client.get_toolbox(os.environ["TOOLBOX_NAME"])

    # Disable approval prompts for MCP tools — the hosting framework does
    # not yet support the function_approval_request content type.
    for tool in toolbox["tools"]:
        if tool.get("type") == "mcp":
            tool["require_approval"] = "never"

    agent = Agent(
        client=client,
        instructions="You are a friendly assistant. Keep your answers brief.",
        tools=toolbox,
        # History is managed by the hosting infrastructure, so the service
        # does not need to store it. Learn more at:
        # https://developers.openai.com/api/reference/resources/responses/methods/create
        default_options={"store": False},
    )

    server = ResponsesHostServer(agent)
    await server.run_async()


if __name__ == "__main__":
    asyncio.run(main())

Error Messages / Stack Traces

No response

Package Versions

agent-framework-core 1.2.0, agent-framework-foundry 1.2.0, agent-framework-foundry-hosting 1.0.0a260424, azure-ai-agentserver-responses 1.0.0b5

Python Version

Python 3.12

Additional Context

No response

Metadata

Labels

bug (Something isn't working), python
