## Summary

When using `FoundryChatClient.get_mcp_tool()` to attach a Foundry MCP server that requires OAuth (Work IQ User / `mcp_MeServer`) to an agent, the tool works correctly when the agent runs locally via `agent-framework`, but fails with an OBO token error when the same agent is deployed as a hosted agent on Azure AI Foundry Agent Service.
## Environment

| Package | Version |
|---|---|
| `azure-ai-agentserver-responses` | 1.0.0b5 |
| `agent-framework` | 1.2.2 |
| `agent-framework-openai` | 1.2.2 |
| `agent-framework-foundry` | 1.2.2 |
| `agent-framework-foundry-hosting` | ≥1.0.0a260429 |
| `azure-ai-projects` | ≥2.1.0 |
| `azure-identity` | ≥1.26.0b1 |
| Python | 3.12 |
| Container base | `python:3.12-slim` |
| Foundry protocol | Responses API v1.0.0 |
## Error

When the deployed agent invokes the MCP tool, the following exception is raised:

```
agent_framework.exceptions.ChatClientException: (
    '<class \'agent_framework_foundry._chat_client.FoundryChatClient\'> service failed to complete the prompt: '
    'Failed to fetch access token. Status: BadRequest. '
    'Details: "ARA OBO token request failed with status BadRequest"',
    APIError('Failed to fetch access token. Status: BadRequest. '
        'Details: "ARA OBO token request failed with status BadRequest"')
)
```
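For context on what the failing request likely looks like: an "OBO token request" is an OAuth 2.0 on-behalf-of exchange against the Microsoft identity platform. The sketch below (placeholder values throughout; this is not the actual request Foundry sends, only the shape such an exchange takes per the OBO flow) shows the parameters the exchange must carry. A `BadRequest` from the token endpoint usually means one of them is rejected, for example the incoming assertion is an app-only token with no user context, or consent is missing for the requested scope.

```python
# Sketch of an OAuth 2.0 on-behalf-of (OBO) token request body, as defined by
# the Microsoft identity platform OBO flow. All values are placeholders; this
# is illustrative only, not the Foundry service's implementation.
def build_obo_request(client_id: str, client_secret: str,
                      user_assertion: str, scope: str) -> dict:
    return {
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
        "client_id": client_id,
        "client_secret": client_secret,
        "assertion": user_assertion,            # the incoming *user* token
        "scope": scope,                         # e.g. the MCP server's exposed scope
        "requested_token_use": "on_behalf_of",
    }

body = build_obo_request("<app-client-id>", "<secret>", "<user-token>",
                         "api://<mcp-server-app-id>/.default")
print(body["grant_type"])  # → urn:ietf:params:oauth:grant-type:jwt-bearer
```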
## Reproduction

### Agent setup (simplified)

```python
# agent.py
from azure.identity.aio import DefaultAzureCredential
from agent_framework import Agent
from agent_framework.foundry import FoundryChatClient


def create_orchestrator_agent() -> Agent:
    credential = DefaultAzureCredential()
    client = FoundryChatClient(
        project_endpoint="https://<foundry-project-endpoint>",
        model="<model-deployment-name>",
        credential=credential,
    )
    mcp_tool = FoundryChatClient.get_mcp_tool(
        name="WorkIQUser",
        url="https://agent365.svc.cloud.microsoft/agents/servers/mcp_MeServer",
        project_connection_id="<mcp-connection-id>",
        approval_mode="never_require",
    )
    return Agent(
        client=client,
        name="MyAgent",
        instructions="You are a helpful assistant. Use the MCP tool to get the user's profile.",
        tools=[mcp_tool],
    )
```
### Container entrypoint (simplified)

```python
# container.py
import asyncio

from agent_framework_foundry_hosting import ResponsesHostServer

from agent import create_orchestrator_agent


async def main():
    orchestrator = create_orchestrator_agent()
    server = ResponsesHostServer(orchestrator)
    await server.run_async()


if __name__ == "__main__":
    asyncio.run(main())
```
### Hosted agent deployment

```python
from azure.ai.projects import AIProjectClient
from azure.ai.projects.models import (
    HostedAgentDefinition,
    ProtocolVersionRecord,
    AgentProtocol,
)

client = AIProjectClient(
    endpoint="<foundry-project-endpoint>",
    credential=credential,
    allow_preview=True,
)

agent = client.agents.create_version(
    agent_name="my-agent",
    definition=HostedAgentDefinition(
        container_protocol_versions=[
            ProtocolVersionRecord(protocol=AgentProtocol.RESPONSES, version="1.0.0"),
        ],
        image="<acr>.azurecr.io/my-agent:<tag>",
        environment_variables={...},
        tools=[
            {
                "type": "mcp",
                "server_label": "WorkIQUser",
                "server_url": "https://agent365.svc.cloud.microsoft/agents/servers/mcp_MeServer",
                "project_connection_id": "<mcp-connection-id>",
            }
        ],
    ),
)
```
### Invocation (triggers the error)

```python
openai_client = client.get_openai_client(agent_name="my-agent")

# This call triggers the MCP tool, which fails with the OBO error
response = openai_client.responses.create(
    input=[{"type": "message", "role": "user", "content": "What is my email?"}],
)
```
## Works locally

Running the same agent directly via `agent-framework` locally works without error:

```python
# interactive.py or notebook
orchestrator = create_orchestrator_agent()
session = orchestrator.create_session()

# MCP tool call succeeds — DefaultAzureCredential resolves to user's Azure CLI token
result = await orchestrator.run("What is my email?", session=session)
print(result.text)  # ✓ Returns user profile from Graph API
```
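One plausible explanation for the local/hosted difference: locally, `DefaultAzureCredential` resolves to the Azure CLI user token, which carries the user claims (`scp`, `upn`) that an OBO exchange requires; in the hosted container it typically resolves to a managed identity, whose app-only token has no user context. A quick way to check which kind of token is in play is to decode its claims (unsigned inspection only; the token below is fabricated purely for illustration):

```python
import base64
import json


def jwt_claims(token: str) -> dict:
    """Decode the payload segment of a JWT without verifying its signature."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))


# Fabricated, unsigned token for illustration. A real user token has `scp`/`upn`
# claims; an app-only managed-identity token does not.
header = base64.urlsafe_b64encode(json.dumps({"alg": "none"}).encode()).decode().rstrip("=")
claims = {"aud": "https://graph.microsoft.com", "scp": "User.Read", "upn": "user@contoso.com"}
body = base64.urlsafe_b64encode(json.dumps(claims).encode()).decode().rstrip("=")
token = f"{header}.{body}."

print(jwt_claims(token)["scp"])  # → User.Read
```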
## Expected behavior

The MCP tool should successfully acquire an OBO token and call the MCP server (`mcp_MeServer`) to retrieve the user's Graph API profile, the same way it works when running `agent-framework` locally.
## Questions
- What OAuth/app registration configuration is required for hosted agent MCP tools to perform OBO token exchange successfully?
- Is there a known limitation with OBO token exchange for MCP tools in hosted agents?