The AO Toolkit collects telemetry from your infrastructure and reports it to AO
Cloud, which maps your environment and learns its behavior. This MCP endpoint
serves that enriched context to your agents. Pick your client below.
## Endpoint

```
https://mcp.automatedoperations.com
```

Same URL for every client. Transport is streamable HTTP; SSE is supported for
clients that require it.
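Under the hood, a streamable HTTP client opens a session by POSTing a JSON-RPC `initialize` request to the endpoint above. A minimal sketch of that first message, assuming the 2025-03-26 protocol revision; the client name and version are placeholders:

```python
import json

# First message of an MCP streamable HTTP session: a JSON-RPC
# `initialize` request. Protocol version and client name below
# are illustrative.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},
    },
}

# POST this body to the endpoint with the Authorization header;
# the response carries the server's name and capabilities.
print(json.dumps(initialize))
```

Every MCP-native client listed below sends this handshake for you; it is shown only so you can recognize it when debugging a custom integration.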
## Auth

```
Authorization: Bearer $AO_TOKEN
```

Tokens are scoped per agent and per environment. Provisioned during
onboarding.
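Whatever client you use, the token travels in the same bearer header. A minimal sketch, assuming `AO_TOKEN` is exported in your environment (the fallback value here is a placeholder):

```python
import os

# Assumes AO_TOKEN is set in the environment (provisioned during
# onboarding); the fallback is a placeholder for illustration.
token = os.environ.get("AO_TOKEN", "example-token")

# Every request to the MCP endpoint carries this header.
headers = {"Authorization": f"Bearer {token}"}

print(headers["Authorization"])
```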
## Anthropic SDK

```python
import os

from anthropic import Anthropic

client = Anthropic()

res = client.beta.messages.create(
    model="claude-sonnet-4-6",
    max_tokens=1024,
    messages=[{"role": "user", "content": "List failing pods in production."}],
    mcp_servers=[
        {
            "type": "url",
            "url": "https://mcp.automatedoperations.com",
            "name": "automated-operations",
            "authorization_token": os.environ["AO_TOKEN"],
        }
    ],
    # The MCP connector is in beta and must be enabled per request.
    betas=["mcp-client-2025-04-04"],
)
```
## OpenAI Agents / SDK

The Agents SDK speaks MCP natively over streamable HTTP; a separate
tool-calling shim covers non-MCP-native models.

python · Agents SDK
```python
import asyncio
import os

from agents import Agent, Runner
from agents.mcp import MCPServerStreamableHttp


async def main() -> None:
    ao = MCPServerStreamableHttp(
        name="automated-operations",
        params={
            "url": "https://mcp.automatedoperations.com",
            "headers": {"Authorization": f"Bearer {os.environ['AO_TOKEN']}"},
        },
    )
    # The server must be connected before the agent can list its tools.
    async with ao:
        agent = Agent(
            name="ops",
            instructions="You are an SRE. Use AO tools to investigate.",
            mcp_servers=[ao],
        )
        result = await Runner.run(agent, "Why is checkout p99 latency up?")
        print(result.final_output)


asyncio.run(main())
```
## Don't see your client?

If your tool speaks MCP over streamable HTTP or SSE, the configuration
above will work; just rename the keys to match your client's schema. If
it doesn't, we maintain shims for OpenAI tool-calling and Google Vertex
tool-use that wrap the same backend. Email us with your client and we
will help you wire it up the same day.
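As a sketch of that key-renaming step: the same three fields recur in every client config. The key names below are illustrative, not any specific client's schema:

```python
import json
import os

# The three fields every MCP-over-HTTP client needs, under illustrative
# key names -- rename them to match your client's schema.
ao_server = {
    "name": "automated-operations",
    "url": "https://mcp.automatedoperations.com",
    "headers": {
        "Authorization": f"Bearer {os.environ.get('AO_TOKEN', 'example-token')}"
    },
}

print(json.dumps(ao_server, indent=2))
```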