Automated Operations

MCP integration guide

Point any agent at one endpoint

The AO Toolkit collects telemetry from your infrastructure and reports it to AO Cloud, which maps your environment and learns its behavior. This MCP endpoint serves that enriched context to your agents. Pick your client below.

Endpoint

https://mcp.automatedoperations.com

Same URL for every client. Streamable HTTP transport. SSE supported for clients that require it.

Auth

Authorization: Bearer $AO_TOKEN

Tokens are scoped per agent and per environment. Provisioned during onboarding.
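Every snippet in this guide reads the token from the `AO_TOKEN` environment variable; export it once per shell before running them (the value below is a placeholder):

```shell
# Placeholder value -- substitute the token issued during onboarding.
export AO_TOKEN="your-ao-token"
```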

Quick connectivity test

bash
curl -X POST https://mcp.automatedoperations.com \
  -H "Authorization: Bearer $AO_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"jsonrpc":"2.0","id":1,"method":"tools/list"}'
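A healthy endpoint answers with a JSON-RPC result whose `tools` array describes what your agent can call. The shape follows the MCP specification; the tool name below is illustrative, not a guaranteed part of the AO catalog:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "get_host_metrics",
        "description": "…",
        "inputSchema": { "type": "object" }
      }
    ]
  }
}
```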

Claude Code

Anthropic's terminal-native coding agent.

bash · CLI (recommended)
claude mcp add --transport http automated-operations \
  https://mcp.automatedoperations.com \
  --header "Authorization: Bearer $AO_TOKEN"
json · .mcp.json (per-project)
{
  "mcpServers": {
    "automated-operations": {
      "type": "http",
      "url": "https://mcp.automatedoperations.com",
      "headers": { "Authorization": "Bearer ${AO_TOKEN}" }
    }
  }
}

Claude Desktop

Anthropic's desktop app for Mac and Windows. Its config file launches local stdio servers only, so bridge to the remote endpoint with mcp-remote:

json · ~/Library/Application Support/Claude/claude_desktop_config.json
{
  "mcpServers": {
    "automated-operations": {
      "command": "npx",
      "args": [
        "-y",
        "mcp-remote",
        "https://mcp.automatedoperations.com",
        "--header",
        "Authorization:${AO_HEADER}"
      ],
      "env": { "AO_HEADER": "Bearer <your AO token>" }
    }
  }
}

Windows path: %APPDATA%\Claude\claude_desktop_config.json

Cursor

AI-first IDE.

json · ~/.cursor/mcp.json (or .cursor/mcp.json per-project)
{
  "mcpServers": {
    "automated-operations": {
      "url": "https://mcp.automatedoperations.com",
      "headers": { "Authorization": "Bearer $AO_TOKEN" }
    }
  }
}

Windsurf

Codeium's agentic IDE.

json · ~/.codeium/windsurf/mcp_config.json
{
  "mcpServers": {
    "automated-operations": {
      "serverUrl": "https://mcp.automatedoperations.com",
      "headers": { "Authorization": "Bearer $AO_TOKEN" }
    }
  }
}

Codex CLI

OpenAI's coding CLI.

toml · ~/.codex/config.toml
[mcp_servers.automated-operations]
url = "https://mcp.automatedoperations.com"
bearer_token_env_var = "AO_TOKEN"

Zed

High-performance collaborative editor.

json · ~/.config/zed/settings.json
{
  "context_servers": {
    "automated-operations": {
      "source": "custom",
      "url": "https://mcp.automatedoperations.com",
      "headers": { "Authorization": "Bearer $AO_TOKEN" }
    }
  }
}

VS Code

GitHub Copilot Chat / agent mode.

json · .vscode/mcp.json
{
  "servers": {
    "automated-operations": {
      "type": "http",
      "url": "https://mcp.automatedoperations.com",
      "headers": { "Authorization": "Bearer ${input:ao_token}" }
    }
  },
  "inputs": [
    { "id": "ao_token", "type": "promptString", "description": "AO token", "password": true }
  ]
}

LiteLLM

Proxy and gateway for 100+ LLMs.

yaml · config.yaml (proxy)
model_list:
  - model_name: claude-sonnet-4-6
    litellm_params:
      model: anthropic/claude-sonnet-4-6
      api_key: os.environ/ANTHROPIC_API_KEY

mcp_servers:
  automated_operations:
    url: https://mcp.automatedoperations.com
    transport: http
    auth_type: bearer
    auth_value: os.environ/AO_TOKEN
    description: Automated Operations infrastructure tools

litellm_settings:
  enable_mcp: true

After this, every model behind your LiteLLM proxy gets access to AO tools. Bind the MCP server to specific virtual keys for per-tenant scoping.

python · client
import os

import litellm
from litellm import experimental_mcp_client
from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

# Open an MCP session against the AO endpoint, then convert its tools
# to OpenAI format for the completion call.
async with streamablehttp_client(
    "https://mcp.automatedoperations.com",
    headers={"Authorization": f"Bearer {os.environ['AO_TOKEN']}"},
) as (read, write, _):
    async with ClientSession(read, write) as session:
        await session.initialize()
        tools = await experimental_mcp_client.load_mcp_tools(
            session=session, format="openai"
        )
        response = await litellm.acompletion(
            model="anthropic/claude-sonnet-4-6",
            messages=[{"role": "user", "content": "Why is api-prod-3 OOMing?"}],
            tools=tools,
        )

Anthropic SDK

Direct API integration (Python / TypeScript).

typescript
import Anthropic from '@anthropic-ai/sdk';

const client = new Anthropic();

const res = await client.beta.messages.create({
  model: 'claude-sonnet-4-6',
  max_tokens: 1024,
  betas: ['mcp-client-2025-04-04'],
  messages: [{ role: 'user', content: 'List failing pods in production.' }],
  mcp_servers: [
    {
      type: 'url',
      url: 'https://mcp.automatedoperations.com',
      name: 'automated-operations',
      authorization_token: process.env.AO_TOKEN!
    }
  ]
});
python
import os
from anthropic import Anthropic

client = Anthropic()

res = client.beta.messages.create(
    model="claude-sonnet-4-6",
    max_tokens=1024,
    betas=["mcp-client-2025-04-04"],
    messages=[{"role": "user", "content": "List failing pods in production."}],
    mcp_servers=[
        {
            "type": "url",
            "url": "https://mcp.automatedoperations.com",
            "name": "automated-operations",
            "authorization_token": os.environ["AO_TOKEN"],
        }
    ],
)

OpenAI Agents / SDK

OpenAI's agent framework; speaks MCP natively over streamable HTTP.

python · Agents SDK
import os
from agents import Agent, Runner
from agents.mcp import MCPServerStreamableHttp

# The server must be connected before the agent can call its tools;
# the async context manager handles connect and cleanup.
async with MCPServerStreamableHttp(
    name="automated-operations",
    params={
        "url": "https://mcp.automatedoperations.com",
        "headers": {"Authorization": f"Bearer {os.environ['AO_TOKEN']}"},
    },
) as ao:
    agent = Agent(
        name="ops",
        instructions="You are an SRE. Use AO tools to investigate.",
        mcp_servers=[ao],
    )
    result = await Runner.run(agent, "Why is checkout p99 latency up?")
    print(result.final_output)

Don't see your client?

If your tool speaks MCP over streamable HTTP or SSE, the configuration above will work — just rename the keys to match your client's schema. If it doesn't, we maintain shims for OpenAI tool-calling and Google Vertex tool-use that wrap the same backend. Email us with your client and we will help you wire it up the same day.