
Grafana Tempo MCP

The Grafana Tempo Model Context Protocol (MCP) server gives AI assistants and LLMs direct access to your distributed tracing data through TraceQL queries and other endpoints. Connect Cequence AI Gateway to your Tempo instance (self-hosted or Grafana Cloud) so your AI client can run TraceQL queries and analyze traces.

1. Overview

Grafana Tempo MCP is a remote MCP server that you point at your own Tempo stack. The server URL is not fixed; you provide your Tempo stack host (e.g. Grafana Cloud Traces or a self-hosted Tempo deployment).

  • Server URL (template): https://{your-tempo-stack-url}/tempo/api/mcp
  • Example: For Grafana Cloud, your stack URL might be something like tempo.grafana.net or the host of your Grafana Cloud Traces instance. For self-hosted Tempo, use the host where Tempo is reachable (e.g. tempo.example.com).
  • Transport: HTTP (Streamable HTTP)
  • Hosted by: You (self-hosted Tempo) or Grafana Cloud (Grafana Cloud Traces)

Tempo must have the MCP server enabled in its configuration (see Configuration below). Using this feature may send tracing data to an LLM or LLM provider; consider your data and organizational policies before enabling.
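The URL template and transport above can be sketched in code. This is a minimal sketch, assuming a hypothetical host tempo.example.com and a placeholder client name; the payload follows the MCP Streamable HTTP convention of JSON-RPC 2.0 sent over HTTP POST, and the protocol version shown is one published MCP revision (check your client and server for the version they negotiate).

```python
import json

# Hypothetical stack host -- substitute your own Tempo host (no scheme, no path).
stack_host = "tempo.example.com"

# The gateway fills the template https://{your-tempo-stack-url}/tempo/api/mcp.
mcp_url = f"https://{stack_host}/tempo/api/mcp"

# Streamable HTTP speaks JSON-RPC 2.0 over POST; an MCP session opens with an
# "initialize" request shaped roughly like this.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1"},
    },
}

print(mcp_url)
print(json.dumps(initialize))
```

In practice you never POST this by hand; your MCP client (or the gateway) performs the handshake for you once it has the URL.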

2. Supported authentication types

  • Basic auth: Supported. Required for Cequence AI Gateway. Provide a token as a basic auth credential, typically username:token or token:token. Configure during gateway creation.
  • OAuth: Not supported. Not used for Tempo MCP in the gateway.

When you add Grafana Tempo MCP in Cequence AI Gateway, you supply your Tempo stack URL and basic auth credentials (e.g. Grafana Cloud API token or your Tempo instance’s auth token).
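The basic auth credential resolves to a standard HTTP Basic header. A minimal sketch of how the username and token you supply become that header, assuming hypothetical placeholder values "user" and "token" (substitute whatever your Tempo or Grafana Cloud deployment expects):

```python
import base64

# Hypothetical credentials -- for Grafana Cloud this is typically a stack
# identifier as the username and an API token as the password; some
# self-hosted setups expect token:token instead.
username = "user"
token = "token"

# HTTP Basic auth is base64("username:password") behind a "Basic " prefix.
credentials = base64.b64encode(f"{username}:{token}".encode()).decode()
auth_header = f"Basic {credentials}"

print(auth_header)  # -> Basic dXNlcjp0b2tlbg==
```

The gateway builds this header for you from the credentials you enter; the sketch only shows what travels on the wire.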

3. What you can do with this MCP server

With the Grafana Tempo MCP server, you can:

  • Query traces with TraceQL — Run TraceQL queries against your Tempo instance to find and filter traces.
  • Analyze distributed tracing data — Let AI assistants reason over spans, latencies, and service dependencies.
  • Get LLM-powered insights — Use the MCP server from Cursor, Claude Code, or other clients (e.g. via mcp-remote) for natural-language exploration of your tracing data.

For examples and use cases, see LLM-powered insights into your tracing data: introducing MCP support in Grafana Cloud Traces.

4. Prerequisites

Before adding Grafana Tempo MCP in Cequence AI Gateway, ensure you have:

  • Access to Cequence AI Gateway (e.g. beta.aigateway.cequence.ai)
  • A Grafana Tempo instance (self-hosted or Grafana Cloud Traces) with the MCP server enabled (see Configuration).
  • Your Tempo stack URL (host only, e.g. tempo.grafana.net or tempo.example.com) — no path; the gateway uses the template https://{your-tempo-stack-url}/tempo/api/mcp.
  • Basic auth credentials — A token (or username + token) that Tempo accepts. For Grafana Cloud, this is typically an API token; for self-hosted, use whatever basic auth your deployment expects (e.g. token:token or a dedicated username and token).

Configuration (Tempo side)
Enable the MCP server in your Tempo config:

query_frontend:
  mcp_server:
    enabled: true

5. Example workflows

  • “Find traces for service X that have errors in the last hour.”
  • “Show me the slowest traces for operation Y.”
  • “Which services does trace ID Z call?”
  • Run a TraceQL query from your AI client to filter by duration, status, or attributes.
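The workflows above all reduce to TraceQL queries. As a sketch, the first three prompts might translate into a query like the one below, here shown against Tempo's HTTP search endpoint (GET /api/search?q=...); the host and service name are hypothetical placeholders, and an MCP client reaches the same queries through the server's tools rather than calling the API directly.

```python
from urllib.parse import urlencode

# Hypothetical host -- substitute your Tempo stack URL.
base = "https://tempo.example.com"

# TraceQL: traces from service "checkout" with error status, slower than 500ms.
traceql = '{ resource.service.name = "checkout" && status = error && duration > 500ms }'

# Tempo exposes TraceQL search at GET /api/search; q carries the query and
# limit caps the number of returned traces.
query_url = f"{base}/api/search?{urlencode({'q': traceql, 'limit': 20})}"

print(query_url)
```

Dropping the duration clause answers "which traces have errors"; sorting the results by duration answers "show me the slowest traces".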

6. Connecting MCP server from Cequence AI Gateway

  1. Log in to Cequence AI Gateway.
  2. Choose your tenant.
  3. Go to App catalogue.
  4. Filter by Remote MCP server.
  5. Search for Grafana Tempo MCP and then select it.
  6. Click Create MCP server.
  7. Enter your Tempo stack URL when prompted (e.g. tempo.grafana.net or your self-hosted host). Do not include https:// or the path; the gateway builds https://{your-tempo-stack-url}/tempo/api/mcp.
  8. Configure basic auth: provide the token (and username if required) as specified by your Tempo or Grafana Cloud setup.
  9. Complete the setup as prompted, select tools, and deploy.

Use the generated MCP server URL in your client as described in the Client Configuration docs. For detailed UI steps and screenshots, see Create a third-party MCP Server.
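For stdio-only clients, mcp-remote can bridge to the HTTP endpoint. A hedged sketch of what such a client configuration might look like, assuming the common "mcpServers" JSON convention used by Claude Desktop-style config files and a hypothetical server URL; check your client's documentation for its exact schema:

```python
import json

# Substitute the MCP server URL generated by the gateway (or your direct
# Tempo endpoint) -- the value below is a placeholder.
server_url = "https://tempo.example.com/tempo/api/mcp"

# mcp-remote launches as a local stdio process and proxies to the remote
# Streamable HTTP server at server_url.
config = {
    "mcpServers": {
        "grafana-tempo": {
            "command": "npx",
            "args": ["mcp-remote", server_url],
        }
    }
}

print(json.dumps(config, indent=2))
```

Clients with native Streamable HTTP support can use the generated URL directly and skip the bridge entirely.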

7. Additional information

  • Configuration: The MCP server is enabled under query_frontend.mcp_server.enabled: true in Tempo. See Tempo MCP server docs.
  • Data and privacy: Enabling this feature can cause tracing data to be sent to an LLM or LLM provider. Consider the sensitivity of your traces and organizational policies.
  • Self-hosted: For self-hosted Tempo, the MCP endpoint is often at http://localhost:3200/api/mcp when running locally; in production, use your public or internal host in the URL template.
  • Official documentation: Tempo MCP server, MCP documentation, TraceQL.