Google Cloud Logging MCP server
Create a powerful Model Context Protocol (MCP) server for Google Cloud Logging to collect, search, analyze, and export logs from across your GCP infrastructure and applications. This integration enables AI agents to manage logs, create metrics from log data, export logs to data warehouses, and set up alerts—all with secure service account authentication.
Setting up an MCP server
This article covers the standard steps for creating an MCP server in AI Gateway and connecting it to an AI client. The steps are the same for every integration — application-specific details (API credentials, OAuth endpoints, and scopes) are covered in the individual application pages.
Before you begin
You'll need:
- Access to AI Gateway with permission to create MCP servers
- API credentials for the application you're connecting (see the relevant application page for what to collect)
Create an MCP server
Find the API in the catalog
- Sign in to AI Gateway and select MCP Servers from the left navigation.
- Select New MCP Server.
- Search for the application you want to connect, then select it from the catalog.
Configure the server
- Enter a Name for your server — something descriptive that identifies both the application and its purpose (for example, "Zendesk Support — Prod").
- Enter a Description so your team knows what the server is for.
- Set the Timeout value. 30 seconds works for most APIs; increase to 60 seconds for APIs that return large payloads.
- Toggle Production mode on if this server will be used in a live workflow.
- Select Next.
Configure authentication
Enter the authentication details for the application. This varies by service — see the Authentication section of the relevant application page for the specific credentials, OAuth URLs, and scopes to use.
Configure security
- Set any Rate limits appropriate for your use case and the API's own limits.
- Enable Logging if you want AI Gateway to record requests and responses for auditing.
- Select Next.
Deploy
Review the summary, then select Deploy. AI Gateway provisions the server and provides a server URL you'll use when configuring your AI client.
Connect to an AI client
Once your server is deployed, add it to the AI client your team uses. See your client's documentation for setup instructions.
Tips
- You can create multiple MCP servers for the same application — for example, a read-only server for reporting agents and a read-write server for automation workflows.
- If you're unsure which OAuth scopes to request, start with the minimum read-only set and add write scopes only when needed. Most application pages include scope recommendations.
- You can edit a server's name, description, timeout, and security settings after deployment without redeploying.
Authentication
Google Cloud Logging uses OAuth 2.0 with service accounts for API access. You'll create a service account in your Google Cloud project and download a JSON key file. The service account needs the Logging Admin role (or specific permissions like logging.logEntries.create, logging.logEntries.list, logging.logs.list, logging.sinks.create, logging.metrics.create). The Google OAuth endpoint is https://oauth2.googleapis.com/token, and the integration requires scope https://www.googleapis.com/auth/cloud-platform for full Logging access.
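Under the hood, service-account authentication exchanges a signed JWT for an access token at the endpoint above. The sketch below builds only the JWT claim set to show which values come from where; the service-account email is a placeholder, and in practice a client library such as google-auth signs the JWT and performs the exchange for you.

```python
import time

TOKEN_URI = "https://oauth2.googleapis.com/token"
SCOPE = "https://www.googleapis.com/auth/cloud-platform"

def build_jwt_claims(sa_email: str, lifetime: int = 3600) -> dict:
    """Claim set for the signed JWT exchanged at Google's OAuth token endpoint."""
    now = int(time.time())
    return {
        "iss": sa_email,        # service account email from the JSON key file
        "scope": SCOPE,         # full Cloud Platform scope required by the integration
        "aud": TOKEN_URI,       # audience is the token endpoint itself
        "iat": now,
        "exp": now + lifetime,  # Google caps JWT lifetime at 3600 seconds
    }

claims = build_jwt_claims("logging-agent@my-project.iam.gserviceaccount.com")
```

The claims are then signed with the private key from the JSON key file and POSTed to the token endpoint with `grant_type=urn:ietf:params:oauth:grant-type:jwt-bearer`.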
Available tools
These tools let AI agents write logs, search for entries, create metrics from logs, configure sinks for export, and manage retention. Together they provide comprehensive observability for debugging, monitoring, and compliance.
| Tool | Description |
|---|---|
| Write log entry | Write a structured or plain-text log entry |
| List log entries | Query logs by resource type, severity, or time range |
| Tail logs | Stream new log entries in real time |
| List logs | View all available logs in a project |
| Delete logs | Delete all entries in a log to manage storage |
| Create log sink | Export logs to BigQuery, Cloud Storage, or Pub/Sub |
| Get log sink | View sink configuration and destination |
| Update log sink | Change filter or destination |
| Delete log sink | Remove a sink |
| Create log metric | Generate a metric from log entries matching a filter |
| Get log metric | View metric configuration |
| Update log metric | Change metric filter or name |
| Delete log metric | Remove a metric |
| List exclusions | View log exclusion rules |
| Create exclusion | Exclude certain log patterns to reduce volume and cost |
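Several of these tools (List log entries, sinks, metrics, exclusions) accept a Cloud Logging filter expression. As a sketch, a small helper that composes a filter for a "List log entries" call; the helper name and example values are illustrative, but the `resource.type`, `severity`, and `timestamp` fields follow the Cloud Logging filter syntax.

```python
def build_log_filter(resource_type: str, min_severity: str, since: str) -> str:
    """Compose a Cloud Logging advanced-filter string.

    severity>=X matches X and anything more severe; timestamps are RFC 3339.
    """
    return (
        f'resource.type="{resource_type}" '
        f"AND severity>={min_severity} "
        f'AND timestamp>="{since}"'
    )

log_filter = build_log_filter("gce_instance", "ERROR", "2024-01-01T00:00:00Z")
```

The same filter string can drive a metric or sink: pointing "Create log sink" at a BigQuery dataset with this filter would export only matching error entries.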
Tips
- Write logs as structured JSON with meaningful fields (service name, request ID, user ID) instead of plain text — this makes logs queryable and easier to analyze.
- Create log-based metrics for critical events (errors, authentication failures) to surface important patterns and anomalies.
- Configure alerting policies to notify your team immediately when metrics exceed critical thresholds.
- Use exclusions to filter out noisy or low-value logs (for example, health checks, debug logs) before they're stored, reducing your logging bill significantly.
- Create sinks to export logs to BigQuery for long-term analysis and compliance, and to Cloud Storage for archival and audit purposes if your compliance requirements demand it.
- Include correlation IDs and trace IDs in your logs so you can follow a user request through multiple services and understand the full execution path.
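The structured-logging and correlation tips combine naturally. A sketch of an entry body for a "Write log entry" call: the top-level `severity`, `trace`, and `jsonPayload` field names follow the Cloud Logging LogEntry schema, while the project, service, and ID values are placeholders.

```python
import json

def make_entry(project: str, service: str, message: str,
               trace_id: str, request_id: str, severity: str = "INFO") -> dict:
    """Build a structured log entry body with correlation fields."""
    return {
        "severity": severity,
        # top-level trace field lets Logging group entries across services
        "trace": f"projects/{project}/traces/{trace_id}",
        "jsonPayload": {
            "service": service,       # which service emitted the entry
            "message": message,
            "requestId": request_id,  # correlation ID for this request
        },
    }

entry = make_entry("my-project", "checkout", "payment authorized",
                   trace_id="abc123", request_id="req-1")
print(json.dumps(entry, indent=2))
```

Because the payload is structured JSON rather than a text blob, filters like `jsonPayload.service="checkout"` work directly in queries, metrics, and sinks.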
Cequence AI Gateway