---
description: Integrate Gemini CLI with Grafana via Docker MCP Toolkit for natural language observability.
keywords: mcp, grafana, docker, gemini, devops
title: Connect Gemini to Grafana via MCP
summary: |
Learn how to leverage the Model Context Protocol (MCP) to interact with Grafana dashboards and datasources directly from your terminal.
levels: [intermediate]
subjects: [devops]
aliases:
- /guides/use-case/devops/
params:
time: 15 minutes
---

# Integrating Gemini CLI with Grafana via Docker MCP Toolkit

This guide shows how to connect Gemini CLI to a Grafana instance using the **Docker MCP Toolkit**. The Model Context Protocol (MCP) lets you query observability data in natural language directly from the terminal.


## Prerequisites

* **Gemini CLI** installed and authenticated.
* **Docker Desktop** with the **MCP Toolkit** extension enabled.
* An active **Grafana** instance.


## 1. Provisioning Grafana Access

The MCP server requires a **Service Account Token** to interact with the Grafana API. Service Account Tokens are preferred over personal API keys because they can be revoked independently without affecting user access, and their permissions can be scoped more narrowly.

1. Navigate to **Administration > Users and access > Service accounts** in your Grafana dashboard.
2. Create a new Service Account (e.g., `gemini-mcp-connector`).
3. Assign the **Viewer** role (or **Editor** if you require alert management capabilities).
4. Generate a new token. **Copy the token value immediately**; you won't be able to view it again.

![Create a service account in Grafana](images/create-sa-grafana.webp)
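
If you prefer to script this step, you can create the same service account and token through Grafana's HTTP API. The sketch below assumes a local instance at `http://localhost:3000` with admin basic auth; adjust the URL, credentials, and role for your environment.

```bash
# Create the service account with the Viewer role (assumes admin basic auth)
curl -s -X POST http://admin:admin@localhost:3000/api/serviceaccounts \
  -H "Content-Type: application/json" \
  -d '{"name": "gemini-mcp-connector", "role": "Viewer"}'

# Create a token for it; replace <sa-id> with the id returned by the first call.
# The "key" field in the response is the token value to copy.
curl -s -X POST http://admin:admin@localhost:3000/api/serviceaccounts/<sa-id>/tokens \
  -H "Content-Type: application/json" \
  -d '{"name": "gemini-mcp-token"}'
```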



## 2. MCP Server Configuration

The Docker MCP Toolkit provides a pre-configured Grafana catalog item. This connects the LLM to the Grafana API.


1. Open the **MCP Toolkit** in Docker Desktop.
2. Locate **Grafana** in the Catalog and add it to your active servers.
3. In the **Configuration** view, define the following:
* **Grafana URL:** The URL of your Grafana instance.
* **Service Account Token:** The token generated in the previous step.

![Configure the Grafana MCP server in Docker Desktop](images/configure-mcp-grafana.webp)
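
The Toolkit runs the Grafana MCP server as a container and injects your URL and token. As a rough standalone equivalent, you could run the server image directly. This is a sketch only; the `mcp/grafana` image name and the `GRAFANA_URL`/`GRAFANA_API_KEY` environment variables are assumptions to verify against the server's documentation.

```bash
# Rough standalone equivalent of the Toolkit catalog entry
# (image name and environment variables are assumed; check the server docs)
docker run -i --rm \
  -e GRAFANA_URL="https://grafana.example.com" \
  -e GRAFANA_API_KEY="<service-account-token>" \
  mcp/grafana
```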




## 3. Gemini CLI Integration

To register the Docker MCP gateway within Gemini, update your global configuration file located at `~/.gemini/settings.json`.

Ensure the `mcpServers` object includes the following entry:

```json
{
  "mcpServers": {
    "MCP_DOCKER": {
      "command": "docker",
      "args": [
        "mcp",
        "gateway",
        "run"
      ]
    }
  }
}
```
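
Before restarting Gemini, you can confirm that the `docker mcp` command referenced by this configuration is available on your machine. This is a quick sanity check, assuming Docker Desktop installed the MCP Toolkit CLI plugin:

```bash
# Prints the MCP Toolkit CLI help if the plugin is installed
docker mcp --help
```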


## 4. Operational Validation

Restart your Gemini CLI session to load the new configuration. Verify the status of the MCP tools by running:

```bash
> /mcp list
```

![MCP Docker added to Gemini CLI](images/mcp-docker-gemini.webp)


A successful connection will show `MCP_DOCKER` as **Ready**, exposing over 61 tools for data fetching, dashboard searching, and alert inspection.

## Use Cases

### Datasource Discovery


_List all Prometheus and Loki datasources._

![List datasources](images/gemini-grafana-list-datasources.webp)



![List datasources](images/list-datasources-result.webp)


### Logs Inspection

The sequence starts with the user prompt: "I would like to filter logs based on the device_name=edge-device-01 label. Are there logs about nginx in the last 5 minutes?" Gemini first parses the intent: it identifies the required metadata, a label (`device_name`) and a keyword (`nginx`), and recognizes that it needs external data to fulfill the request. It then calls the `list_datasources` tool through the MCP server to locate the telemetry backend.



![Filter logs based on loki labels](images/mcp-docker-grafana-loki-1.webp)

Once the system identifies Loki as the active datasource, it translates the prompt into a precise technical command. The AI autonomously constructs the LogQL query `{device_name="edge-device-01"} |= "nginx"`. This query targets logs matching the device label and returns the raw OpenTelemetry (OTel) log data, including container metadata. Instead of writing the query syntax manually, the user pulls structured data from the environment with a plain-language prompt.
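
For comparison, this is roughly what the equivalent manual query would look like with Grafana's `logcli` client. It's a sketch: it assumes `logcli` is installed and that `LOKI_ADDR` points at your Loki endpoint.

```bash
# Manual equivalent of the query Gemini constructs via MCP
export LOKI_ADDR="https://loki.example.com"   # assumed Loki endpoint
logcli query '{device_name="edge-device-01"} |= "nginx"' --since=5m --limit=100
```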


![Gemini gets the Grafana's logs from MCP docker](images/mcp-docker-grafana-loki-2.webp)

In the final step, Gemini reasons over the raw data. It filters through hundreds of lines of telemetry to confirm that Nginx logs exist, and it also identifies a `node_filesystem_device_error` in the returned data, alerting the DevOps engineer to a potential hardware or volume-mount issue on the edge node. A simple question becomes an actionable incident report.


![Gemini gives an overall about the findings](images/mcp-docker-grafana-loki-3.webp)



### Dashboard Navigation

_How many dashboards are there?_


![List dashboards](images/mcp-grafana-dashboards.webp)


_Give me a summary of the X dashboard._


![Dashboard summary](images/mcp-grafana-summary-dashboard.webp)

### Other scenarios

Imagine you're paged because an application is slow. You could work through the following steps (an example prompt session follows the list):

1. Use `list_alert_rules` to see which alert is firing.
2. Use `search_dashboards` to find the relevant application dashboard.
3. Use `get_panel_image` on a key panel to see the performance spike visually.
4. Use `query_loki_logs` to search for "error" or "timeout" messages during the time of the spike.
5. If you find the root cause, use `create_incident` to start the formal response and `add_activity_to_incident` to log your findings.
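
In practice you don't invoke these tools by name; you describe the situation and Gemini decides which MCP tools to call. A hypothetical session might start like this (the service name and wording are illustrative):

```bash
# Example prompts in an interactive Gemini CLI session (hypothetical)
> Which alert rules are currently firing for the checkout service?
> Show me error or timeout logs from that service for the last 30 minutes.
```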

## Next steps


This use case shows how operational intelligence can move away from manual dashboard hunting and complex query syntax toward a conversational troubleshooting experience.

By connecting your terminal to Grafana's telemetry via the Docker MCP Toolkit, your DevOps team can detect silent failures, like the filesystem error identified in the previous example, before they escalate into full-scale outages.


To start automating your incident response and log analysis:

- Deploy the connector: Follow the previous steps to link your local Gemini CLI to your production Grafana instance.


- Scale the solution: Explore how to share these MCP configurations across your SRE team for unified troubleshooting.

- Optimize your queries: Experiment with advanced LogQL prompts to create automated health reports.

Need help setting up your Docker MCP environment or customizing your Gemini prompts? Ask in the [Docker Community Forums](https://forums.docker.com).
