An MCP (Model Context Protocol) server that provides real-time LLM token pricing data for 60+ AI models across 15 providers.
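A client would query such a server over MCP's JSON-RPC 2.0 transport. As a minimal sketch, the envelope below follows the MCP `tools/call` method from the protocol spec, but the tool name `get_token_price` and its argument schema are hypothetical; the actual server may expose different tools.

```python
import json

def build_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 request in the shape MCP uses for tool invocation."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",  # standard MCP method for invoking a server tool
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical tool and arguments -- check the server's tools/list response
# for the real schema before calling.
request = build_tool_call("get_token_price", {"model": "gpt-4o"})
print(request)
```

The request would be sent over the server's transport (stdio or HTTP), and the pricing data comes back in the JSON-RPC `result` field.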
Updated Mar 14, 2026 - JavaScript
OpenLLM Monitor is a plug-and-play, real-time observability dashboard for monitoring and debugging LLM API calls across OpenAI, Ollama, OpenRouter, and more. It tracks tokens, latency, cost, and retries, and lets you replay prompts. Fully open-source and self-hostable.
The Hidden Token Tax: Quantifying the True Cost of AI Browser Automation - an empirical benchmark of @playwright/cli vs @playwright/mcp vs CDP