One MCP server. 20 databases. Zero context switching.
Query PostgreSQL, MongoDB, Neo4j, Elasticsearch, Redis, and 15 more databases through a single unified interface. Built with ReScript, runs on Deno.
Modern applications use multiple databases: SQL for transactions, Redis for caching, Elasticsearch for search, vector stores for AI. Switching between CLIs, APIs, and query languages is exhausting.
Polyglot DB MCP gives Claude (and other MCP clients) direct access to all your databases through natural language:
Find all users in PostgreSQL who signed up last week, then check if they’re in the Redis cache
Search Elasticsearch for 'authentication errors' and correlate with the MongoDB audit log
Store this embedding in Qdrant and link it to the Neo4j knowledge graph
# Clone the repository
git clone https://github.com/hyperpolymath/polyglot-db-mcp.git
cd polyglot-db-mcp
# Add to Claude Code
claude mcp add polyglot-db -- deno run \
--allow-net --allow-read --allow-write --allow-env \
$(pwd)/server.js

# Using nerdctl (containerd)
nerdctl run -d --name polyglot-db \
-e POSTGRES_HOST=host.docker.internal \
-e MONGODB_URL=mongodb://host.docker.internal:27017 \
ghcr.io/hyperpolymath/polyglot-db-mcp:latest
# Using podman
podman run -d --name polyglot-db \
-e POSTGRES_HOST=host.containers.internal \
ghcr.io/hyperpolymath/polyglot-db-mcp:latest
# Using docker
docker run -d --name polyglot-db \
-e POSTGRES_HOST=host.docker.internal \
ghcr.io/hyperpolymath/polyglot-db-mcp:latest

Polyglot DB MCP supports the MCP Streamable HTTP transport (June 2025 spec) for cloud deployment:
# Deploy to Deno Deploy
deployctl deploy --project=polyglot-db-mcp server.js
# Or run HTTP mode locally
deno task serve

Once deployed, configure your MCP client to connect via HTTP:
{
"mcpServers": {
"polyglot-db": {
"transport": {
"type": "http",
"url": "https://polyglot-db-mcp.deno.dev/mcp"
}
}
}
}

HTTP endpoints:
- `GET /health` - Health check
- `GET /info` - Server information
- `POST /mcp` - MCP Streamable HTTP endpoint (JSON-RPC 2.0)
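
For a quick sanity check of a deployment, the two GET endpoints can be hit directly. A minimal JavaScript sketch (run under Deno with `--allow-net`), assuming the example Deno Deploy URL from the config above; the response bodies aren't documented here, so they are just printed:

```javascript
// check.js - probe the documented health/info endpoints of a deployment.
// Substitute your own deployment URL for the example one below.
const base = "https://polyglot-db-mcp.deno.dev";

const health = await fetch(`${base}/health`);
console.log("GET /health:", health.status, await health.text());

const info = await fetch(`${base}/info`);
console.log("GET /info:", info.status, await info.text());
```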
Create a .env file or export environment variables:
# Example: PostgreSQL + Dragonfly (Redis-compatible) + Elasticsearch
export POSTGRES_HOST=localhost POSTGRES_DATABASE=myapp
export DRAGONFLY_HOST=localhost
export ELASTICSEARCH_URL=http://localhost:9200

This MCP server supports both deployment modes:
| Mode | Supported | Permissions | Notes |
|---|---|---|---|
| Local-Agent (stdio) | ✓ Yes | `--allow-net --allow-read --allow-write --allow-env` | Primary mode. Direct database connections. |
| Hosted-HTTP | ✓ Yes | N/A (HTTP transport) | Deploy to Deno Deploy or any HTTP host. |
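
For the Local-Agent (stdio) mode, the client launches the server as a subprocess instead of connecting over HTTP. A sketch of what that entry might look like in an MCP client configuration; the `command`/`args` mirror the `claude mcp add` invocation above, while the path and environment variables are placeholders to adjust:

```json
{
  "mcpServers": {
    "polyglot-db": {
      "command": "deno",
      "args": [
        "run",
        "--allow-net", "--allow-read", "--allow-write", "--allow-env",
        "/path/to/polyglot-db-mcp/server.js"
      ],
      "env": {
        "POSTGRES_HOST": "localhost",
        "POSTGRES_DATABASE": "myapp"
      }
    }
  }
}
```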
> [!NOTE]
> Unlike CLI-wrapping MCPs, Polyglot DB MCP uses direct database drivers via npm packages. There is no shell execution.
What this means:

- Database credentials are passed via environment variables
- Connections are made directly from the MCP server to databases
- No external CLI tools are executed
- The security boundary is your database authentication

Recommendations:

- Use database users with minimal necessary permissions
- Consider read-only users for exploration/query tools
- Avoid superuser/admin credentials
- Use connection encryption (SSL/TLS) where available
- Set network ACLs on database servers
- Review tool calls before approving in your MCP client
Verify the server is working correctly:
# 1. Set at least one database connection
export POSTGRES_HOST=localhost POSTGRES_DATABASE=postgres
# 2. Start the server
deno run --allow-net --allow-read --allow-write --allow-env server.js &
SERVER_PID=$!
# 3. The MCP client should be able to call:
# - db_list (lists all supported databases)
# - db_status (shows connected databases)
# - pg_query "SELECT 1" (if PostgreSQL configured)
# 4. Cleanup
kill $SERVER_PID

Expected smoke test results:
| Check | Expected Result |
|---|---|
| Server starts | No errors, listens on stdio (or HTTP port) |
| `db_list` works | Returns list of 20 supported databases |
| `db_status` works | Shows which databases have valid connections |
| `pg_query` (if PostgreSQL) | Returns query results |
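
If you are running in HTTP mode instead of stdio, the same checks can be exercised by posting JSON-RPC 2.0 messages to the `/mcp` endpoint. A rough JavaScript sketch (run under Deno with `--allow-net`); the local port is an assumption, and depending on the server's session handling an `initialize` request may be required before `tools/call`:

```javascript
// smoke-http.js - call the db_list tool over the Streamable HTTP transport.
// Assumes `deno task serve` is running locally; adjust the port to match your setup.
const url = "http://localhost:8000/mcp";

const res = await fetch(url, {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    // Streamable HTTP servers may answer with plain JSON or an SSE stream.
    "Accept": "application/json, text/event-stream",
  },
  body: JSON.stringify({
    jsonrpc: "2.0",
    id: 1,
    method: "tools/call",
    params: { name: "db_list", arguments: {} },
  }),
});

console.log(res.status, await res.text());
```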
| Database | License | Best For | Tools |
|---|---|---|---|
| PostgreSQL | PostgreSQL (FOSS) | Complex queries, ACID, extensions (PostGIS, pgvector) | |
| MariaDB | GPL v2 (FOSS) | Web apps, MySQL compatibility | |
| SQLite | Public Domain | Local storage, embedded, single-file | |
| Database | License | Best For | Tools |
|---|---|---|---|
| MongoDB | SSPL | Flexible schemas, horizontal scaling | |
| SurrealDB | BSL/Apache 2.0 | Multi-model (doc + graph + SQL) | |
| ArangoDB | Apache 2.0 (FOSS) | Multi-model (doc + graph + KV), AQL | |
| CouchDB | Apache 2.0 (FOSS) | Document DB with HTTP API, Mango queries | |
| Database | License | Best For | Tools |
|---|---|---|---|
| Cassandra | Apache 2.0 (FOSS) | Distributed, high availability, time-series | |
| Database | License | Best For | Tools |
|---|---|---|---|
| Neo4j | GPL v3 / Commercial | Relationships, social networks, fraud detection | |
| Virtuoso | GPL v2 / Commercial | RDF triplestore, SPARQL, linked data | |
| Database | License | Best For | Tools |
|---|---|---|---|
| Dragonfly | BSL | Redis-compatible replacement (claims up to 25x throughput) | |
| Memcached | BSD (FOSS) | Simple distributed caching | |
| LMDB | OpenLDAP (FOSS) | Embedded KV with ACID | |
| Database | License | Best For | Tools |
|---|---|---|---|
| Elasticsearch | Elastic License 2.0 | Full-text search, log analytics | |
| Meilisearch | MIT (FOSS) | Instant, typo-tolerant search | |
| Database | License | Best For | Tools |
|---|---|---|---|
| Qdrant | Apache 2.0 (FOSS) | AI embeddings, semantic search | |
| Database | License | Best For | Tools |
|---|---|---|---|
| DuckDB | MIT (FOSS) | OLAP, query CSV/Parquet/JSON directly | |
Each database reads from environment variables. Only configure what you need.
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_DATABASE=mydb
POSTGRES_USER=postgres
POSTGRES_PASSWORD=secret
# Connection pool (optional)
POSTGRES_POOL_MAX=10 # Max connections
POSTGRES_IDLE_TIMEOUT=30 # Seconds before idle connection closed
POSTGRES_CONNECT_TIMEOUT=10   # Connection timeout in seconds

MONGODB_URL=mongodb://localhost:27017
MONGODB_DATABASE=mydb
# Connection pool (optional)
MONGODB_POOL_MAX=10 # Max connections
MONGODB_POOL_MIN=1 # Min connections
MONGODB_IDLE_TIMEOUT=30000 # Idle timeout in ms
MONGODB_CONNECT_TIMEOUT=10000 # Connect timeout in ms
MONGODB_SERVER_TIMEOUT=30000  # Server selection timeout in ms

ELASTICSEARCH_URL=http://localhost:9200
ELASTICSEARCH_USERNAME=elastic # optional
ELASTICSEARCH_PASSWORD=secret # optional

INFLUXDB_URL=http://localhost:8086
INFLUXDB_TOKEN=your-token
INFLUXDB_ORG=your-org
INFLUXDB_BUCKET=your-bucket

SURREAL_URL=http://localhost:8000
SURREAL_NAMESPACE=test
SURREAL_DATABASE=test
SURREAL_USERNAME=root
SURREAL_PASSWORD=root

ARANGO_URL=http://localhost:8529
ARANGO_DATABASE=_system
ARANGO_USERNAME=root
ARANGO_PASSWORD=

VIRTUOSO_ENDPOINT=http://localhost:8890/sparql
VIRTUOSO_UPDATE_ENDPOINT=http://localhost:8890/sparql-auth
VIRTUOSO_USERNAME=
VIRTUOSO_PASSWORD=
VIRTUOSO_DEFAULT_GRAPH=

COUCHDB_URL=http://localhost:5984
COUCHDB_USERNAME=admin
COUCHDB_PASSWORD=secret
COUCHDB_DATABASE=mydb

CASSANDRA_CONTACT_POINTS=localhost  # comma-separated list
CASSANDRA_DATACENTER=datacenter1
CASSANDRA_KEYSPACE=mykeyspace
CASSANDRA_USERNAME=cassandra
CASSANDRA_PASSWORD=cassandra

MARIADB_HOST=localhost
MARIADB_PORT=3306
MARIADB_USER=root
MARIADB_PASSWORD=secret
MARIADB_DATABASE=mydb
# Connection pool (optional)
MARIADB_POOL_MAX=10 # Max connections
MARIADB_ACQUIRE_TIMEOUT=10000 # Acquire timeout in ms
MARIADB_IDLE_TIMEOUT=30000 # Idle timeout in ms
MARIADB_CONNECT_TIMEOUT=10000 # Connect timeout in ms

| Tool | Description |
|---|---|
| `db_list` | List all 18 supported databases |
| `db_status` | Check which databases are currently connected |
| `db_help [database]` | Get available tools for a specific database |
Ask Claude things like:
PostgreSQL:
Create a users table with id, email, and created_at columns
Find all orders over $100 from the last month
MongoDB:
Insert a new document into the products collection
Aggregate sales by category with a $match and $group pipeline
Neo4j:
Find the shortest path between User:alice and User:bob
Show all nodes connected to the 'Engineering' department
Elasticsearch:
Search for documents containing 'critical error' in the logs index
Get the mapping for the products index
Redis/Dragonfly:
Set user:123:session with a 30 minute TTL
Get all keys matching cache:*
Qdrant:
Search for vectors similar to this embedding in the documents collection
Create a new collection with 1536 dimensions for OpenAI embeddings
Cross-Database:
Query users from PostgreSQL and cache the result in Redis
Find products in MongoDB and index them in Meilisearch
`polyglot-db-mcp/`:

- `index.js` - MCP server entry point
- `src/` - ReScript source (core adapters)
  - `Adapter.res` - Shared types
  - `bindings/` - Database client FFI
  - `adapters/` - PostgreSQL, MongoDB, SQLite, Dragonfly, Elasticsearch
- `adapters/` - JavaScript adapters (exotic databases)
- `lib/es6/` - Compiled ReScript output
- `STATE.scm` - Project state tracking
| Component | Language | Rationale |
|---|---|---|
| Core adapters (5) | ReScript | Type safety, smaller bundles |
| Exotic adapters (11) | JavaScript | Pragmatic for v1.x |
| Future (v2.0.0) | 100% ReScript | RSR Gold compliance |
> [!IMPORTANT]
> TypeScript is prohibited. We chose ReScript for its superior type inference and ML heritage.
npm install # Install ReScript compiler
npm run res:build # Compile to JavaScript
npm run res:watch   # Watch mode

deno task start
# or
deno run --allow-net --allow-read --allow-write --allow-env index.js

Create src/adapters/YourDb.res:
// SPDX-License-Identifier: MIT
open Adapter
let name = "yourdb"
let description = "Your database description"
let connect = async () => { /* ... */ }
let disconnect = async () => { /* ... */ }
let isConnected = async () => { /* ... */ }

// Handler invoked when the MCP client calls the yourdb_query tool
let queryHandler = async _params => { /* ... run the query and return a result ... */ }

let tools: Js.Dict.t<toolDefAny> = {
let dict = Js.Dict.empty()
Js.Dict.set(dict, "yourdb_query", {
description: "Execute a query",
params: makeParams([("query", stringParam("Query to run"))]),
handler: queryHandler,
})
dict
}

Then import in index.js and rebuild.
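
The registration step inside index.js isn't shown in this README, so the following is only a hypothetical JavaScript sketch for verifying that the compiled adapter module loads; in index.js itself, mirror however the existing adapters are imported and registered. The compiled output path and suffix below are assumptions based on the `lib/es6/` layout noted in the project structure:

```javascript
// verify-adapter.js (sketch) - confirm the compiled ReScript adapter loads.
// The exact path/suffix depends on the ReScript build configuration.
import * as YourDb from "./lib/es6/src/adapters/YourDb.res.js";

// ReScript top-level bindings compile to plain ES module exports,
// so the adapter's name, description, and tools dict are directly accessible.
console.log(YourDb.name, "-", YourDb.description);
console.log("tools:", Object.keys(YourDb.tools));
```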
| Version | Status | Highlights |
|---|---|---|
| 1.0.0 | Released | 16 databases, ReScript core, CI/CD |
| 1.1.0 | In Progress | Connection pooling, container images, better errors |
| 1.2.0 | Planned | Cross-database pipelines, caching helpers |
| 2.0.0 | Vision | 100% ReScript, RSR Gold compliance, FormBD integration |
FormBD is a narrative-first, reversible, audit-grade database. When its API stabilizes, polyglot-db-mcp will add:
- `formbd_query` - Execute FQL queries with provenance
- `formbd_explain` - Get plan and reasons for queries
- `formbd_introspect` - Schema and constraint inspection
- `formbd_journal` - Access audit trail and reversibility primitives
- arango-mcp - ArangoDB MCP server
- virtuoso-mcp - Virtuoso SPARQL MCP server
Dual-licensed: MIT OR AGPL-3.0-or-later — your choice.
We encourage (but don’t require) layering the Palimpsest License for ethical AI development practices.
© 2025 Jonathan D.A. Jewell (@hyperpolymath)