ContextCore: An MCP server for Claude (or any AI tool) that enables massive token savings through hybrid search (BM25 + Embeddings)
Updated Apr 27, 2026 - Python
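The hybrid search named above (BM25 + embeddings) is typically implemented by fusing the two rankings into one. A minimal sketch, assuming Reciprocal Rank Fusion (RRF) as the fusion rule and hypothetical document ids — this is not ContextCore's actual code:

```python
def rrf_fuse(bm25_ranking, embedding_ranking, k=60):
    """Fuse two ranked lists of doc ids with Reciprocal Rank Fusion.

    Each list contributes 1 / (k + rank) per document; documents that
    rank well in either retriever float to the top of the fused list.
    """
    scores = {}
    for ranking in (bm25_ranking, embedding_ranking):
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Toy example: BM25 and the embedding index disagree on ordering.
bm25 = ["doc_a", "doc_b", "doc_c"]
dense = ["doc_b", "doc_c", "doc_a"]
print(rrf_fuse(bm25, dense))  # doc_b wins: ranked 2nd and 1st
```

RRF is a common choice here because it needs no score normalization between the lexical and dense retrievers, only their ranks.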
Local-first Trello cache with Git-style sync — built and optimised for AI agent workflows.
Project-agnostic dual-memory MCP CLI for Claude Code, Cursor, and OpenCode (Qdrant tuned hybrid retrieval + structural memory hooks)
Auto model-switching plugin for Claude Code — routes prompts to haiku/sonnet/opus (or any custom tier) to save API tokens
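Tier routing as described above can be done with cheap client-side heuristics before any API call. A minimal sketch — the tier names match Anthropic's model families, but the thresholds and keyword markers are illustrative assumptions, not the plugin's actual logic:

```python
def pick_tier(prompt: str) -> str:
    """Route a prompt to a model tier using simple heuristics (illustrative)."""
    words = prompt.split()
    # Assumed markers of reasoning-heavy work; a real router would use richer signals.
    complex_markers = {"prove", "refactor", "architecture", "debug", "analyze"}
    normalized = {w.lower().strip(".,!?") for w in words}
    if len(words) > 200 or complex_markers & normalized:
        return "opus"    # long or reasoning-heavy prompts get the top tier
    if len(words) > 30:
        return "sonnet"  # medium-length prompts
    return "haiku"       # short lookups stay on the cheapest tier

print(pick_tier("What is 2 + 2?"))          # short -> "haiku"
print(pick_tier("Please refactor this."))   # marker word -> "opus"
```

The savings come from the fact that most prompts in a coding session are short lookups, so defaulting to the cheapest tier and escalating only on signals cuts the average per-request cost.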
Zero-dependency Q&A cache for OpenClaw - SQLite-based, no Redis or embedding API needed. Reduces token consumption with keyword matching + edit distance.
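The SQLite-plus-edit-distance approach above can be sketched with only the standard library. The class and schema below are hypothetical illustrations, not OpenClaw's actual code:

```python
import sqlite3

def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

class QACache:
    """Fuzzy Q&A cache: exact or near-match questions skip the LLM call."""

    def __init__(self, path=":memory:", max_distance=3):
        self.db = sqlite3.connect(path)
        self.db.execute("CREATE TABLE IF NOT EXISTS qa (question TEXT PRIMARY KEY, answer TEXT)")
        self.max_distance = max_distance

    def put(self, question: str, answer: str) -> None:
        self.db.execute("INSERT OR REPLACE INTO qa VALUES (?, ?)", (question, answer))
        self.db.commit()

    def get(self, question: str):
        # Scan stored questions for the closest match within the edit budget.
        best = None
        for q, a in self.db.execute("SELECT question, answer FROM qa"):
            d = levenshtein(question.lower(), q.lower())
            if d <= self.max_distance and (best is None or d < best[0]):
                best = (d, a)
        return best[1] if best else None

cache = QACache()
cache.put("How do I reset my password?", "Use the account settings page.")
print(cache.get("How do I reset my passwrd?"))  # fuzzy hit despite the typo
```

A linear scan is fine at cache-hit scale; a larger cache would prefilter candidates by keyword before computing distances.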
Auto-trigger graphify knowledge-graph queries on every LLM prompt + MCP shell delegation for Claude Code / Cowork agents.