A JSON-based RAG chatbot that helps team members instantly find the right person to contact for any issue. Built with FastAPI, BM25 retrieval, and Portkey AI — themed with the Portkey Design System.
```
User Query
      │
      ▼
┌──────────────┐      ┌───────────────────┐      ┌────────────────────┐
│  BM25 Index  │─────▶│ Retrieve top-k    │─────▶│ Inject context     │
│ (built from  │      │ matching team     │      │ into system        │
│  JSON KB)    │      │ entries by score  │      │ prompt             │
└──────────────┘      └───────────────────┘      └────────┬───────────┘
                                                          │
                                                          ▼
                                               ┌────────────────────┐
                                               │   Portkey Chat     │
                                               │  Completion (SSE   │
                                               │   streaming)       │
                                               └────────┬───────────┘
                                                        │
                                                        ▼
                                                Streamed Response
```
- Knowledge Base — A flat JSON file (`knowledge_base.json`) stores team members with their name, role, and a natural-language description of what they handle.
- BM25 Retrieval — At startup, the descriptions are tokenized into a `BM25Okapi` index. Each user query is scored against all entries, and the top matches are selected as context.
- LLM Generation — The retrieved context is injected into a system prompt and sent to an LLM via Portkey, which streams a response back to the UI over SSE.
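The retrieval flow above can be sketched in plain Python. The app itself delegates scoring to `BM25Okapi` from the `rank-bm25` package; the minimal BM25 implementation below is an illustrative stand-in that shows the same score-then-inject flow, using this README's sample team entries:

```python
import math
import re
from collections import Counter

def tokenize(text):
    # Lowercase word tokenization, roughly what a BM25 index is built from.
    return re.findall(r"[a-z0-9]+", text.lower())

class MiniBM25:
    """Tiny BM25 scorer (illustrative stand-in for rank-bm25's BM25Okapi)."""

    def __init__(self, docs, k1=1.5, b=0.75):
        self.docs = [tokenize(d) for d in docs]
        self.k1, self.b = k1, b
        self.N = len(self.docs)
        self.avgdl = sum(len(d) for d in self.docs) / self.N
        # Document frequency of each term across the corpus.
        self.df = Counter(t for d in self.docs for t in set(d))

    def score(self, query, i):
        doc = self.docs[i]
        tf = Counter(doc)
        s = 0.0
        for t in tokenize(query):
            if t not in tf:
                continue
            idf = math.log(1 + (self.N - self.df[t] + 0.5) / (self.df[t] + 0.5))
            denom = tf[t] + self.k1 * (1 - self.b + self.b * len(doc) / self.avgdl)
            s += idf * tf[t] * (self.k1 + 1) / denom
        return s

# Sample entries mirroring knowledge_base.json.
team = [
    {"name": "Visarg", "role": "MCP Specialist",
     "description": "MCP server config, integration debugging, protocol, tool registry"},
    {"name": "Avanish", "role": "Solution Architect",
     "description": "Deployments, infra, CI/CD, Docker, K8s, DNS, SSL, DevOps"},
    {"name": "Ayush", "role": "CTO",
     "description": "Engineering escalations, critical decisions, production incidents"},
]

index = MiniBM25([m["description"] for m in team])
query = "MCP server is down, who do I contact?"
ranked = sorted(range(len(team)), key=lambda i: index.score(query, i), reverse=True)

# Inject the best match into the system prompt, as the app does before calling Portkey.
top = team[ranked[0]]
context = f"{top['name']} ({top['role']}): {top['description']}"
system_prompt = f"You help route questions to the right team member.\nRelevant contacts:\n{context}"
```

With the sample query, the MCP-related entry scores highest and its owner ends up in the system prompt.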
| Name | Role | Handles |
|---|---|---|
| Visarg | MCP Specialist | MCP server config, integration debugging, protocol, tool registry |
| Avanish | Solution Architect | Deployments, infra, CI/CD, Docker, K8s, DNS, SSL, DevOps |
| Ayush | CTO | Engineering escalations, critical decisions, production incidents |
```
FDE_helper/
├── app.py                  # FastAPI backend (BM25 retrieval + Portkey streaming)
├── knowledge_base.json     # Team knowledge base (flat JSON)
├── portkey-theme.css       # Portkey Design System reference
├── templates/
│   └── index.html          # Chat UI (Portkey-themed, vanilla HTML/CSS/JS)
├── assets/
│   ├── chat_home.png       # Screenshot — home screen
│   └── chat_conversation.png  # Screenshot — active conversation
├── requirements.txt        # Python dependencies
├── .env                    # Environment config (not committed)
└── README.md
```
- Python 3.10+
- A Portkey API key with a configured virtual key
```bash
git clone git@github.com:harshitkumar454/FDE-Helper.git
cd FDE_helper
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```

Create a `.env` file in the project root:

```
PORTKEY_API_KEY=your_portkey_api_key_here
MODEL=@your-virtual-key/provider/model-name
```

Run the server:

```bash
python app.py
```

Open http://localhost:8080 in your browser.
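At startup the app reads these two settings from the environment. A hedged sketch of that lookup (the real `app.py` may load `.env` via `python-dotenv`; the `setdefault` values below are just the placeholders from the sample `.env`):

```python
import os

# Placeholder defaults mirroring the sample .env above — replace with real values.
os.environ.setdefault("PORTKEY_API_KEY", "your_portkey_api_key_here")
os.environ.setdefault("MODEL", "@your-virtual-key/provider/model-name")

api_key = os.getenv("PORTKEY_API_KEY")
model = os.getenv("MODEL")
if not api_key or not model:
    # Failing fast at startup beats a confusing error on the first request.
    raise RuntimeError("PORTKEY_API_KEY and MODEL must be set")
```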
| Method | Path | Description |
|---|---|---|
| GET | `/` | Serves the chat UI |
| POST | `/chat` | Streaming chat endpoint (SSE) |
| GET | `/team` | Returns the knowledge base as JSON |
Request:

```json
{
  "messages": [
    { "role": "user", "content": "MCP server is down, who do I contact?" }
  ]
}
```

Response (Server-Sent Events):

```
data: {"content": "You"}
data: {"content": " should"}
data: {"content": " contact"}
data: {"content": " Visarg..."}
data: [DONE]
```
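A client consumes this stream by reading `data:` lines and concatenating the `content` chunks until the `[DONE]` sentinel. A minimal sketch of that parsing (the helper name is hypothetical, not part of the app):

```python
import json

def parse_sse_chunk(line):
    """Extract the text chunk from one SSE 'data:' line, or None at end of stream."""
    if not line.startswith("data: "):
        return None  # ignore comments, blank keep-alives, etc.
    payload = line[len("data: "):]
    if payload == "[DONE]":
        return None  # stream finished
    return json.loads(payload)["content"]

# The example stream from above.
stream = [
    'data: {"content": "You"}',
    'data: {"content": " should"}',
    'data: {"content": " contact"}',
    'data: {"content": " Visarg..."}',
    'data: [DONE]',
]
reply = "".join(t for t in (parse_sse_chunk(l) for l in stream) if t is not None)
# reply == "You should contact Visarg..."
```

The browser UI does the equivalent in JavaScript while appending chunks to the chat window as they arrive.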
Edit `knowledge_base.json` to add or update team members:

```json
{
  "name": "Person Name",
  "role": "Their Role",
  "description": "Natural language description of what they handle. BM25 matches against this text."
}
```

Restart the server after editing — the BM25 index rebuilds automatically.
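Since the index is only rebuilt at startup, a malformed entry surfaces as a confusing runtime error. A small hypothetical check (not part of the app) you could run before restarting, verifying each entry has the fields the retriever expects:

```python
import json

REQUIRED = {"name", "role", "description"}

def validate_kb(entries):
    """Raise ValueError if any team entry is missing a required field."""
    for i, entry in enumerate(entries):
        missing = REQUIRED - entry.keys()
        if missing:
            raise ValueError(f"entry {i} is missing fields: {sorted(missing)}")
    return True

# Example: validate an in-memory copy of the knowledge base.
entries = json.loads(
    '[{"name": "Person Name", "role": "Their Role",'
    ' "description": "Natural language description of what they handle."}]'
)
validate_kb(entries)  # passes silently when every entry is complete
```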
| Component | Technology |
|---|---|
| Backend | FastAPI + Uvicorn |
| Retrieval | BM25Okapi (rank-bm25) |
| LLM Gateway | Portkey AI |
| LLM Model | Llama 3.3 70B (via OpenRouter) |
| Frontend | Vanilla HTML/CSS/JS + SSE |
| Theme | Portkey Design System |
MIT

