Node.js agent chat with Weaviate RAG, Gemini LLM, and Chart.js tool. Supports streaming NDJSON output.
Prerequisites: Node.js 18+, Docker, Gemini API key
Install dependencies and configure the environment:

```shell
npm install
cp .env.example .env
# Edit .env and set GEMINI_API_KEY
```

Start Weaviate:

```shell
docker compose up -d
```

Wait ~10 seconds for Weaviate to be ready.

Seed the database:

```shell
npm run seed
```

Run the chat CLI:

```shell
npm run chat
```

Type your query and press Enter. Type `exit` to quit.
| Type | Query | Expected |
|---|---|---|
| Direct | Explain what LangGraph does. | LLM answer, no RAG/chart |
| RAG | What does the safety guideline say about PPE for welding? | Answer with citations (`1- Page 7`, `1- Page 8`) |
| Chart | Generate a bar chart configuration. | Chart.js config in `data` |
| Both | Based on the quality manual, summarize inspection sampling and return a chart config. | RAG answer + Chart.js config |
Output is streamed as NDJSON, one JSON object per line: `{ "answer": "...", "data": [...] }`.
```
┌─────────────────────────────────────────────────────────────┐
│                      Delegating Agent                       │
│   (routes by keywords: direct | rag | chart | rag+chart)    │
└─────────────────────────────────────────────────────────────┘
       │                 │                     │
       ▼                 ▼                     ▼
   ┌────────┐      ┌─────────────┐      ┌──────────────────┐
   │  LLM   │      │  RAG Agent  │      │  Chart.js Tool   │
   │ Gemini │      │ (Weaviate)  │      │  (mock config)   │
   └────────┘      └─────────────┘      └──────────────────┘
                          │
                          ▼
              ┌─────────────────────┐
              │  Weaviate (Docker)  │
              │  QnAChunk, tenantA  │
              │  3 seed documents   │
              └─────────────────────┘
```
- Delegating Agent: Parses the query and chooses a direct answer, RAG, chart, or both.
- RAG Agent: Fetches chunks from Weaviate, formats references as `N- Page X`, and returns a grounded answer.
- Chart Tool: Returns a fixed Chart.js bar config (mock).
- Streaming: Each event has `answer` (text) and `data` (RAG refs and/or chart config).
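The delegating agent's keyword routing can be sketched like this. The four route names come from the diagram above; the specific keyword patterns are assumptions for illustration (the real logic lives in `app/src/agents/`):

```typescript
// Hypothetical sketch of the delegating agent's keyword routing.
// Route names come from this README; the keyword regexes are assumptions.
type Route = "direct" | "rag" | "chart" | "rag+chart";

export function route(query: string): Route {
  const q = query.toLowerCase();
  const wantsChart = /\bchart\b/.test(q);
  // Assumed document-related keywords that trigger retrieval:
  const wantsRag = /\b(manual|guideline|document|safety|quality)\b/.test(q);
  if (wantsChart && wantsRag) return "rag+chart";
  if (wantsChart) return "chart";
  if (wantsRag) return "rag";
  return "direct"; // no tool keywords: answer with the LLM alone
}
```

With these assumed keywords, the four example queries in the table above would land on `direct`, `rag`, `chart`, and `rag+chart` respectively.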
Project layout:

```
app/src/
├── config/env.ts
├── weaviate/      # schema, seed, retrieval
├── llm/           # Gemini provider
├── tools/         # Chart.js mock
├── agents/        # delegating, RAG, graph
├── streaming/
└── cli/chat.ts
docker-compose.yml  # Weaviate
docs/               # Implementation plan
```