An AI-powered tool that helps you understand any codebase by asking questions in plain English.
Connect a GitHub repo, and Repono will parse the entire codebase, let you browse files, visualize dependencies, and chat with an AI that actually reads your code before answering.
- Chat with your code — Ask things like "explain the authentication flow" or "find potential bugs" and get answers based on the actual source code, not generic responses.
- File browser — Browse the full repo tree, click any file to see its source with syntax highlighting and stats.
- Code map — Interactive dependency graph showing how directories and modules connect to each other.
- Insights — Auto-detects the tech stack, frameworks, entry points, and flags potential issues like hardcoded secrets or TODO comments.
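The issue-flagging part of the insights feature can be sketched with a simple line scan. This is a hypothetical illustration, not the actual `insightGenerator.js` logic; the function name `scanForIssues` and the secret heuristic are assumptions:

```javascript
// Hypothetical sketch: flag TODO comments and possible hardcoded secrets.
// The real insightGenerator.js may use different patterns and thresholds.
function scanForIssues(path, source) {
  const issues = [];
  source.split('\n').forEach((line, i) => {
    if (/\bTODO\b/.test(line)) {
      issues.push({ path, line: i + 1, type: 'todo' });
    }
    // Rough heuristic: a key-like name assigned a longish string literal
    if (/(api[_-]?key|secret|password)\s*[:=]\s*['"][^'"]{8,}['"]/i.test(line)) {
      issues.push({ path, line: i + 1, type: 'possible-secret' });
    }
  });
  return issues;
}
```

Regex heuristics like this are cheap but noisy; a real scanner would also check entropy or known token prefixes to cut false positives.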
- You paste a GitHub repo URL
- Repono clones it, parses every file, and chunks the code into searchable pieces
- When you ask a question, it finds the most relevant code chunks using keyword search
- Those chunks get sent to Llama 3.3 70B (via Groq) which generates an explanation based on your actual code
- Sources are shown alongside the answer so you can verify everything
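The retrieval step in the flow above can be sketched as token-overlap scoring with a small boost for identifier matches. This is a minimal illustration of keyword search with code-aware scoring; the function names and weights are assumptions, not the actual `keywordSearch.js` implementation:

```javascript
// Hypothetical sketch of keyword-based chunk retrieval.
function tokenize(text) {
  // Lowercase, split on non-word characters, drop empties
  return text.toLowerCase().split(/\W+/).filter(Boolean);
}

function scoreChunk(question, chunk) {
  const queryTokens = new Set(tokenize(question));
  let score = 0;
  for (const token of tokenize(chunk.code)) {
    if (queryTokens.has(token)) score += 1;
  }
  // Code-aware boost: matching a chunk's own name (e.g. a function name)
  // counts for more than a match in its body
  if (chunk.name && queryTokens.has(chunk.name.toLowerCase())) score += 5;
  return score;
}

function topChunks(question, chunks, k = 3) {
  return chunks
    .map(chunk => ({ chunk, score: scoreChunk(question, chunk) }))
    .filter(entry => entry.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, k)
    .map(entry => entry.chunk);
}
```

The top-scoring chunks are what get packed into the Llama 3.3 prompt, which is why the context window limit (see Limitations) caps how much code the model sees per question.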
- Frontend — React, Vite
- Backend — Node.js, Express
- AI — Groq API with Llama 3.3 70B
- Code parsing — Custom chunker that splits code by functions, classes, and logical blocks
- Search — Keyword-based retrieval with code-aware scoring
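To make the "custom chunker" idea concrete, here is a deliberately naive sketch that splits source text at top-level `function` declarations. The real `chunker.js` also handles classes and logical blocks; this version, including the name `chunkByFunction`, is an assumption for illustration:

```javascript
// Hypothetical sketch: split source into one chunk per function declaration.
function chunkByFunction(source) {
  const regex = /function\s+(\w+)/g;
  const starts = [];
  let match;
  while ((match = regex.exec(source)) !== null) {
    starts.push({ name: match[1], index: match.index });
  }
  // Each chunk runs from one declaration to the start of the next
  return starts.map((start, i) => {
    const end = i + 1 < starts.length ? starts[i + 1].index : source.length;
    return { name: start.name, code: source.slice(start.index, end).trim() };
  });
}
```

Chunking at function granularity keeps each searchable piece small and self-describing, which is what lets keyword retrieval return a few focused snippets instead of whole files.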
```
git clone https://github.com/SE7EN2028/Repono.git
cd Repono
npm install
cd client && npm install && npm run build && cd ..
```

Create a `.env` file in the root:

```
GROQ_API_KEY=your_groq_api_key_here
PORT=3001
```

Get a free Groq API key at console.groq.com/keys.

Start the server:

```
npm start
```

Then open http://localhost:3001 in your browser.
- Push to GitHub
- Create a new Web Service on render.com
- Connect the repo
- Build command: `npm install && npm run build`
- Start command: `npm start`
- Add `GROQ_API_KEY` in the environment variables
```
├── server/
│   ├── index.js                  # Express server, serves API + frontend
│   ├── routes/
│   │   ├── repo.js               # Clone, index, files, insights, deps endpoints
│   │   └── query.js              # Question answering endpoint
│   └── services/
│       ├── repoManager.js        # Git clone and repo management
│       ├── fileParser.js         # Walks repo and reads source files
│       ├── chunker.js            # Splits code into function-level chunks
│       ├── keywordSearch.js      # Finds relevant code by keyword matching
│       ├── ragPipeline.js        # Sends code + question to Groq LLM
│       ├── queryClassifier.js    # Classifies question type
│       ├── insightGenerator.js   # Scans for tech stack, issues, entry points
│       ├── dependencyAnalyzer.js # Builds import-based dependency graph
│       └── webSearch.js          # Wikipedia/DuckDuckGo lookup for concepts
├── client/
│   └── src/
│       ├── App.jsx               # Main app shell and state management
│       ├── api.js                # Backend API client
│       └── components/           # All UI components
└── package.json
```
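The import-based dependency graph built by `dependencyAnalyzer.js` can be sketched as a regex pass over each file's import statements. This is an assumed, simplified version; real path resolution (extensions, `index.js`, aliases) is not handled here:

```javascript
// Hypothetical sketch: collect ES-module and CommonJS import specifiers,
// then map each file path to the modules it depends on.
function extractImports(source) {
  const deps = [];
  const importRe = /import\s+[^'"]*['"]([^'"]+)['"]/g;   // import x from './y'
  const requireRe = /require\(\s*['"]([^'"]+)['"]\s*\)/g; // require('./y')
  let m;
  while ((m = importRe.exec(source)) !== null) deps.push(m[1]);
  while ((m = requireRe.exec(source)) !== null) deps.push(m[1]);
  return deps;
}

function buildGraph(files) {
  // files: { 'path.js': 'source text', ... } -> { 'path.js': ['./dep', ...] }
  const graph = {};
  for (const [path, source] of Object.entries(files)) {
    graph[path] = extractImports(source);
  }
  return graph;
}
```

An adjacency map like this is all the Code map view needs: nodes are files (or directories, after grouping), and edges are the import specifiers.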
- No conversation memory — each question is independent
- Context window limits how much code the AI sees per question
- Private repos need a GitHub token (not implemented yet)
- Chat history doesn't persist after refresh
MIT