kactlabs/code-info
Code Info

Upload SQL/Python files and chat with your code using AI. Ask questions, get explanations, and understand your codebase better.

Features

  • 📁 Upload single or multiple code files (.py, .sql, .js, .ts, .java, .c, .cpp, .h, .hpp)
  • 📦 Upload zip files containing multiple code files
  • 💬 Chat with your code using AI
  • 🤖 Multiple LLM providers (Ollama, OpenAI, Gemini, llama.cpp)
  • 🌙 Dark/Light theme toggle
  • 📱 Responsive design
  • ⚡ Vector-based code retrieval for accurate answers

Quick Start

  1. Install dependencies:
pip install -r requirements.txt
  2. Configure environment variables:
cp .env.sample .env
# Edit .env with your preferred LLM provider
  3. Run the application:
python app.py
  4. Open http://localhost:8014 in your browser

LLM Provider Configuration

Edit .env to select your LLM provider:

Ollama (Default)

LLM_PROVIDER=ollama
OLLAMA_MODEL=llama3.2:3b

Requires: Ollama running locally

OpenAI

LLM_PROVIDER=openai
OPENAI_API_KEY=your-api-key
OPENAI_MODEL=gpt-4o-mini

Google Gemini

LLM_PROVIDER=gemini
GOOGLE_API_KEY=your-api-key
GEMINI_MODEL=gemini-1.5-flash

llama.cpp

LLM_PROVIDER=llama.cpp
LLAMA_CPP_URL=http://127.0.0.1:8080/v1

Requires: llama.cpp server running
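The provider settings above can be read from the environment in one place. The sketch below shows one way that selection could work; the function and constant names are illustrative, not taken from the app's actual source.

```python
import os

# Hypothetical sketch of .env-driven provider selection (names are
# illustrative, not the app's real code).
SUPPORTED_PROVIDERS = {"ollama", "openai", "gemini", "llama.cpp"}

def select_provider(env=os.environ):
    """Return the configured provider, defaulting to Ollama."""
    provider = env.get("LLM_PROVIDER", "ollama").lower()
    if provider not in SUPPORTED_PROVIDERS:
        raise ValueError(f"Unknown LLM_PROVIDER: {provider!r}")
    return provider

def provider_settings(env=os.environ):
    """Collect the settings each provider needs, per the sections above."""
    provider = select_provider(env)
    if provider == "ollama":
        return {"model": env.get("OLLAMA_MODEL", "llama3.2:3b")}
    if provider == "openai":
        return {"api_key": env["OPENAI_API_KEY"],
                "model": env.get("OPENAI_MODEL", "gpt-4o-mini")}
    if provider == "gemini":
        return {"api_key": env["GOOGLE_API_KEY"],
                "model": env.get("GEMINI_MODEL", "gemini-1.5-flash")}
    # llama.cpp speaks an OpenAI-compatible API at a local URL
    return {"base_url": env.get("LLAMA_CPP_URL", "http://127.0.0.1:8080/v1")}
```

With this shape, adding a provider means adding one branch and its defaults, and an unset `.env` still falls back to the Ollama defaults shown above.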

How It Works

  1. Upload code files individually or as a zip archive
  2. Files are processed and split into chunks
  3. Embeddings are created for semantic search
  4. Ask questions about your code
  5. The AI retrieves relevant code sections and provides answers
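The steps above can be sketched in miniature. The real app uses vector embeddings for semantic search; in this toy version a bag-of-words overlap score stands in for embedding similarity, and all names are illustrative.

```python
# Toy sketch of the chunk-and-retrieve flow: split code into chunks,
# score each chunk against the question, return the best matches.
def split_into_chunks(text, max_lines=5):
    """Split source text into fixed-size line chunks."""
    lines = text.splitlines()
    return ["\n".join(lines[i:i + max_lines])
            for i in range(0, len(lines), max_lines)]

def score(chunk, question):
    # Crude relevance proxy: count shared lowercase tokens.
    return len(set(chunk.lower().split()) & set(question.lower().split()))

def retrieve(chunks, question, top_k=2):
    """Return the top_k chunks most relevant to the question."""
    return sorted(chunks, key=lambda c: score(c, question), reverse=True)[:top_k]
```

The retrieved chunks would then be sent to the configured LLM together with the question, so answers are grounded in the relevant code rather than the whole upload.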

Supported File Types

  • Python (.py)
  • SQL (.sql)
  • JavaScript (.js)
  • TypeScript (.ts)
  • Java (.java)
  • C/C++ (.c, .cpp, .h, .hpp)
  • Zip archives (.zip) containing any of the above
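Handling a zip upload reduces to filtering its members against the extension list above. A minimal sketch, assuming hypothetical helper names (the extension set itself comes from the list above):

```python
import io
import zipfile
from pathlib import Path

# Extensions supported by the app, per the list above.
CODE_EXTENSIONS = {".py", ".sql", ".js", ".ts", ".java",
                   ".c", ".cpp", ".h", ".hpp"}

def code_files_in_zip(zip_bytes):
    """Return the member names in a zip archive that are supported code files."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        return [name for name in zf.namelist()
                if Path(name).suffix.lower() in CODE_EXTENSIONS]
```

Unsupported members (images, binaries, text notes) are simply skipped rather than rejected, so mixed archives still work.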
