🏥 LocalMedAI - Privacy-First Medical AI Assistant


Overview

LocalMedAI is a cutting-edge medical AI assistant that runs completely locally on your machine. Built with privacy-first principles, it processes medical images and symptom descriptions using local LLMs (Llama2/Mistral via Ollama) without sending any data to external servers.

Technology Stack

Backend

  • FastAPI - Modern, fast Python web framework
  • Ollama - Local LLM deployment (Llama2-7B/Mistral-7B)
  • OpenCV - Medical image processing
  • PIL/Pillow - Image handling
  • Pydantic - Data validation (see the sketch after this list for how these pieces combine)
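
To illustrate how these pieces fit together: a minimal sketch, in Python, of a backend endpoint in this stack. The route name, request model, and prompt are hypothetical (the project's real routes are listed at /docs), and it assumes Ollama's standard REST API on port 11434 plus the httpx client.

# Hypothetical sketch, not the project's actual API: FastAPI + Pydantic
# validating a request and forwarding a prompt to a local Ollama instance.
import httpx
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class SymptomRequest(BaseModel):
    description: str  # free-text symptom description

@app.post("/api/symptoms")  # illustrative route name
async def analyze_symptoms(req: SymptomRequest):
    payload = {
        "model": "llama2:7b",
        "prompt": f"List possible conditions for: {req.description}",
        "stream": False,  # ask Ollama for a single JSON response
    }
    async with httpx.AsyncClient(timeout=120.0) as client:
        resp = await client.post("http://localhost:11434/api/generate", json=payload)
    resp.raise_for_status()
    return {"analysis": resp.json()["response"]}  # Ollama puts the text in "response"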

Frontend

  • React 18 with TypeScript
  • Tailwind CSS - Modern, responsive design
  • Vite - Fast build tool
  • Axios - HTTP client
  • React Query - Server-state management and data fetching

Infrastructure

  • Docker - Containerization
  • Docker Compose - Multi-container setup

Quick Start

Prerequisites

  • Docker and Docker Compose (for containerized setup)
  • OR Python 3.11+ and Node.js 16+ (for local development)
  • 8GB+ RAM (to run a 7B model locally)
  • Modern web browser

Option 1: One-Command Setup (Recommended)

# Clone the repository
git clone https://github.com/AttiR/LocalMedAI.git
cd LocalMedAI

# Run automated setup
chmod +x setup.sh && ./setup.sh

Option 2: Docker Compose

# Clone and start services
git clone https://github.com/AttiR/LocalMedAI.git
cd LocalMedAI
docker-compose up -d

# Access the application
# Frontend: http://localhost:3000
# Backend API: http://localhost:8000
# API Docs: http://localhost:8000/docs
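
Once the containers are up, a quick reachability check from Python (this relies only on FastAPI serving its interactive docs at /docs by default):

# Verify the backend container is answering before opening the UI
import requests

r = requests.get("http://localhost:8000/docs", timeout=5)
print(r.status_code)  # expect 200 once the backend is healthy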

Option 3: Local Development (With Virtual Environment)

# Clone repository
git clone https://github.com/AttiR/LocalMedAI.git
cd LocalMedAI

# Setup local development environment
chmod +x dev-setup.sh && ./dev-setup.sh

# Start services manually
./start-ollama.sh    # Terminal 1
./start-backend.sh   # Terminal 2
./start-frontend.sh  # Terminal 3

Python Virtual Environment

For local development, the project uses Python virtual environments:

# Setup virtual environment
cd backend
python3 -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate
pip install -r requirements.txt

# Activate when working
source venv/bin/activate
# Deactivate when done
deactivate

Node.js Environment

# Setup Node environment
cd frontend
npm install

# Development commands
npm run dev      # Start development server
npm run build    # Build for production
npm run lint     # Run linting

Available Scripts

Docker Scripts

./setup.sh       # Complete automated setup
./start.sh       # Start all services
./stop.sh        # Stop all services
./status.sh      # Check service status

Development Scripts

./dev-setup.sh       # Setup local development environment
./start-backend.sh   # Start backend with virtual environment
./start-frontend.sh  # Start frontend development server
./start-ollama.sh    # Start Ollama service

Manual Commands

# Docker
docker-compose up -d              # Start all services
docker-compose down               # Stop all services
docker-compose logs -f            # View logs
docker-compose restart backend    # Restart specific service

# Backend (with virtual environment)
cd backend && source venv/bin/activate
uvicorn app.main:app --reload     # Start backend

# Frontend
cd frontend
npm run dev                      # Start development server
npm run build                    # Build for production

# Ollama
ollama serve                     # Start Ollama service
ollama pull llama2:7b            # Download model
ollama list                      # List available models
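
To confirm the model answers outside the app, you can call Ollama's REST API directly; it listens on port 11434 by default and returns the generated text in the "response" field:

# Sanity-check the local model through Ollama's /api/generate endpoint
import requests

payload = {"model": "llama2:7b", "prompt": "Say hello.", "stream": False}
r = requests.post("http://localhost:11434/api/generate", json=payload, timeout=120)
r.raise_for_status()
print(r.json()["response"])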

Usage

Medical Image Analysis

  1. Upload medical images (dermatology, X-rays, etc.)
  2. AI analyzes the image for potential conditions
  3. Receive suggestions with confidence scores (a scripted equivalent is sketched below)
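
The same flow can be scripted against the backend API. The route name /api/analyze-image below is an assumed placeholder, as is the sample filename; check http://localhost:8000/docs for the actual path:

# Hypothetical sketch: upload an image for analysis over HTTP.
# Route and filename are illustrative only.
import requests

with open("skin_lesion.jpg", "rb") as f:
    files = {"file": ("skin_lesion.jpg", f, "image/jpeg")}
    r = requests.post("http://localhost:8000/api/analyze-image", files=files, timeout=120)
r.raise_for_status()
print(r.json())  # suggested conditions with confidence scores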

Symptom Checker

  1. Describe your symptoms in natural language
  2. AI processes the description
  3. Get potential diagnosis suggestions (see the sketch below)
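
Likewise for text input; /api/symptoms is again an assumed route name, matching the backend sketch earlier in this README:

# Hypothetical sketch: send a free-text symptom description to the backend.
import requests

r = requests.post(
    "http://localhost:8000/api/symptoms",
    json={"description": "I have a persistent cough and chest pain"},
    timeout=120,
)
r.raise_for_status()
print(r.json())  # condition suggestions from the local LLM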

Privacy Assurance

  • ✅ No data leaves your machine
  • ✅ No external API calls
  • ✅ Complete local processing
  • ✅ Optional local storage only

Project Structure

LocalMedAI/
├── backend/                 # FastAPI backend
│   ├── app/
│   │   ├── __init__.py
│   │   ├── main.py         # FastAPI application
│   │   ├── models/         # Pydantic models
│   │   ├── services/       # Business logic
│   │   ├── utils/          # Utility functions
│   │   └── config.py       # Configuration
│   ├── requirements.txt
│   └── Dockerfile
├── frontend/               # React frontend
│   ├── src/
│   │   ├── components/     # React components
│   │   ├── pages/          # Page components
│   │   ├── services/       # API services
│   │   ├── types/          # TypeScript types
│   │   └── utils/          # Utility functions
│   ├── package.json
│   └── Dockerfile
├── docker-compose.yml      # Multi-container setup
├── .gitignore
└── README.md

Configuration

Environment Variables

# Backend
OLLAMA_BASE_URL=http://localhost:11434
MODEL_NAME=llama2:7b
UPLOAD_DIR=./uploads
MAX_FILE_SIZE=10485760  # 10MB

# Frontend
VITE_API_URL=http://localhost:8000
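
As a sketch of how the backend's config.py might read these variables (the project's actual loader may differ; this uses only the Python standard library, with defaults mirroring the values above):

# Illustrative config loading; defaults match the variables listed above.
import os

OLLAMA_BASE_URL = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")
MODEL_NAME = os.getenv("MODEL_NAME", "llama2:7b")
UPLOAD_DIR = os.getenv("UPLOAD_DIR", "./uploads")
MAX_FILE_SIZE = int(os.getenv("MAX_FILE_SIZE", "10485760"))  # 10 MB in bytes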

Demo Scenarios

  1. Skin Condition Analysis
    • Upload dermatology images
    • Get AI-powered condition suggestions
  2. Symptom Description
    • "I have a persistent cough and chest pain"
    • Receive potential diagnosis suggestions
  3. Medical History Analysis
    • Input patient information
    • Get AI insights and recommendations

Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

⚠️ Disclaimer

Important: This application is for educational and demonstration purposes only. It is not intended to replace professional medical advice, diagnosis, or treatment. Always consult with qualified healthcare professionals for medical concerns.
