# WellQ

Open-source ASPM (Application Security Posture Management) platform built with Django. It aggregates SAST, SCA, DAST, container, and malware scan results; normalizes and deduplicates findings; enriches them with threat intelligence; manages SBOMs; and provides dashboards, risk scoring, and CI/CD integrations for full AppSec visibility.
## Features

- 🔍 Multi-Scanner Support: Aggregates results from Trivy, Grype, Snyk, and more
- 📊 Unified Dashboard: Centralized view of all security findings across products
- 🔄 Deduplication: Hash-based fingerprinting to track vulnerabilities across scans
- 🎯 Threat Intelligence: Automatic enrichment with EPSS scores and CISA KEV data
- 📦 SBOM Management: Upload, parse, and export Software Bill of Materials (CycloneDX)
- 🚀 REST API: Full API with Swagger documentation for CI/CD integration
- 🔐 API Token Authentication: Secure token-based API access with revocation
- 📈 Vulnerability Status Management: Track findings as Active, Fixed, Risk Accepted, False Positive, or Duplicate
- ⚡ Async Processing: Background task processing with Celery for scalability
- 🎨 Modern UI: Clean, responsive interface built with Tailwind CSS
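The deduplication feature above relies on a stable fingerprint per finding. A minimal sketch of how such a hash-based fingerprint can be computed (the exact fields WellQ hashes are an assumption; CVE ID, package name, and file path are used here for illustration):

```python
import hashlib

def finding_fingerprint(cve_id: str, package: str, file_path: str) -> str:
    """Build a stable hash so the same finding matches across repeated scans.

    Inputs are normalized (case, whitespace) before hashing, so cosmetic
    differences between scanner outputs do not create duplicate findings.
    """
    normalized = "|".join((
        cve_id.strip().upper(),
        package.strip().lower(),
        file_path.strip(),
    ))
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()
```

Two scans that report the same CVE in the same package and path then map to one tracked vulnerability.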
## Tech Stack

- Backend: Django 5.2 + Django REST Framework
- Task Queue: Celery with Redis broker
- Database: PostgreSQL (production) / SQLite (development)
- API Documentation: drf-spectacular (Swagger/OpenAPI)
- Frontend: Django Templates + Tailwind CSS
## Prerequisites

- Python 3.10+
- Redis 6.0+ (for Celery task queue)
- PostgreSQL 12+ (for production)
- pip and virtualenv
## Quick Start (Docker)

🚀 One-command setup, no configuration needed:

```bash
# That's it! No .env file needed
docker-compose -f docker-compose.simple.yml up -d --build

# Create superuser
docker-compose -f docker-compose.simple.yml exec web python manage.py createsuperuser
```

Then open http://localhost:8000.

✨ What's configured automatically:

- PostgreSQL database (wellq/wellq/wellq_dev_password)
- Redis for Celery
- All Django settings
- No manual configuration needed

For advanced Docker configuration, see the docker-compose files in the repository.
## Manual Setup

Clone the repository and set up a virtual environment:

```bash
git clone <repository-url>
cd WellQ
python -m venv venv

# On Windows
venv\Scripts\activate

# On Linux/Mac
source venv/bin/activate

pip install -r requirements.txt
```

Create a `.env` file in the project root:

```env
# Django Settings
SECRET_KEY=your-secret-key-here
DEBUG=True
ALLOWED_HOSTS=localhost,127.0.0.1

# Database (SQLite for development)
# For production, use PostgreSQL:
# DATABASE_URL=postgresql://user:password@localhost:5432/wellq

# Celery/Redis
CELERY_BROKER_URL=redis://localhost:6379/0
CELERY_RESULT_BACKEND=redis://localhost:6379/0
```

Run migrations and create a superuser:

```bash
python manage.py migrate
python manage.py createsuperuser
```

## Installing Redis

Windows:
- Download Redis from: https://github.com/microsoftarchive/redis/releases
- Or use WSL2 with Redis
- Or use Docker:

```bash
docker run -d -p 6379:6379 redis:latest
```

Linux:

```bash
sudo apt-get install redis-server
sudo systemctl start redis
```

Mac:

```bash
brew install redis
brew services start redis
```

Docker (recommended):

```bash
docker run -d -p 6379:6379 --name redis redis:latest
```

## Running the Application

You need to run three processes simultaneously:
Web server:

```bash
python manage.py runserver
```

Access the application at http://localhost:8000.

Celery worker:

```bash
celery -A core worker -l info
```

This processes background tasks (scan uploads, SBOM parsing, etc.).

Celery beat:

```bash
celery -A core beat -l info
```

This runs scheduled tasks (daily threat intel enrichment).
## Production Deployment

Create systemd service files:

`/etc/systemd/system/wellq-celery.service`:

```ini
[Unit]
Description=WellQ Celery Worker
After=network.target redis.service

[Service]
Type=forking
User=www-data
Group=www-data
WorkingDirectory=/path/to/WellQ
Environment="PATH=/path/to/venv/bin"
ExecStart=/path/to/venv/bin/celery -A core worker --loglevel=info --logfile=/var/log/celery/worker.log --pidfile=/var/run/celery/worker.pid --detach
ExecStop=/bin/kill -s TERM $MAINPID
Restart=always

[Install]
WantedBy=multi-user.target
```

`/etc/systemd/system/wellq-celery-beat.service`:

```ini
[Unit]
Description=WellQ Celery Beat Scheduler
After=network.target redis.service

[Service]
Type=forking
User=www-data
Group=www-data
WorkingDirectory=/path/to/WellQ
Environment="PATH=/path/to/venv/bin"
ExecStart=/path/to/venv/bin/celery -A core beat --loglevel=info --logfile=/var/log/celery/beat.log --pidfile=/var/run/celery/beat.pid --detach
ExecStop=/bin/kill -s TERM $MAINPID
Restart=always

[Install]
WantedBy=multi-user.target
```

Enable and start the services:

```bash
sudo systemctl enable wellq-celery.service
sudo systemctl enable wellq-celery-beat.service
sudo systemctl start wellq-celery.service
sudo systemctl start wellq-celery-beat.service
```

Gunicorn:
```bash
pip install gunicorn
gunicorn core.wsgi:application --bind 0.0.0.0:8000 --workers 4
```

Nginx configuration:
```nginx
server {
    listen 80;
    server_name your-domain.com;

    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }

    location /static/ {
        alias /path/to/WellQ/core/static/;
    }

    location /media/ {
        alias /path/to/WellQ/media/;
    }
}
```

## Docker Deployment

Docker Compose configuration is included in the repository.
Quick start:

```bash
cp .env.example .env
# Edit the .env file
docker-compose up -d --build
docker-compose exec web python manage.py createsuperuser
```

The Docker setup includes:

- PostgreSQL database
- Redis for Celery
- Django web application (Gunicorn)
- Celery worker
- Celery beat scheduler
## Configuration

### Database

Update core/settings.py or use the `DATABASE_URL` environment variable:

```python
import os

import dj_database_url

DATABASES = {
    'default': dj_database_url.config(
        default=os.getenv('DATABASE_URL', 'sqlite:///db.sqlite3')
    )
}
```

### Celery

Configured in core/settings.py:

- Broker: Redis (default: `redis://localhost:6379/0`)
- Result Backend: Redis (default: `redis://localhost:6379/0`)
- Task Timeout: 30 minutes
- Scheduled Tasks: Daily threat intel enrichment at 2 AM
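The daily 2 AM enrichment corresponds to a Celery Beat entry along these lines. This is a sketch for core/settings.py; the task path `core.tasks.enrich_threat_intel` is a hypothetical name, not necessarily what WellQ registers:

```python
# Sketch for core/settings.py; the task path below is hypothetical.
from celery.schedules import crontab

CELERY_TASK_TIME_LIMIT = 30 * 60  # 30-minute task timeout

CELERY_BEAT_SCHEDULE = {
    "daily-threat-intel-enrichment": {
        "task": "core.tasks.enrich_threat_intel",  # hypothetical task path
        "schedule": crontab(hour=2, minute=0),     # every day at 02:00
    },
}
```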
### Static Files

Collect static files for production:

```bash
python manage.py collectstatic --noinput
```

## API Documentation

Once the server is running:
- Swagger UI: http://localhost:8000/api/swagger/
- ReDoc: http://localhost:8000/api/redoc/
- OpenAPI Schema: http://localhost:8000/api/schema/
### Authentication

1. Create an API token via the web UI: `/profile/api-tokens/create/`
2. Use the token in API requests:

```bash
curl -H "Authorization: Token your-token-here" \
  -F "workspace_id=..." \
  -F "product_name=..." \
  -F "release_name=..." \
  -F "scanner_name=Trivy" \
  -F "scan_file=@scan.json" \
  http://localhost:8000/api/v1/scans/upload/
```

### Key Endpoints

- `POST /api/v1/scans/upload/` - Upload scan results (async)
- `POST /api/v1/sbom/upload/` - Upload SBOM file (async)
- `GET /api/v1/releases/{id}/sbom/export/` - Export SBOM
- `GET /api/v1/findings/` - List findings with filters
- `GET /api/v1/workspaces/` - List workspaces
- `GET /api/v1/products/` - List products
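The same token-authenticated API can be called from Python. A hedged sketch using only the standard library; the endpoint and `Authorization: Token` header come from the docs above, while the filter names you pass as query parameters are assumptions to verify against `/api/swagger/`:

```python
import json
import urllib.parse
import urllib.request

API_BASE = "http://localhost:8000/api/v1"  # local dev server

def build_findings_request(token: str, **filters) -> urllib.request.Request:
    """Build an authenticated GET request for /api/v1/findings/ with optional filters."""
    query = urllib.parse.urlencode(filters)
    url = f"{API_BASE}/findings/" + (f"?{query}" if query else "")
    return urllib.request.Request(url, headers={"Authorization": f"Token {token}"})

def list_findings(token: str, **filters) -> dict:
    """Fetch findings as parsed JSON; raises urllib.error.HTTPError on failure."""
    with urllib.request.urlopen(build_findings_request(token, **filters)) as resp:
        return json.load(resp)
```

Usage: `list_findings("your-token-here", severity="critical")`; the `severity` filter name is illustrative only.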
## Development

Run threat intel enrichment manually (it also runs automatically via Celery Beat):

```bash
python manage.py enrich_db
```

Run the test suite:

```bash
python manage.py test
```

Code quality:

```bash
# Install black and flake8
pip install black flake8

# Format code
black .

# Lint code
flake8 .
```

Migrations:

```bash
# Create migrations
python manage.py makemigrations

# Apply migrations
python manage.py migrate
```

## Monitoring

Install Flower for Celery monitoring:

```bash
pip install flower
celery -A core flower
```

Access at: http://localhost:5555
Useful inspection commands:

```bash
# Check worker status
celery -A core inspect active

# Check scheduled tasks
celery -A core inspect scheduled
```

## Troubleshooting

Celery worker not processing tasks:

- Check that Redis is running: `redis-cli ping` (should return `PONG`)
- Check the Celery broker URL in settings
- Check the logs: `celery -A core worker -l debug`

Tasks failing or stuck:

- Verify the worker is running: `celery -A core inspect active`
- Check the Redis connection
- Review worker logs for errors

Database errors:

- Verify the database credentials in `.env`
- Check that the database server is running
- Run migrations: `python manage.py migrate`
## Security Best Practices

- Never commit the `.env` file to version control
- Use a strong `SECRET_KEY` in production
- Set `DEBUG=False` in production
- Use HTTPS in production
- Regularly rotate API tokens
- Keep dependencies updated: `pip list --outdated`
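The `SECRET_KEY` and `DEBUG` recommendations can be enforced by reading both from the environment. A sketch of that pattern, not necessarily how core/settings.py is written:

```python
import os

def security_settings(env=os.environ):
    """Resolve SECRET_KEY and DEBUG from the environment.

    Raises KeyError if SECRET_KEY is missing -- failing fast beats
    shipping a hard-coded default secret to production.
    """
    secret_key = env["SECRET_KEY"]
    debug = env.get("DEBUG", "False").lower() == "true"
    return secret_key, debug
```

With this, `DEBUG` defaults to off unless explicitly set to `True` in `.env`.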
## Performance & Scaling

For 100+ concurrent users:
- Database: Use PostgreSQL with connection pooling
- Caching: Configure Redis caching (Django cache framework)
- Static Files: Serve via CDN or Nginx
- Celery Workers: Scale horizontally (multiple workers)
- Database Indexes: Ensure indexes on frequently queried fields
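For the Redis caching suggestion above, Django 4+ ships a built-in Redis cache backend. A sketch that reuses the existing Redis instance on a separate database number (DB 1 is an assumption; the broker already uses DB 0):

```python
# Sketch for core/settings.py -- Celery broker stays on DB 0, cache on DB 1
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.redis.RedisCache",
        "LOCATION": "redis://localhost:6379/1",
    }
}
```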
## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Submit a pull request

## License

See the LICENSE file for details.

## Support

For issues and questions:

- Open an issue on GitHub
- Check the API documentation at `/api/swagger/`
Built with ❤️ for the security community