A data integration and analysis platform focused on real estate.
- `/backend`: Contains the Python FastAPI backend application, background workers, and database models.
- `/frontend`: Contains the Next.js (React/TypeScript) frontend application.
Ensure you have the following installed on your system:

- Node.js: (Specify version, e.g., v20.x or later)
- pnpm: (Specify version, or install with `npm install -g pnpm`)
- Python: (Specify version, e.g., v3.11 or v3.12, matching `.python-version` if present)
- Docker: Latest version
- Docker Compose: Latest version (often included with Docker Desktop)
- Database Client: (Optional, for direct DB access, e.g., `psql` for PostgreSQL)
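The Python requirement above can also be verified programmatically; a minimal stdlib sketch (the minimum version here is the example from the list, adjust to match `.python-version`):

```python
import sys

# Example minimum from the prerequisites list above (an assumption, not a hard pin)
MINIMUM = (3, 11)

def python_ok(minimum: tuple[int, int] = MINIMUM) -> bool:
    """Return True if the running interpreter meets the minimum version."""
    return sys.version_info[:2] >= minimum
```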
Both frontend and backend may require environment variables.

- Backend:
  - Navigate to the `backend` directory: `cd backend`
  - Create a `.env` file by copying `.env.example`: `cp .env.example .env`
  - Crucially, fill in the required values in `.env`, especially secrets like database passwords and API keys. Never commit your `.env` file to Git.
- Frontend:
  - Navigate to the `frontend` directory: `cd frontend`
  - Create a `.env.local` file if needed (e.g., for `NEXT_PUBLIC_API_URL`). Copy from `.env.example` if one exists. Do not commit `.env.local` files containing secrets.

(See the `.env.example` files in the respective directories for the required variables.)
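The `.env` files are plain KEY=VALUE lines; a minimal sketch of how such a file is parsed (the variable names are illustrative, not the project's actual settings):

```python
def parse_dotenv(text: str) -> dict[str, str]:
    """Parse simple KEY=VALUE lines, skipping blanks and # comments."""
    env: dict[str, str] = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, sep, value = line.partition("=")
        if sep:  # ignore malformed lines without '='
            env[key.strip()] = value.strip().strip('"')
    return env

example = """\
# Backend settings (illustrative names)
DATABASE_URL=postgresql://user:secret@localhost:5432/app
REDIS_URL=redis://localhost:6379/0
"""
```

In practice the backend would load these through a settings library rather than hand-parsing; the sketch just shows the expected file format.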
```bash
# Navigate to the backend directory
cd backend

# Create and activate a virtual environment (recommended)
python -m venv .venv
# Windows:
# .\.venv\Scripts\activate
# macOS/Linux:
source .venv/bin/activate

# Install dependencies (uses pyproject.toml)
pip install -e .
# Or if requirements.txt is preferred:
# pip install -r requirements.txt

# Set up the database (run Docker Compose first if the DB is containerized)
# Example assuming Alembic migrations (add if you use Alembic):
# alembic upgrade head
```

```bash
# Navigate to the frontend directory
cd frontend

# Install dependencies using pnpm
pnpm install
```

You typically need multiple terminals for development.
- Start Docker Services (Database, Redis, etc.):

  ```bash
  # From the 'backend' directory
  docker-compose up -d db redis  # Add other services as needed
  ```

- Start Backend API:

  ```bash
  # From the 'backend' directory (with virtual env activated)
  uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
  ```

  (The API will be accessible at http://localhost:8000)

- Start Celery Worker (if needed for background tasks):

  ```bash
  # From the 'backend' directory (with virtual env activated)
  celery -A app.workers.celery_app worker --loglevel=info
  ```

- Start Celery Beat (if using scheduled tasks):

  ```bash
  # From the 'backend' directory (with virtual env activated)
  celery -A app.workers.celery_app beat --loglevel=info --scheduler django_celery_beat.schedulers:DatabaseScheduler
  # Or if not using the DB scheduler:
  # celery -A app.workers.celery_app beat --loglevel=info
  ```

- Start Frontend:

  ```bash
  # From the 'frontend' directory
  pnpm dev
  ```

  (The frontend will be accessible at http://localhost:3000)
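With several services involved, it helps to confirm each one is actually listening before debugging the app itself; a small stdlib check (the ports are the defaults assumed in the steps above):

```python
import socket

def port_open(host: str, port: int, timeout: float = 0.5) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Default dev ports from the steps above
DEV_SERVICES = {"postgres": 5432, "redis": 6379, "api": 8000, "frontend": 3000}

for name, port in DEV_SERVICES.items():
    status = "up" if port_open("127.0.0.1", port) else "down"
    print(f"{name}: {status}")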
```bash
# Navigate to the 'backend' directory (or the project root if the compose file is there)
cd backend

# Build the Docker images
docker-compose build

# Start all services in detached mode
docker-compose up -d

# To stop the services
docker-compose down
```

Python code doesn't typically require a separate build step unless you are creating a distributable package; Docker handles the "build" in the context of creating an image.
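The `db` and `redis` service names used above imply a compose file along these lines; this is a hypothetical sketch, not the project's actual `docker-compose.yml`:

```yaml
services:
  db:
    image: postgres:16
    env_file: .env            # supplies POSTGRES_USER / POSTGRES_PASSWORD etc.
    ports:
      - "5432:5432"
  redis:
    image: redis:7
    ports:
      - "6379:6379"
  api:
    build: .
    env_file: .env
    ports:
      - "8000:8000"
    depends_on:
      - db
      - redis
```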
```bash
# Navigate to the frontend directory
cd frontend

# Create a production build
pnpm build
```

This creates an optimized build in the `.next` directory. Use `pnpm start` to serve it.
```bash
# Navigate to the backend directory (with virtual env activated)
pytest
```

(Ensure the test database is configured and any necessary test setup is performed.)
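A pytest test is just a `test_*` function with plain asserts; a minimal sketch (the helper and file name are hypothetical, not part of the codebase):

```python
# tests/test_pricing.py (hypothetical)

def normalize_price(raw: str) -> int:
    """Parse a human-entered price like '$1,250,000' into an integer."""
    return int(raw.replace("$", "").replace(",", "").strip())

def test_normalize_price():
    assert normalize_price("$1,250,000") == 1_250_000
    assert normalize_price(" 42 ") == 42
```

pytest discovers any `tests/test_*.py` file automatically and runs each `test_*` function it contains.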
```bash
# Navigate to the frontend directory
pnpm test
```

(Add specific test commands if using Jest, Cypress, etc.)
(Keep existing instructions, ensure .env.example is mentioned)
uv is a fast Python package installer and resolver, usable as a drop-in replacement for pip and venv.
- Install `uv` (if not already installed):

  ```bash
  # Using pipx (recommended)
  pipx install uv
  # Or using pip
  pip install uv
  ```
- Backend Setup (using uv):

  ```bash
  # Navigate to the backend directory
  cd backend

  # Create a virtual environment (uv often handles this implicitly,
  # but explicit creation is good practice for clarity)
  uv venv .venv

  # Activate it
  # Windows: .\.venv\Scripts\activate
  # macOS/Linux:
  source .venv/bin/activate

  # Install dependencies from pyproject.toml (editable install)
  uv pip install -e .
  # Install development dependencies (if defined in pyproject.toml
  # under [project.optional-dependencies])
  # Example: uv pip install -e .[dev]

  # Initialize/upgrade the database (using Alembic)
  # Ensure the DB service is running (e.g., via docker-compose up -d db)
  # Initialize Alembic (only once per project):
  # alembic init alembic
  # Configure alembic.ini and alembic/env.py (see the Alembic docs)
  # Generate a new migration script based on model changes:
  # alembic revision --autogenerate -m "Describe your change here"
  # Apply migrations to the database:
  alembic upgrade head
  ```
- Frontend Setup: (keep existing `pnpm install` instructions)
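The `uv pip install -e .` and `.[dev]` commands above assume dependencies declared in `pyproject.toml`; a hypothetical sketch (unpinned, listing only tools this README already references):

```toml
[project]
name = "backend"
version = "0.1.0"
requires-python = ">=3.11"
dependencies = [
  "fastapi",
  "uvicorn[standard]",
  "celery",
  "alembic",
]

[project.optional-dependencies]
dev = ["pytest", "black", "ruff"]
```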
- Start Docker Services (DB, Redis):

  ```bash
  # From the 'backend' directory
  docker-compose up -d db redis
  ```

- Activate Backend Environment:

  ```bash
  # From the 'backend' directory
  # Windows: .\.venv\Scripts\activate
  # macOS/Linux:
  source .venv/bin/activate
  ```

- Run Backend API (using uv):

  ```bash
  uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
  # Or run via uv directly (less common for servers, more for scripts):
  # uv run uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
  ```

- Run Celery Worker (using uv):

  ```bash
  celery -A app.workers.celery_app worker --loglevel=info
  # Or via uv:
  # uv run celery -A app.workers.celery_app worker --loglevel=info
  ```

- Run Celery Beat (using uv, if needed):

  ```bash
  celery -A app.workers.celery_app beat --loglevel=info --scheduler django_celery_beat.schedulers:DatabaseScheduler
  # Or via uv:
  # uv run celery -A app.workers.celery_app beat --loglevel=info --scheduler ...
  ```

- Run Frontend: (keep existing `pnpm dev` instructions)
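Celery Beat's job is to decide when a periodic task is due and enqueue it; the scheduling core can be sketched with the stdlib alone (the task and dates here are hypothetical):

```python
from datetime import datetime, timedelta

def next_run(last_run: datetime, interval: timedelta) -> datetime:
    """When a periodic task is next due, Beat-style."""
    return last_run + interval

def is_due(last_run: datetime, interval: timedelta, now: datetime) -> bool:
    """True if the task should be enqueued on this tick."""
    return now >= next_run(last_run, interval)

# e.g. a nightly listings-sync task (hypothetical), last run at 02:00
last = datetime(2024, 1, 1, 2, 0)
print(is_due(last, timedelta(days=1), datetime(2024, 1, 2, 3, 0)))
```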
```bash
# Navigate to the backend directory
cd backend

# Activate environment
# Windows: .\.venv\Scripts\activate
# macOS/Linux:
source .venv/bin/activate

# Run tests using uv (finds pytest if installed)
uv run pytest
# Or explicitly:
# pytest
```
## Linting and Formatting
### Backend
```bash
# Navigate to the backend directory (with virtual env activated)
# Check formatting
black --check .
# Format code
black .
# Check linting/imports
ruff check .
# Fix linting/imports where possible
ruff check . --fix
```

### Frontend

```bash
# Navigate to the frontend directory
# Check linting
pnpm lint

# Check/apply formatting (assuming Prettier is set up)
pnpm prettier --check .
pnpm prettier --write .
```

## Deployment

- Backend: Can be deployed using Docker containers on services like AWS ECS, Google Cloud Run, Kubernetes, etc. Ensure environment variables are securely managed.
- Frontend: The Next.js application can be deployed to platforms like Vercel (recommended), Netlify, or AWS Amplify, or self-hosted using Node.js or Docker.