Context
The Disasters API is currently a CRUD store — it receives data via HTTP but does not autonomously ingest from official sources. The mobile app depends on fresh data from:
- fogos.pt — Portuguese wildfire data
- PROCIV — Portuguese civil protection alerts
- NASA FIRMS — Satellite fire detection
Without an ingestion worker, the API's data is only as fresh as whatever is pushed to it manually.
Proposed Solution
Create a scheduled worker (separate service or cron job) that:
- Polls official sources at regular intervals (e.g., every 5 minutes for active sources)
- Maps external data to the `disasters` schema
- Upserts into the API using the `(source, external_id)` unique constraint to prevent duplicates
- Updates the status of existing records when source data changes (active → contained → resolved)
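
A minimal sketch of the upsert step. The `Disaster` shape, the `/disasters` endpoint, and the HTTP method are assumptions for illustration, not the actual API contract; only the `(source, external_id)` pair and the status values come from this proposal.

```ts
// Sketch only: the Disaster shape and endpoint below are assumptions,
// not the real disasters schema or API.
interface Disaster {
  source: string;        // e.g. "fogos.pt"
  external_id: string;   // ID assigned by the upstream source
  status: "active" | "contained" | "resolved";
  latitude: number;
  longitude: number;
  updated_at: string;    // ISO 8601 timestamp
}

// Upsert one record; relies on the API enforcing the (source, external_id)
// unique constraint so repeated polls don't create duplicates.
async function upsertDisaster(apiBaseUrl: string, record: Disaster): Promise<void> {
  const res = await fetch(`${apiBaseUrl}/disasters`, {
    method: "PUT", // hypothetical upsert endpoint
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(record),
  });
  if (!res.ok) {
    throw new Error(
      `Upsert failed for ${record.source}/${record.external_id}: ${res.status}`
    );
  }
}
```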
MVP Scope (Portugal only)
| Source | Endpoint | Frequency | Priority |
|---|---|---|---|
| fogos.pt | TBD (web scraping or API) | 5 min | P1 |
| PROCIV | TBD (RSS or API) | 5 min | P1 |
| NASA FIRMS | https://firms.modaps.eosdis.nasa.gov/api | 15 min | P2 |
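
As a rough illustration of how the table above could translate into configuration (the source names, intervals, and priorities mirror the table; the structure itself is just a sketch):

```ts
// Poll intervals per source, mirroring the MVP table above.
// The names are placeholders for adapter modules under adapters/.
const SOURCES = [
  { name: "fogos-pt", intervalMinutes: 5, priority: "P1" },
  { name: "prociv-pt", intervalMinutes: 5, priority: "P1" },
  { name: "nasa-firms", intervalMinutes: 15, priority: "P2" },
] as const;
```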
Architecture Options
- Separate Node.js service with cron (e.g., `node-cron`)
- Supabase Edge Function on a schedule
- GitHub Actions scheduled workflow (simplest for MVP)
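
If the separate Node.js service option is chosen, the scheduling could look roughly like this with `node-cron`. The `runAdapter` helper is hypothetical; the intervals mirror the MVP table.

```ts
import cron from "node-cron";

// Hypothetical helper that fetches from one source, maps it to the
// disasters schema, and upserts via the API.
async function runAdapter(name: string): Promise<void> {
  console.log(`[ingestion] running adapter: ${name}`);
  // fetch -> map -> upsert would go here
}

// Active Portuguese sources every 5 minutes (P1).
cron.schedule("*/5 * * * *", () => {
  runAdapter("fogos-pt").catch(console.error);
  runAdapter("prociv-pt").catch(console.error);
});

// NASA FIRMS every 15 minutes (P2).
cron.schedule("*/15 * * * *", () => {
  runAdapter("nasa-firms").catch(console.error);
});
```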
Future: Global Expansion
Each country/region will need its own source adapters. The worker should be designed with a pluggable adapter pattern:
```
adapters/
  fogos-pt.ts
  prociv-pt.ts
  nasa-firms.ts
  # Future:
  # copernicus-eu.ts
  # cal-fire-us.ts
```
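
One way the pluggable pattern could be expressed. The interface name and method signature are illustrative (reusing the hypothetical `Disaster` shape sketched above), not an existing contract:

```ts
// Illustrative adapter contract; each file under adapters/ would export one.
interface SourceAdapter {
  /** Identifier stored in the `source` column, e.g. "fogos.pt". */
  source: string;
  /** Fetch current events upstream and map them to the disasters schema. */
  fetchDisasters(): Promise<Disaster[]>;
}

// Example registration; a scheduler would iterate over this list.
// The commented entries are placeholders for the adapter files shown above.
const adapters: SourceAdapter[] = [
  // new FogosPtAdapter(),
  // new ProcivPtAdapter(),
  // new NasaFirmsAdapter(),
];
```

Adding a new country or region then means adding one adapter file and registering it, with no changes to the scheduler or upsert logic.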
Labels
enhancement, infrastructure