Tools for working with Dungeon Crawl Stone Soup (DCSS) data: parse morgue files, load them into PostgreSQL, analyze games, and browse results in a Next.js web app.
- Parse raw morgue `.txt` files into structured JSON
- Load parsed game data into a PostgreSQL schema built for analytics
- Run a web UI for player lookup, morgue breakdowns, records, and analytics
- Download morgues from scoring/streak pages for bulk ingestion
This repo uses pnpm workspaces.
```
crawl-crawler/
├── apps/
│   └── web/                        # Next.js app (UI + API routes)
├── packages/
│   ├── dcss-morgue-parser/         # Morgue text -> structured data
│   ├── dcss-player-parser/         # Player page HTML -> structured data
│   ├── dcss-combo-records-parser/  # Combo records parser/CLI
│   ├── dcss-game-data/             # Static DCSS reference data
│   └── game-data-db/               # PostgreSQL schema + migrations
└── scripts/
    ├── streak-downloader/          # Download morgue files from streak pages
    ├── morgue-loader/              # Parse + load morgues into PostgreSQL
    ├── combo-records-updater/      # Refresh combo records JSON
    └── morgue-parser-diagnostic/   # Investigate parser edge cases
```
- Node.js 18+
- pnpm 9+
- PostgreSQL 14+ (for analytics/database-backed features)
```
pnpm install
```

Create `apps/web/.env` with PostgreSQL connection details:

```
PGHOST=localhost
PGPORT=5432
PGDATABASE=crawl_crawler
PGUSER=your_username
PGPASSWORD=your_password
```

If you fetch morgues from underhound.eu, also set:

```
UNDERHOUND_BASIC_AUTH_USERNAME=underhound_username
UNDERHOUND_BASIC_AUTH_PASSWORD=underhound_password
```

Run the database migrations:

```
pnpm db:migrate
```

The /records page reads from `apps/web/public/data/combo-records.json`, so refresh that file after setup (and whenever you want newer records):

```
pnpm download:combo-records
```

Start the dev server:

```
pnpm dev
```

Open http://localhost:3000.
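The `PG*` variables above follow libpq's standard environment-variable names, so most PostgreSQL clients pick them up automatically. As a rough sketch of how they map to a connection string (the `dsn_from_env` helper is hypothetical, not part of this repo):

```python
import os

def dsn_from_env(env=os.environ):
    """Build a libpq-style connection string from PG* environment variables."""
    parts = {
        "host": env.get("PGHOST", "localhost"),
        "port": env.get("PGPORT", "5432"),
        "dbname": env.get("PGDATABASE", ""),
        "user": env.get("PGUSER", ""),
        "password": env.get("PGPASSWORD", ""),
    }
    # Skip empty values so unset variables don't appear in the DSN.
    return " ".join(f"{k}={v}" for k, v in parts.items() if v)
```

Because these are libpq conventions, tools like `psql` read the same variables without any extra configuration.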
This site uses Google Analytics to understand how people use the app and prioritize improvements. It is not used for ads or monetization.
Google Analytics is only enabled in production when NEXT_PUBLIC_GA_MEASUREMENT_ID is set.
It does not run in local development.
If your main question is "how do I actually populate the DB?", start here.
Before loading, you need a directory of morgue .txt files.
If you do not already have morgues on disk, run the downloader:
```
cd scripts/streak-downloader
python download_morgues.py "http://crawl.akrasiac.org/scoring/streaks.html"
```

By default this writes files to `scripts/streak-downloader/outputs`.
See `scripts/streak-downloader/README.md` for options like sampling, output directory, and delay.
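The downloader fetches many morgue files from one server, which is why it supports a configurable delay. A minimal sketch of that pattern in generic Python (not the script's actual implementation; `fetch` and `delay_seconds` are placeholders):

```python
import time

def download_all(urls, fetch, delay_seconds=1.0):
    """Fetch each URL in turn, pausing between requests to be polite
    to the scoring server. `fetch` is any callable url -> content."""
    results = {}
    for i, url in enumerate(urls):
        results[url] = fetch(url)
        if i < len(urls) - 1:  # no need to sleep after the last request
            time.sleep(delay_seconds)
    return results
```

In practice the script pairs a real HTTP fetcher with the delay documented in its README.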
```
pnpm load:morgues <directory-with-morgue-txt-files>
```

Example:

```
pnpm load:morgues scripts/streak-downloader/outputs
```

For bulk loading via CSV + COPY:

```
cd scripts/morgue-loader
pnpm generate-csv <morgue-directory> <output-directory>
psql -d crawl_crawler -f <output-directory>/load.sql
pnpm mark:streaks-updated
```

Use this method for large datasets because it is much faster than row-by-row inserts.

After CSV + COPY, run `pnpm mark:streaks-updated` so the About page shows the latest streak load date and analytics caches are invalidated.
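To illustrate why CSV + COPY is faster: the loader batches rows into a file that PostgreSQL ingests in one streaming pass instead of thousands of individual INSERTs. A sketch of the idea (the column names and rows here are hypothetical, not the loader's real output format):

```python
import csv
import io

# Hypothetical parsed-game rows; the real loader derives these from morgue files.
games = [
    {"player": "player1", "species": "Minotaur", "turns": 91245},
    {"player": "player2", "species": "Deep Elf", "turns": 124903},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["player", "species", "turns"])
writer.writeheader()
writer.writerows(games)
csv_text = buf.getvalue()

# PostgreSQL would then ingest the whole file in one command, e.g.:
#   \copy games(player, species, turns) FROM 'games.csv' WITH (FORMAT csv, HEADER)
```

COPY parses and writes rows server-side in bulk, which is why the generated `load.sql` approach scales to large morgue collections.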
Combo records are not stored in PostgreSQL. They are downloaded to a JSON file used by the web app.
```
pnpm download:combo-records
```

This updates `apps/web/public/data/combo-records.json`.
- Download morgues with `scripts/streak-downloader`
- Run `pnpm db:migrate`
- Load morgues via `pnpm load:morgues ...` (or CSV + COPY for bulk)
- If you used CSV + COPY, run `pnpm mark:streaks-updated`
- Run `pnpm download:combo-records`
- Start the app with `pnpm dev`
- Explore `/analytics` and `/records` in the web UI
```
# App
pnpm dev
pnpm build
pnpm lint

# Parser package
pnpm build:parser
pnpm test:parser

# Database
pnpm db:migrate
pnpm db:migrate:down
pnpm db:reset
pnpm load:morgues <dir>
pnpm mark:streaks-updated

# Utilities
pnpm diagnose:morgue
pnpm diagnose:morgue:verbose
pnpm download:combo-records
```

- `packages/dcss-morgue-parser/README.md` - Morgue parser library and CLI
- `packages/dcss-player-parser/README.md` - Player page parser
- `packages/dcss-combo-records-parser/README.md` - Combo records parser
- `packages/dcss-game-data/README.md` - Static species/background/god/branch data
- `packages/game-data-db/README.md` - DB schema, migrations, and query utilities
- `scripts/streak-downloader/README.md` - Download morgue files from streak pages
- `scripts/morgue-loader/README.md` - Parse and load morgues into PostgreSQL
- `scripts/morgue-parser-diagnostic/README.md` - Troubleshoot parser behavior
- `scripts/combo-records-updater/README.md` - Update combo records data
- For bugs, features, and questions, open a GitHub issue using the provided templates.
- See `CONTRIBUTING.md` for contribution workflow and expectations.