Simple, config-driven lunch menu collector for nearby restaurants.
- Scrapes each source website with `requests` + BeautifulSoup.
- Writes a single local snapshot file: `data/current_menu.json`.
- Keeps the last successful menu when a restaurant removes its menu later in the day.
- Displays only the current day from weekly menus.
## Layout

- `config/restaurants.yaml`: source list and stale policy
- `scraper/update_menus.py`: scraper and aggregator
- `data/current_menu.json`: generated snapshot (overwritten each run)
- `web/`: static UI
- `scripts/update_menus.sh`: cron-friendly runner
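For orientation, the snapshot might look roughly like this (a hypothetical shape for illustration only; the field names are assumptions, not the actual schema):

```json
{
  "generated_at": "2024-05-01T11:30:00",
  "restaurants": [
    {
      "id": "formanka",
      "status": "ok",
      "items": ["Soup of the day", "Goulash with dumplings"]
    }
  ]
}
```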
## Quickstart

```
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
python3 scraper/update_menus.py
```

Then run a static server from the repo root:

```
python3 -m http.server 8080
```

Open http://localhost:8080/web/.
## Adding a restaurant

Edit `config/restaurants.yaml` and add a new entry:

```yaml
- id: my-restaurant
  name: My Restaurant
  parser: formanka
  url: https://example.com/menu
```

Available `parser` values in this version: `formanka`, `tradice`, `zlatyklas`.
If a new site has different HTML, add a parser function in `scraper/update_menus.py` and register it in the `parsers` map.
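A minimal sketch of what such a parser function could look like (this is an illustration, not the project's actual code; the `li.dish` selector and function name are assumptions that depend entirely on the target site's markup):

```python
# Hypothetical parser sketch for a new restaurant site.
from bs4 import BeautifulSoup


def parse_my_restaurant(html: str) -> list[str]:
    """Return today's menu items from the page HTML."""
    soup = BeautifulSoup(html, "html.parser")
    # Assumed markup: each dish sits in <li class="dish">.
    return [li.get_text(strip=True) for li in soup.select("li.dish")]
```

Register the function under a new key in the `parsers` map, then reference that key as the `parser` value in the YAML entry.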
## Cron

Run every 30 minutes during business hours:

```
*/30 7-16 * * 1-5 cd /var/www/obidek && /bin/bash scripts/update_menus.sh >> /var/log/obidek-update.log 2>&1
```

Optional safety refresh at 17:10:

```
10 17 * * 1-5 cd /var/www/obidek && /bin/bash scripts/update_menus.sh >> /var/log/obidek-update.log 2>&1
```

## Nginx

Point the web root to the repository folder:
```nginx
server {
    listen 80;
    server_name lunch.your-domain.tld;
    root /var/www/obidek;
    index web/index.html;

    location / {
        try_files $uri $uri/ /web/index.html;
    }

    location /data/ {
        add_header Cache-Control "no-store";
    }
}
```

## Stale policy

The scraper has a stale policy in `config/restaurants.yaml`:
```yaml
stale_policy:
  keep_last_successful: true
  hold_after_hour: 15
  max_age_hours: 36
```

If a website becomes empty or unavailable in the afternoon, the last successful menu is kept (`status: stale-kept`) so your display stays useful.
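The policy above can be sketched as follows (a simplified illustration under stated assumptions; the function name, field names, and record shape are hypothetical, not the actual implementation):

```python
# Sketch of the stale policy: prefer fresh data; after hold_after_hour,
# fall back to the last successful scrape if it is not older than
# max_age_hours. Record shape here is illustrative.
from datetime import datetime, timedelta


def apply_stale_policy(new_items, last_good, policy, now=None):
    """Decide which menu record to publish for one restaurant.

    new_items: list of items from the current scrape (may be empty).
    last_good: dict with "items" and "fetched_at" (ISO timestamp) from the
               previous successful run, or None.
    policy:    dict mirroring stale_policy in config/restaurants.yaml.
    """
    now = now or datetime.now()
    if new_items:
        return {"items": new_items, "status": "ok",
                "fetched_at": now.isoformat()}
    if (policy.get("keep_last_successful")
            and last_good
            and now.hour >= policy.get("hold_after_hour", 0)):
        age = now - datetime.fromisoformat(last_good["fetched_at"])
        if age <= timedelta(hours=policy.get("max_age_hours", 36)):
            return {**last_good, "status": "stale-kept"}
    return {"items": [], "status": "empty", "fetched_at": now.isoformat()}
```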