Problem
The cron loop fetches at most 10 pages per category (200 items). For large categories like Technology, Games, or Design this cap silently truncates results — there is no signal that more pages exist.
```go
// cron.go
for page := 1; page <= 10; page++ { // hard cap: 10 pages ≈ 200 items per category
	campaigns, err := s.scrapingService.DiscoverCampaigns(catID, "newest", page)
	if err != nil {
		return err
	}
	if len(campaigns) == 0 {
		break // only breaks on empty page, not on "last page"
	}
	// ... persist campaigns ...
}
```
Expected Behaviour
Either:
- Increase the cap to a higher value (e.g. 25 pages) for root categories, OR
- Make the depth configurable per category (larger categories get more pages), OR
- Respect an explicit `hasNextPage` signal from the API response
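A minimal sketch of the second option (per-category depth). The category IDs, depth values, and the `maxPagesFor` helper are all hypothetical; the real mapping would live in config rather than a hard-coded map:

```go
package main

import "fmt"

// maxPagesFor returns a per-category page cap.
// IDs and depths below are illustrative only.
func maxPagesFor(catID int) int {
	depths := map[int]int{
		16: 25, // hypothetical ID for a large category (e.g. Technology)
		12: 25, // hypothetical ID for Games
	}
	if d, ok := depths[catID]; ok {
		return d
	}
	return 10 // fall back to the current default cap
}

func main() {
	fmt.Println(maxPagesFor(16)) // deep crawl for a large category
	fmt.Println(maxPagesFor(3))  // default depth for everything else
}
```

The cron loop would then run `for page := 1; page <= maxPagesFor(catID); page++` instead of the fixed `10`.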
Notes
- ScrapingBee cost for 25 pages × 15 categories = 375 credits/day, i.e. ~11,250/month — still under the 12,000-credit $49 plan
- The current `break` on empty page is correct but does not distinguish between "no results" and a parsing failure