
feat: make per-category page depth dynamic instead of hard-coded 10 #14

@Jing-yilin

Description

Problem

The cron loop fetches at most 10 pages per category (200 items). For large categories like Technology, Games, or Design this cap silently truncates results — there is no signal that more pages exist.

// cron.go
for page := 1; page <= 10; page++ { // hard-coded page cap
    campaigns, err := s.scrapingService.DiscoverCampaigns(catID, "newest", page)
    if err != nil {
        break // a scrape/parse failure ends the loop the same way an empty page does
    }
    if len(campaigns) == 0 {
        break // only breaks on empty page, not on "last page"
    }
    // … campaigns are processed here …
}

Expected Behaviour

Either:

  1. Increase the cap to a higher value (e.g. 25 pages) for root categories, OR
  2. Make the depth configurable per category (larger categories get more pages), OR
  3. Respect an explicit hasNextPage signal from the API response
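A minimal sketch of option 2, assuming a hypothetical `pageDepth` table and `maxPagesFor` helper (both names are invented for illustration; only `DiscoverCampaigns` exists in the codebase). Categories not listed fall back to the current cap of 10:

```go
package main

import "fmt"

// Hypothetical per-category depth table. Large categories get a
// deeper crawl; everything else keeps the current 10-page cap.
var pageDepth = map[string]int{
	"technology": 25,
	"games":      25,
	"design":     25,
}

// maxPagesFor returns the configured depth for a category,
// defaulting to the existing hard-coded cap of 10.
func maxPagesFor(catID string) int {
	if d, ok := pageDepth[catID]; ok {
		return d
	}
	return 10
}

func main() {
	fmt.Println(maxPagesFor("technology")) // configured: 25
	fmt.Println(maxPagesFor("crafts"))     // unconfigured: default 10
}
```

The cron loop would then read `for page := 1; page <= maxPagesFor(catID); page++`, leaving the break-on-empty logic unchanged.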

Notes

  • ScrapingBee cost for 25 pages × 15 categories = 375 credits/day (≈ 11,250/month), still under the 12,000-credit/month cap of the $49 plan
  • The current break on empty page is correct but does not distinguish between "no results" and a parsing failure
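One way to separate the two cases, sketched with an invented `parsePage` stand-in for the real scraping logic: return a nil slice with no error for a genuinely empty page, and a non-nil error on a parse failure, so the caller can log the failure instead of silently treating it as the last page.

```go
package main

import (
	"errors"
	"fmt"
)

// errParse marks a page that was fetched but could not be parsed.
var errParse = errors.New("failed to parse campaign listing")

// parsePage is a hypothetical stand-in for the scraper: an empty
// input models a genuinely empty results page, "<garbled>" models
// HTML the parser cannot handle, anything else is a normal page.
func parsePage(html string) ([]string, error) {
	switch html {
	case "":
		return nil, nil // no results: safe to stop paginating
	case "<garbled>":
		return nil, errParse // parse failure: log it, do not treat as last page
	default:
		return []string{html}, nil
	}
}

func main() {
	for _, page := range []string{"campaign-a", "", "<garbled>"} {
		items, err := parsePage(page)
		switch {
		case err != nil:
			fmt.Println("parse failure")
		case len(items) == 0:
			fmt.Println("last page")
		default:
			fmt.Println("got", len(items), "items")
		}
	}
}
```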
