spotty-amber
Apify & Crawlee · 3y ago
1 reply
Concurrent crawlers or maxRequests per Queue?

I'm crawling many websites every day.
The ideal approach for me would be a maxRequestsPerMinute that depends on the website.
That way the crawler would run at full speed overall, but interleave pages from different websites so that no single site's request limit is exceeded.

I don't think that's possible, though. So how could I achieve this?
Should I run many crawlers at the same time? The problem with that is that it would require much more memory, right?
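For illustration, the idea described above (one crawler at full speed, with a per-website request budget) could be approximated with a small per-domain limiter like the sketch below. This is not Crawlee API; the class and method names are made up, and the sliding one-minute window is just one possible policy:

```typescript
// Hypothetical per-domain rate limiter (NOT part of Crawlee).
// Tracks request timestamps per hostname over a sliding 60-second window.
class DomainRateLimiter {
  private timestamps = new Map<string, number[]>();

  constructor(
    private limits: Map<string, number>, // per-domain requests/minute
    private defaultLimit = 60,           // fallback for unlisted domains
  ) {}

  // Returns true (and records the request) if the domain is under its
  // budget; returns false if the request should be deferred/requeued.
  canFetch(url: string, now: number = Date.now()): boolean {
    const domain = new URL(url).hostname;
    const limit = this.limits.get(domain) ?? this.defaultLimit;
    const windowStart = now - 60_000;
    // Keep only timestamps from the last minute.
    const recent = (this.timestamps.get(domain) ?? []).filter(
      (t) => t > windowStart,
    );
    if (recent.length >= limit) {
      this.timestamps.set(domain, recent);
      return false; // over budget for this domain right now
    }
    recent.push(now);
    this.timestamps.set(domain, recent);
    return true;
  }
}

// Usage: one crawler loop could consult the limiter before each request,
// skipping (or re-enqueueing) URLs whose domain is over budget.
const limiter = new DomainRateLimiter(new Map([["example.com", 2]]));
console.log(limiter.canFetch("https://example.com/a")); // true
console.log(limiter.canFetch("https://example.com/b")); // true
console.log(limiter.canFetch("https://example.com/c")); // false (over 2/min)
console.log(limiter.canFetch("https://other.org/x"));   // true (default limit)
```

In a single crawler this keeps memory usage to one instance while still letting slow sites throttle independently of fast ones; the alternative of one crawler per website buys the same isolation at the cost of one browser/queue/pool per site.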