Apify & Crawlee · 3y ago · 3 replies
sacred-rose

If a request times out, continue crawling

How should I handle this scenario? I have 100 requests and 2 of them fail after retrying, but I don't want the crawler to throw an error and stop; I want it to continue crawling the rest of the URLs.
Is this done with the help of failedRequestHandler?
Thanks a lot !
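
For reference, a minimal sketch of how this might look with Crawlee's failedRequestHandler (the URLs and logging here are placeholders, and CheerioCrawler is just one crawler class this applies to):

```ts
import { CheerioCrawler } from 'crawlee';

const crawler = new CheerioCrawler({
    // Retry each failing request up to 2 times before giving up on it.
    maxRequestRetries: 2,
    async requestHandler({ request, $ }) {
        // Normal per-page scraping logic goes here.
        console.log(`Scraped ${request.url}: ${$('title').text()}`);
    },
    // Called once a request has exhausted all retries; the crawler
    // then moves on to the remaining requests instead of throwing.
    async failedRequestHandler({ request }, error) {
        console.error(`Request ${request.url} failed too many times: ${error.message}`);
    },
});

await crawler.run([
    'https://example.com/page-1',
    'https://example.com/page-2',
]);
```

As far as I understand, Crawlee already continues past permanently failed requests by default; failedRequestHandler just gives you a hook to log or record them instead of losing them silently.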