Apify & Crawlee · 3y ago
1 reply
sacred-rose

Prevent crawler from adding failed request to default RequestQueue

Is there a way to prevent the crawler from adding a failed request to the default RequestQueue?

const crawler = new PuppeteerCrawler({
    proxyConfiguration,
    requestHandler: router,
    maxRequestRetries: 25,
    requestList: await RequestList.open(null, [initUrl]),
    requestHandlerTimeoutSecs: 2000,
    maxConcurrency: 1,
}, config);

I'm using the default RequestQueue to add productUrls, and they're handled inside the default requestHandler. When one of them fails, I purposely throw an Error, expecting the failed request (which is the initUrl) to go back to the RequestList — but instead it also ends up in the default RequestQueue, which is not what I want.
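As far as I know, when a crawler is given both a RequestList and a RequestQueue, requests from the list are transferred into the queue so they can be retried there — a RequestList itself doesn't support retries, so a failed initUrl never "goes back" to the list. One possible way to stop a specific failed request from being re-enqueued is Crawlee's `errorHandler` hook together with the `request.noRetry` flag. A minimal sketch, reusing the poster's config (note: with `noRetry` set, the request is treated as permanently failed and goes to `failedRequestHandler`; it is not retried anywhere):

```javascript
import { PuppeteerCrawler, RequestList } from 'crawlee';

const crawler = new PuppeteerCrawler({
    proxyConfiguration,
    requestHandler: router,
    maxRequestRetries: 25,
    requestList: await RequestList.open(null, [initUrl]),
    requestHandlerTimeoutSecs: 2000,
    maxConcurrency: 1,
    // errorHandler runs after a request fails, before any retry.
    // Setting noRetry here should prevent Crawlee from scheduling
    // the request again via the default RequestQueue.
    errorHandler: async ({ request }) => {
        if (request.url === initUrl) {
            request.noRetry = true;
        }
    },
}, config);
```

Alternatively, if no retries are wanted at all, setting `maxRequestRetries: 0` disables retrying globally — but that also affects the productUrls, so the per-request `noRetry` approach is likely closer to what's asked here.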