xenial-black•3y ago
Retry using the browser
How can I make it so that it first tries to scrape using CheerioCrawler and, if the response is 403 or 401, falls back to PuppeteerCrawler?
5 Replies
The easiest approach is probably to push the failed requests to an array on the side and then run a PuppeteerCrawler afterwards. You can have multiple crawlers inside a single script.
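A minimal sketch of the two-pass pattern described above. The helpers `fetchCheap` and `fetchWithBrowser` are hypothetical stand-ins for a CheerioCrawler-style HTTP request and a PuppeteerCrawler-style browser request, not real Crawlee API; the point is just the "array on the side" that feeds the second pass.

```javascript
// Stand-in for a CheerioCrawler-style plain-HTTP fetch (hypothetical helper).
function fetchCheap(url) {
  // Pretend the site blocks anything under /protected with a 403.
  return url.includes('/protected')
    ? { status: 403 }
    : { status: 200, body: '<html>ok</html>' };
}

// Stand-in for a PuppeteerCrawler-style browser fetch (hypothetical helper).
function fetchWithBrowser(url) {
  return { status: 200, body: '<html>rendered</html>' };
}

function crawl(urls) {
  const needsBrowser = []; // the "array on the side" for blocked requests
  const results = [];

  // First pass: cheap HTTP requests.
  for (const url of urls) {
    const res = fetchCheap(url);
    if (res.status === 401 || res.status === 403) {
      needsBrowser.push(url); // defer this URL to the browser pass
    } else {
      results.push({ url, body: res.body });
    }
  }

  // Second pass: only the blocked URLs go through the (expensive) browser.
  for (const url of needsBrowser) {
    const res = fetchWithBrowser(url);
    results.push({ url, body: res.body });
  }

  return { results, retriedWithBrowser: needsBrowser };
}

const out = crawl(['https://example.com/a', 'https://example.com/protected/b']);
```

In a real Crawlee script the first loop would be a CheerioCrawler whose handler pushes blocked URLs into the array, and the second loop a PuppeteerCrawler run over that array.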
xenial-blackOP•3y ago
How do I push the failed requests?
xenial-blackOP•3y ago
What do you think of this approach?
The best practice is to just throw an error; the crawler will then retry the whole request on its own.
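A tiny mock of the "throw and let the crawler retry" behavior described above. `runWithRetries` is a hypothetical stand-in for the retry loop a Crawlee crawler runs internally (controlled by its `maxRequestRetries` option); throwing from the handler is what triggers another attempt.

```javascript
// Hypothetical stand-in for a crawler's built-in retry loop.
function runWithRetries(handler, maxRetries) {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    try {
      return handler(attempt); // success: stop retrying
    } catch (err) {
      if (attempt === maxRetries) throw err; // out of retries: give up
    }
  }
}

let calls = 0;
const result = runWithRetries((attempt) => {
  calls++;
  // Simulate a request that is blocked (e.g. HTTP 403) on the first two tries.
  if (attempt < 2) throw new Error('HTTP 403');
  return 'ok';
}, 3);
```

In an actual request handler you would inspect the response status and `throw` on 401/403, letting the crawler schedule the retry for you.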