rival-black•3y ago
Best practice to stop/crash the actor/crawler on high ratio of errors?
The following snippet works well for me, but it smells... does somebody have a cleaner approach?
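A minimal sketch of this kind of kill switch, assuming Crawlee's PlaywrightCrawler; the thresholds and handler wiring are illustrative, not the original snippet:

```ts
import { PlaywrightCrawler } from 'crawlee';

// Illustrative thresholds, tune to your workload.
const ERROR_RATIO_LIMIT = 0.3; // abort when >30% of finished requests failed
const MIN_SAMPLE = 50;         // don't judge the ratio on too few requests

const crawler = new PlaywrightCrawler({
    async requestHandler({ request, page }) {
        // ... scraping logic ...
    },
    // Called once a request has exhausted all of its retries.
    async failedRequestHandler({ request, log }) {
        const { requestsFailed, requestsFinished } = crawler.stats.state;
        const total = requestsFailed + requestsFinished;
        if (total >= MIN_SAMPLE && requestsFailed / total > ERROR_RATIO_LIMIT) {
            log.error(`Error ratio too high (${requestsFailed}/${total}), aborting crawl.`);
            // Stops the crawler's autoscaled pool, after which crawler.run() resolves.
            await crawler.autoscaledPool?.abort();
        }
    },
});

await crawler.run(['https://example.com']);
```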
There is now some message on Apify which I guess comes from the crawler when there are problems. So maybe you can use that, if you find out what is generating that message.

rival-blackOP•3y ago
This @HonzaS guy knows stuff 🙏
You can use stats: https://crawlee.dev/api/browser-crawler/class/BrowserCrawler#stats. However, the approach itself is not safe: you are supposed to resolve blocking by handling sessions and/or bot protection in logic, not by hammering the website with many runs. I.e. set concurrency, max request retries, logic for session.markBad(), etc., and implement a scalable crawler.
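A sketch of that more robust setup, again assuming Crawlee's PlaywrightCrawler; the limits and the 403 check are illustrative placeholders:

```ts
import { PlaywrightCrawler } from 'crawlee';

const crawler = new PlaywrightCrawler({
    // Bound the damage a blocked target can do instead of hammering it.
    maxConcurrency: 10,
    maxRequestRetries: 3,
    // Rotate sessions (and with them cookies) automatically.
    useSessionPool: true,
    persistCookiesPerSession: true,
    sessionPoolOptions: { maxPoolSize: 100 },
    async requestHandler({ page, response, session }) {
        // Treat a soft block (status code, captcha page, etc.) as a bad
        // session rather than a generic failure to retry blindly.
        if (response?.status() === 403) {
            session?.markBad();
            throw new Error('Blocked, retrying with a different session');
        }
        // ... scraping logic ...
    },
});

await crawler.run(['https://example.com']);
```

session.markBad() raises the session's error score so the pool rotates it out gradually; session.retire() drops it immediately, which can be the better choice for hard blocks.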