unwilling-turquoise•2y ago
Stopping the crawler when it is done scraping
Good day everyone. How can I make the crawler stop once it has finished scraping/requesting a certain URL?
I want to set up my Crawlee project so it keeps running even when it has no URLs to request, e.g. waiting on a queue of URLs (Redis).
2 Replies
extended-salmon•2y ago
CriticalError | API | Crawlee
Errors of the CriticalError type will shut down the whole crawler.
unwilling-turquoiseOP•2y ago
How can I do that? Throw a CriticalError even when it hasn't actually encountered an error?
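Yes, that is the idea: you can throw `CriticalError` deliberately from your request handler to shut the crawler down. A minimal sketch, assuming the JavaScript Crawlee API (`CriticalError` is exported from the `crawlee` package); the sentinel URL here is a hypothetical example:

```javascript
import { CheerioCrawler, CriticalError } from 'crawlee';

// Hypothetical sentinel URL: when the crawler reaches it, we stop everything.
const STOP_URL = 'https://example.com/stop';

const crawler = new CheerioCrawler({
    async requestHandler({ request, $ }) {
        if (request.url === STOP_URL) {
            // Throwing CriticalError aborts the entire crawler run,
            // even though no real error occurred.
            throw new CriticalError('Sentinel URL reached, shutting down.');
        }
        console.log(`Scraped ${request.url}: ${$('title').text()}`);
    },
});

await crawler.run(['https://example.com', STOP_URL]);
```

Note that this terminates the whole run, so it does not by itself give you the "keep running and wait on a Redis queue" behavior from the original question; for that you would need an outer loop that feeds URLs into the request queue and starts a new run when the queue refills.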