correct-apricot · 3y ago

Stopping the crawler when it's done scraping

Good day everyone. How can I make the crawler stop once it's done scraping/requesting a certain URL? I want to set up my Crawlee project so it keeps running continuously even when it has no URLs to request, e.g. waiting on a queue of URLs (Redis).
2 Replies
pleasant-yellow · 3y ago
Perhaps you could throw Crawlee's CriticalError, which will cause the crawler to shut down.
CriticalError | API | Crawlee
Errors of CriticalError type will shut down the whole crawler.
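A minimal sketch of that suggestion, assuming a CheerioCrawler and a hypothetical STOP_URL sentinel (neither is from the original thread; substitute whatever marks the end of your queue):
```ts
import { CheerioCrawler, CriticalError } from 'crawlee';

// Hypothetical sentinel URL marking the end of the queue.
const STOP_URL = 'https://example.com/last-page';

const crawler = new CheerioCrawler({
    async requestHandler({ request }) {
        // ... your normal scraping logic here ...

        // Throwing a CriticalError shuts down the whole crawler,
        // even though no real error has occurred.
        if (request.url === STOP_URL) {
            throw new CriticalError('Reached the final URL, shutting down.');
        }
    },
});

await crawler.run([STOP_URL]);
```
Note that this terminates the crawler run itself, so a long-running setup that waits on a Redis queue would need the surrounding process to restart or resume it.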
correct-apricot (OP) · 3y ago
How can I do that? Throw a CriticalError even when it's not encountering an actual error?
