quickest-silver•3y ago
I have 99 URLs in the queue, but the scraper finishes the crawl after only a few. Why?
The scraper finishes the crawl after only a few URLs every time. I have 99 URLs added to the queue.
This is my config:


8 Replies
quickest-silverOP•3y ago
I add urls like this:
quickest-silverOP•3y ago
I can see the urls added inside the queue
Maybe the URLs are not unique?
quickest-silverOP•3y ago
Ok, that was the issue, I think.
Thanks 👍
rare-sapphire•3y ago
@HonzaS what if I want to scrape the same URL again and again? Do I need to re-initialize the crawler?
Then you need a different uniqueKey for each request. By default, the uniqueKey is derived from the URL, but you can set your own.
https://crawlee.dev/api/core/class/Request#uniqueKey
rare-sapphire•3y ago
Thank you very much @HonzaS