other-emerald•2y ago
Delay new requests
I receive new requests from a third-party source, and the crawler needs to wait (sleep) until a new request arrives. How can I implement this?
I tried creating a requestQueue: when a new request arrives, I add it to the queue and run the crawler again.
First I create the queue and the crawler and run it.
Then the crawler finishes its work with:
All requests from the queue have been processed, the crawler will shut down.
When I get a new request, I add it to the queue and run the crawler again,
but I get this message:
All requests from the queue have been processed, the crawler will shut down.
6 Replies
Are you sure you've written it correctly?
other-emeraldOP•2y ago
Where could I have made a mistake?
Before I run the crawler again, I check the queue
and get:
pendingRequestCount: 1
Then I start the crawler again
and get this:
INFO PlaywrightCrawler: Starting the crawler.
INFO PlaywrightCrawler: All requests from the queue have been processed, the crawler will shut down.
Are they to the same URL? Are they POST requests?
You could probably use solution explained here: https://discord.com/channels/801163717915574323/1170295320161308722
Basically, you keep the crawler running indefinitely, and new requests are processed immediately after being added to the queue.
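A minimal sketch of that pattern, assuming Crawlee's `keepAlive` crawler option; the `onNewRequest` helper and the example URL are illustrative, not from the original thread:

```typescript
import { PlaywrightCrawler } from 'crawlee';

// keepAlive keeps the crawler process alive even when the request
// queue drains, so it can pick up requests added later instead of
// logging "All requests from the queue have been processed, the
// crawler will shut down."
const crawler = new PlaywrightCrawler({
    keepAlive: true,
    requestHandler: async ({ request, log }) => {
        log.info(`Processing ${request.url}`);
        // ... scrape the page here ...
    },
});

// Start the crawler once; with keepAlive it idles when the queue
// is empty rather than shutting down. Note: run() will not resolve
// until the crawler is torn down.
const crawlerRun = crawler.run();

// Hypothetical handler: whenever the third-party source delivers a
// new request, add it to the already-running crawler's queue.
async function onNewRequest(url: string) {
    await crawler.addRequests([url]);
}

// e.g. await onNewRequest('https://example.com');
```

With `keepAlive: true` there is no need to call `run()` a second time for each new request, which is what triggered the shutdown message above. To stop the crawler eventually, Crawlee provides `crawler.teardown()`.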
other-emeraldOP•2y ago
Thank you, this solved my problem: