Apify & Crawlee • 3y ago • 9 replies
verbal-lime

Named Request Queues not getting purged

Hey folks,

I am explicitly creating request queues for my crawlers to make sure that crawler-specific run options such as `maxRequestsPerCrawl` can be set on a crawler-by-crawler basis. The issue with this approach is that the request queues are not getting purged after every crawl, so the crawlers resume the session from before. These are the approaches I tried:

1. I have tried setting the option `purgeRequestQueue` to `true` explicitly in the `crawler.run()` call, but it results in this error: `Did not expect property purgeRequestQueue to exist, got true in object options`

2. Setting it as a global variable in `crawlee.json` (it looks like Crawlee is not picking up my crawlee.json file at all, because I tried to set logging levels in it and Crawlee didn't pick them up).
3. Calling `await purgeDefaultStorages()` in my entry file.
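For approach 2, this is the kind of `crawlee.json` I have in the project root (assuming `purgeOnStart` and `logLevel` are valid configuration keys in this Crawlee version; even if the file were picked up, my understanding is that purge-on-start only touches the *default* storages, not named ones):

```json
{
  "purgeOnStart": true,
  "logLevel": "DEBUG"
}
```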

None of these options works. Is there some other way to purge these queues? I know purging is the default behavior, but it isn't happening for my named queues.
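For context, here is a simplified sketch of my setup (queue and crawler names are placeholders; the final `drop()` call is the manual workaround I'm falling back to, since named storages seem to survive runs unless deleted explicitly):

```javascript
// Crawlee v3, ESM. Simplified; identifiers are placeholders.
import { CheerioCrawler, RequestQueue } from 'crawlee';

// A named queue, created explicitly so this crawler has its own storage.
const listingQueue = await RequestQueue.open('listing-queue');

const listingCrawler = new CheerioCrawler({
    requestQueue: listingQueue,
    maxRequestsPerCrawl: 50, // meant to apply to this crawler only
    async requestHandler({ request, log }) {
        log.info(`Handling ${request.url}`);
    },
});

await listingCrawler.run(['https://example.com']);

// Manual workaround: drop() deletes the named queue entirely,
// so the next run starts fresh instead of resuming the old session.
await listingQueue.drop();
```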

Also, is using named queues the best way to isolate crawler-specific options for each crawler? When I used the default queue and restricted crawls to some numeric value in one crawler, all the other crawlers would also shut down once that crawler reached its limit, each logging that max requests per crawl had been reached, despite me never specifying that option when I initialized them.
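To illustrate the isolation I'm after (names are placeholders): as far as I can tell, the limit is checked against the queue's handled-request count, which would explain why a shared default queue trips every crawler at once. With separate named queues, each crawler's count is its own:

```javascript
// Sketch: two crawlers with isolated queues and limits (Crawlee v3, ESM).
import { CheerioCrawler, RequestQueue } from 'crawlee';

// Each crawler gets its own named queue so maxRequestsPerCrawl cannot
// leak between them the way it did when they shared the default queue.
const productQueue = await RequestQueue.open('product-queue');
const reviewQueue = await RequestQueue.open('review-queue');

const productCrawler = new CheerioCrawler({
    requestQueue: productQueue,
    maxRequestsPerCrawl: 100, // limit intended for this crawler only
    async requestHandler({ request, log }) {
        log.info(`product: ${request.url}`);
    },
});

const reviewCrawler = new CheerioCrawler({
    requestQueue: reviewQueue,
    // no maxRequestsPerCrawl here: this crawler should be unrestricted
    async requestHandler({ request, log }) {
        log.info(`review: ${request.url}`);
    },
});

await productCrawler.run(['https://example.com/products']);
await reviewCrawler.run(['https://example.com/reviews']);
```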