Apify & Crawlee
brilliant-lime • 2y ago
4 replies

brilliant-lime
How to throttle enqueuing URLs to the next router

        splitAndExecute({
            callback: async (urlBatch, batchIndex) => {
                // log that we are enqueuing the nth batch of preview jobs from job-id etc
                logger.info(`Linkedin/Scraper - Enqueuing ${urlBatch.length} jobs to detail page handler - Batch ${batchIndex + 1}`);
                await enqueueLinks({
                    urls: urlBatch,
                    label: LinkedinRouterLabels.JOB_DETAIL_PAGE,
                    userData: createLinkedinRouterUserData(payload),
                    waitForAllRequestsToBeAdded: false
                });
                const minSleepTime = 2000 * (batchIndex + 1);
                const maxSleepTime = 3000 * (batchIndex + 1);
                await random_sleep(minSleepTime, maxSleepTime);

            },
            urls: jobDetailPageUrls,
            maxRequestsPerBatch: 2,
        });


Hello guys. I have a router that scrapes a URL list and enqueues the URLs to the next router.
However, I want to throttle the enqueuing to limit the request rate to the website.

I've tried adding the crawler configuration, but it doesn't work as intended: even when I set a limit of requests/min or requests/crawl, the limit isn't respected.
Initially I thought that's because the limit is only checked after a URL list has been enqueued, so if you enqueue a list larger than the limit in one go, the limit never takes effect.
E.g. if the limit is 10 requests and I enqueue 25 requests as a single array.
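For reference, the crawler-level throttling I mean is along these lines (a sketch only; the crawler class and the numeric limits are illustrative, not my real values):

```typescript
import { CheerioCrawler, createCheerioRouter } from 'crawlee';

const router = createCheerioRouter();

// Illustrative throttling options on the crawler itself.
const crawler = new CheerioCrawler({
    maxRequestsPerMinute: 10, // cap on requests started per minute
    maxRequestsPerCrawl: 100, // hard cap on total requests for the whole crawl
    maxConcurrency: 2,        // at most 2 requests handled in parallel
    requestHandler: router,   // the router that dispatches by label
});
```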

So I manually split the job-URLs array into multiple smaller batches.
However, that doesn't work either: the enqueuing is definitely done with sleep intervals in between, but the next router is still invoked all at once after all the batches have been enqueued.
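For context, the helpers used in the snippet above are roughly these (assumed implementations, since I didn't paste them):

```typescript
/** Sleep for a random duration between minMs and maxMs milliseconds. */
function random_sleep(minMs: number, maxMs: number): Promise<void> {
    const ms = minMs + Math.random() * (maxMs - minMs);
    return new Promise((resolve) => setTimeout(resolve, ms));
}

interface SplitAndExecuteOptions {
    urls: string[];
    maxRequestsPerBatch: number;
    callback: (urlBatch: string[], batchIndex: number) => Promise<void>;
}

/** Split urls into batches of at most maxRequestsPerBatch and run the callback on each, sequentially. */
async function splitAndExecute({ urls, maxRequestsPerBatch, callback }: SplitAndExecuteOptions): Promise<void> {
    for (let i = 0; i * maxRequestsPerBatch < urls.length; i++) {
        const batch = urls.slice(i * maxRequestsPerBatch, (i + 1) * maxRequestsPerBatch);
        await callback(batch, i);
    }
}
```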