Apify & Crawlee
nehalist • 2mo ago • 3 replies

Running the same crawler in parallel

👨‍💻Web-Scraping
I've got a config file like

jobs:
  - name: "amazon.de"
    enabled: true

    crawler:
      id: "test"
      enabled: true
      config:
        urls:
          - "https://example.com"

  - name: "amazon.fr"
    enabled: true

    crawler:
      id: "test"
      enabled: true
      config:
        urls:
          - "https://example.com"

This config is processed via p-queue and, depending on the crawler id, I want to run a specific crawler, e.g.:

import { z } from "zod";
import { PlaywrightCrawler } from "crawlee";

// createCrawler is my own helper (defined elsewhere)
let crawler: PlaywrightCrawler | undefined;

export const testCrawler = createCrawler({
  id: "test",

  configSchema: z.object({
    urls: z.array(z.string()),
  }),

  handler: async ({ urls }) => {
    if (!crawler) {
      crawler = new PlaywrightCrawler({
        async requestHandler({ request, log }) {
          log.info(`Processing: ${request.url}`);
        },
      });
    }
    await crawler.run(urls);
  },
});

Different sites might use the same crawler but a different requestHandler. Currently, when running this, I get:

This crawler instance is already running, you can add more requests to it via crawler.addRequests()

So it's not possible to spawn multiple crawlers of the same type at the same time? That would kinda mess up my mental model (and the current impl) a bit. If so, I guess I need to "collect" all data before running the actual crawler? Since different crawler "definitions" (e.g. testCrawler) require different configurations, this could get messy.
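
The behavior can be illustrated with a toy model of that guard (an illustration only, not crawlee's actual implementation): run() rejects while a run on the same instance is still in flight, but two separate instances run concurrently just fine.

```typescript
// Toy model of the "already running" guard (illustration only, not
// crawlee's real implementation): run() rejects while a run on the SAME
// instance is in flight; separate instances can run concurrently.
class ToyCrawler {
  private running = false;

  async run(urls: string[]): Promise<string[]> {
    if (this.running) {
      throw new Error("This crawler instance is already running");
    }
    this.running = true;
    try {
      const processed: string[] = [];
      for (const url of urls) {
        await new Promise<void>((resolve) => setTimeout(resolve, 10)); // simulate work
        processed.push(url);
      }
      return processed;
    } finally {
      this.running = false;
    }
  }
}

async function demo() {
  // Two overlapping runs on ONE instance: the second call rejects.
  const shared = new ToyCrawler();
  const overlapping = await Promise.allSettled([
    shared.run(["https://example.com/a"]),
    shared.run(["https://example.com/b"]),
  ]);
  console.log(overlapping.map((r) => r.status)); // [ 'fulfilled', 'rejected' ]

  // Two SEPARATE instances at the same time: both succeed.
  const separate = await Promise.all([
    new ToyCrawler().run(["https://example.com/a"]),
    new ToyCrawler().run(["https://example.com/b"]),
  ]);
  console.log(separate.flat().length); // 2
}

demo();
```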
Solution
Hey, there is a slight problem: your code first checks whether there already is a crawler (if (!crawler)) and creates one if not, but if there already is one, it still calls await crawler.run(urls) on it. That is the issue: you can have multiple crawlers, but you can't have multiple crawler.run calls on the same instance at the same time.
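
One way out, sketched here with hypothetical stand-ins (CrawlerDef, createCrawler, and StubCrawler below are stubs made up for this sketch, since the full crawlee/zod setup isn't shown): create a fresh crawler instance inside the handler on every call, so concurrent jobs never share a running instance.

```typescript
// Sketch of the fix with made-up stand-ins (CrawlerDef, createCrawler and
// StubCrawler are hypothetical stubs, not the real crawlee API): the key
// change is building a fresh crawler inside the handler on every call.

type CrawlerDef<C> = {
  id: string;
  handler: (config: C) => Promise<void>;
};

// Hypothetical stand-in for the question's createCrawler helper.
function createCrawler<C>(def: CrawlerDef<C>): CrawlerDef<C> {
  return def;
}

// Minimal stub in place of PlaywrightCrawler, with the same
// one-run-at-a-time guard.
class StubCrawler {
  private running = false;
  constructor(private requestHandler: (url: string) => void) {}

  async run(urls: string[]): Promise<void> {
    if (this.running) throw new Error("This crawler instance is already running");
    this.running = true;
    try {
      for (const url of urls) {
        await new Promise<void>((resolve) => setTimeout(resolve, 5)); // simulate page work
        this.requestHandler(url);
      }
    } finally {
      this.running = false;
    }
  }
}

export const testCrawler = createCrawler<{ urls: string[] }>({
  id: "test",
  handler: async ({ urls }) => {
    // Fresh instance per invocation instead of a cached module-level one:
    // two jobs using id "test" can now run at the same time.
    const crawler = new StubCrawler((url) => console.log(`Processing: ${url}`));
    await crawler.run(urls);
  },
});
```

With the real PlaywrightCrawler the change has the same shape: move the new PlaywrightCrawler(...) call inside the handler and drop the shared crawler variable (typically at the cost of a separate browser per job; alternatively, keep one instance and let the p-queue serialize the runs).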