Running the same crawler in parallel
This config is processed via p-queue and, depending on the crawler id, I want to run a specific crawler, e.g.:
Different sites might use the same crawler but with a different `requestHandler`. Currently, when running this, I get: `This crawler instance is already running, you can add more requests to it via crawler.addRequests()`
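The setup described above might look roughly like the following sketch. All names here (`CrawlerJob`, `buildCrawler`, `StubCrawler`, the handler ids) are illustrative assumptions, not the actual code, and the crawler is stubbed out; the key point is that the worker builds a *fresh* instance per job instead of reusing a shared one:

```typescript
// Hypothetical dispatch: each config entry carries a crawler id, and the
// p-queue worker picks the matching requestHandler. StubCrawler stands in
// for the real crawler class.

type RequestHandler = (url: string) => void;

class StubCrawler {
  constructor(private handler: RequestHandler) {}

  // Simplified stand-in for crawler.run(urls).
  async run(urls: string[]): Promise<string[]> {
    const seen: string[] = [];
    for (const url of urls) {
      this.handler(url);
      seen.push(url);
    }
    return seen;
  }
}

interface CrawlerJob {
  crawlerId: string;
  urls: string[];
}

// One requestHandler per site/crawler id (ids are made up here).
const handlers: Record<string, RequestHandler> = {
  testCrawler: (url) => console.log(`[test] ${url}`),
  shopCrawler: (url) => console.log(`[shop] ${url}`),
};

// Building a fresh instance per job avoids ever calling run() twice on
// the same instance concurrently.
function buildCrawler(id: string): StubCrawler {
  const handler = handlers[id];
  if (!handler) throw new Error(`unknown crawler id: ${id}`);
  return new StubCrawler(handler);
}

// This is what each p-queue task would invoke.
async function runJob(job: CrawlerJob): Promise<string[]> {
  return buildCrawler(job.crawlerId).run(job.urls);
}
```

With a shape like this, two jobs with the same `crawlerId` can run concurrently, because each gets its own instance.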
So it's not possible to spawn multiple crawlers of the same type at the same time? That would kinda mess up my mental model (and the current impl) a bit. If so, I guess I need to "collect" all data before running the actual crawler? Since different crawler "definitions" (e.g. `testCrawler`) require different configurations, this could get messy.

The code checks `if (!crawler)` and creates one if there isn't; but if there already is one, it still calls `await crawler.run(urls)`. That is the issue: you can have multiple crawlers, but you can't have multiple `crawler.run()` calls on the same instance at the same time.
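To make the distinction concrete, here is a minimal sketch with a mock standing in for the real crawler (`MockCrawler` and its timings are assumptions, not Crawlee internals): `run()` rejects when the same instance is already running, mirroring the error above, while two separate instances run in parallel without complaint.

```typescript
// MockCrawler mimics the guard that produces the
// "This crawler instance is already running" error.
class MockCrawler {
  private running = false;

  async run(urls: string[]): Promise<void> {
    if (this.running) {
      throw new Error(
        "This crawler instance is already running, " +
          "you can add more requests to it via crawler.addRequests()",
      );
    }
    this.running = true;
    try {
      // Simulate some async crawling work.
      await new Promise((resolve) => setTimeout(resolve, 10));
      void urls;
    } finally {
      this.running = false;
    }
  }
}

async function demo(): Promise<void> {
  // Two *instances* in parallel: fine.
  await Promise.all([
    new MockCrawler().run(["https://a.example"]),
    new MockCrawler().run(["https://b.example"]),
  ]);
  console.log("two instances in parallel: ok");

  // Two concurrent run() calls on the *same* instance: the second rejects.
  const shared = new MockCrawler();
  const first = shared.run(["https://a.example"]);
  try {
    await shared.run(["https://b.example"]);
  } catch (err) {
    console.log(`second run on same instance rejected: ${(err as Error).message}`);
  }
  await first;
}

demo();
```

Note that once the first `run()` resolves, calling `run()` again on the same instance sequentially is not a problem in this mock; it's only the overlapping calls that collide.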