NeoNomade
Apify & Crawlee · 3y ago · 21 replies

Map maximum size exceeded

I get the following error:
 WARN  PuppeteerCrawler: Reclaiming failed request back to the list or queue. Map maximum size exceeded

The script at this point is using 11 GB of RAM (I've allowed a 40 GB max heap size).
Last status message:
INFO  PuppeteerCrawler: Status message: Crawled 79500/16744431 pages, 0 errors.


How in the world can I overcome this issue?
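
For context (an assumption on my part, not something confirmed in this thread): V8's Map implementation has a hard cap of 2^24 entries regardless of heap size, which would explain why raising the max heap does not help. The request list here sits only about 33k entries under that cap, so any per-request bookkeeping (retry reclaims, dedup keys) could push it over. A quick sketch of the arithmetic:

```javascript
// Assumption: V8 (Node.js) caps a Map at 2^24 entries; exceeding it
// throws "RangeError: Map maximum size exceeded", regardless of heap size.
const V8_MAP_MAX_ENTRIES = 2 ** 24; // 16,777,216

// Totals taken from the crawler's status message above.
const totalRequests = 16_744_431;
const crawled = 79_500;

console.log(`Map cap:   ${V8_MAP_MAX_ENTRIES}`);
console.log(`Requests:  ${totalRequests}`);
console.log(`Headroom:  ${V8_MAP_MAX_ENTRIES - totalRequests}`); // ~33k entries to spare
```

If that is the cause, keeping 16.7M requests in one in-memory structure is the real bottleneck, and splitting the URL set into batches (or using a disk-backed queue rather than an in-memory list) would sidestep the cap.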