moderate-tomato
Apify & Crawlee3y ago
138 replies
moderate-tomato

Node running out of memory

I'm scraping several e-commerce stores in a single project, and after about 30k products Node crashes because it runs out of memory. Raising the amount of memory allocated to Node is not a good solution, as I plan to increase the incoming data at least 10x. The most obvious fix seems to be to scale horizontally and run a Node instance for each e-commerce store I want to scrape.

However, is there any way to reduce the amount of memory Crawlee uses? I'd be happy to use streaming to export the datasets, and the dataset items are already persisted to local files.