How to share an object between requests with Crawlee on Apify
This was easy in Apify SDK v2: I created an instance of the object and passed it as a parameter to the handleXY methods, like this:
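The original snippet is missing, but the SDK v2 pattern described above presumably looked something like the following sketch. The names (`MyData`, `handlePageFunction`) are illustrative, not the asker's actual code:

```javascript
// Minimal sketch of the SDK v2 pattern: one shared instance,
// reached by the handler through its closure.
class MyData {
  constructor() {
    this.items = [];
  }
  add(item) {
    this.items.push(item);
  }
}

// One instance, created before the crawler starts.
const myData = new MyData();

// In SDK v2 this function would be passed to the crawler,
// e.g. new Apify.CheerioCrawler({ handlePageFunction, ... }),
// so every handler invocation sees the same instance.
async function handlePageFunction({ request }) {
  myData.add(request.url);
}

// Stand-ins for the crawler invoking the handler per request:
handlePageFunction({ request: { url: 'https://example.com/a' } });
handlePageFunction({ request: { url: 'https://example.com/b' } });
console.log(myData.items.length); // 2
```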
This works without any problems. I need to achieve the same behavior with Crawlee, and I want to use routing. Since I can't pass any parameters to the handlers, I create an instance of myData, attach it to the crawler, and then read it back from there, like this:

However, I found that the run is sometimes restarted: it handles some requests, then a new Docker instance is created and handles the rest of the requests. When this new instance is created, I lose my instance of myData.

How can I solve this? Do I have to serialize this object to a Dataset/KeyValueStore? What about parallel requests? The best solution for me would be to keep all requests in one Docker instance. Is that possible somehow?
