Apify & Crawlee
worthy-azure · 4y ago · 6 replies

How to share object between requests with Crawlee on Apify

Hello. While scraping a website, I need access to an object that is shared between all requests. I keep some data in this object; every request can read/write it. When all requests have been handled, I run some validation and calculations on the data and write the result to the Dataset.

It was easy in Apify SDK v2. I created an instance of the object and passed it as a parameter to the handleXY methods. Like this:
const myData = new MyData();

const crawlerOptions = {
  handlePageFunction: async (context) => {
      switch (context.request.userData.type) {
        case "pageA": await handleBranch(myData); break;
        default: await handleStart(myData);
      }
    },
};

const crawler = new Apify.PuppeteerCrawler(crawlerOptions);
await crawler.run();
await Apify.pushData(myData.getData());

This works without any problems. I need to achieve the same behavior with Crawlee, and I want to use routing. Since I can't pass any parameters to the handlers, I create an instance of myData, set it on the crawler, and then read it from there. Like this:

// main.js
const crawler = new PuppeteerCrawler();
crawler.myData = new MyData();

// routes.js
router.addDefaultHandler(async ({ crawler }) => {
  const myData = crawler.myData;
});

However, I found that sometimes the run is restarted somehow. It handles some requests, then a new Docker container is created and handles the rest of the requests. When this new container starts, I lose the instance of myData.

2022-10-11T12:56:01.157Z INFO  Request N
2022-10-11T12:56:20.894Z ACTOR: Pulling Docker image from repository.
2022-10-11T12:56:42.031Z ACTOR: Creating Docker container.
2022-10-11T12:56:42.303Z ACTOR: Starting Docker container.
2022-10-11T12:56:54.251Z INFO Request N + 1


How can I solve this issue? Do I have to serialize this object to a Dataset/KeyValueStore? What about parallel requests? The best solution for me would be to keep all requests in one Docker container. Is that possible somehow?
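For reference, the serialization route mentioned above can be sketched as follows, assuming the shared object can be reduced to plain JSON. The in-memory `store` below is only a stand-in for the Apify key-value store (in a real Actor you would use `Actor.getValue`/`Actor.setValue`, or Crawlee's `useState()`, which persists state automatically); all other names are illustrative.

```javascript
// Shared state object with explicit JSON (de)serialization.
class MyData {
  constructor(items = []) { this.items = items; }
  add(item) { this.items.push(item); }
  getData() { return this.items; }
  toJSON() { return { items: this.items }; }
  static fromJSON(json) { return new MyData(json?.items ?? []); }
}

const store = new Map(); // stand-in for the key-value store

function persist(myData) {
  // Serialize after every change (or on a periodic persist event) so a
  // restart never loses more than the last unsaved update.
  store.set('MY_DATA', JSON.stringify(myData.toJSON()));
}

function restore() {
  const raw = store.get('MY_DATA');
  return raw ? MyData.fromJSON(JSON.parse(raw)) : new MyData();
}

// First container: handle some requests and persist the state.
let myData = restore();
myData.add({ url: 'https://example.com', price: 42 });
persist(myData);

// Second container (after the restart): restore instead of starting empty.
myData = restore();
console.log(myData.getData()); // [{ url: 'https://example.com', price: 42 }]
```

Regarding parallel requests: within one container, Node is single-threaded, so concurrent handlers can safely mutate the same in-memory object as long as each mutation is synchronous; the store only needs to be written periodically, not locked per request.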