Apify & Crawlee
spotty-amber · 4y ago · 12 replies

How to recrawl initial page for new links without purging to keep track of duplicates?

Thank you for releasing Crawlee as an open source project.

I have set up a very simple CheerioCrawler that, for starters, crawls the home page of a news website.
I then scrape certain articles and save some information to an external database. I aim to run the crawler at a regular interval (every few hours) to check for new links/articles.
I want to keep track of what I've crawled in previous runs so as not to re-visit those pages and waste resources (mine and the host's), so I set CRAWLEE_PURGE_ON_START to false to persist the request queue between runs.

Current State:
- After the first run, the home page is marked as "handled" and is never visited again on subsequent runs to look for new links within it.
Desired State:
- On each new run, crawl the same home page, and enqueue only the new links found for handling/scraping.

Is there a way to make my starting home page (example.com) re-crawlable on each run without purging? I believe it's something I can add within the default handler; I'm just not sure what exactly it is.
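For readers landing on this thread: one approach that fits this setup is to give the start request a per-run uniqueKey, so the request queue's deduplication skips only that one request without purging anything. A minimal sketch, assuming only Crawlee's documented `{ url, uniqueKey }` request shape; `freshStartRequest` is a hypothetical helper, not a Crawlee API:

```typescript
// Give the start URL a uniqueKey that changes on every run, so the persisted
// request queue never considers it "handled". Detail pages keep their default
// uniqueKey (derived from the URL), so already-scraped articles stay skipped.
export function freshStartRequest(
  url: string,
  runId: string = Date.now().toString(),
) {
  return {
    url,
    uniqueKey: `${url}#run-${runId}`,
  };
}

// In main.ts, instead of crawler.addRequests(startUrls):
// await crawler.addRequests([freshStartRequest("https://example.com")]);
```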

// .env
CRAWLEE_PURGE_ON_START=false

// main.ts
import { CheerioCrawler } from "crawlee";
import { router } from "./router.js";

const startUrls = ["https://example.com"]; // Crawlee requires absolute URLs
const crawler = new CheerioCrawler({
  requestHandler: router,
  maxRequestsPerCrawl: 5,
  maxConcurrency: 1,
  // maxRequestsPerMinute: 30,
});
const main = async () => {
  await crawler.addRequests(startUrls);
  await crawler.run();
};
main();

// router.ts
import { createCheerioRouter, EnqueueStrategy } from "crawlee";

export const router = createCheerioRouter();

router.addDefaultHandler(async ({ enqueueLinks, log }) => {
  log.info(`enqueueing new URLs`);
  await enqueueLinks({
    strategy: EnqueueStrategy.SameHostname,
    globs: [
      "https://example.com/news/*",
    ],
    label: "detail",
  });
});

router.addHandler("detail", async ({ request, $, log }) => {
  // CheerioCrawler exposes the parsed body as $, not a browser page object
  const title = $("title").text();
  log.info(`Title of ${request.loadedUrl} is '${title}'`);

  // Save to DB Code here
});
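The "Save to DB Code here" placeholder above is also worth a guard: if the queue is ever purged and articles get re-scraped, an upsert keyed by URL keeps the external database free of duplicate rows. A minimal sketch with an in-memory Map standing in for the database; `saveArticle` is a hypothetical helper, not part of the thread's code:

```typescript
// Upsert an article keyed by its URL: re-saving the same URL updates the
// existing entry instead of creating a duplicate. The Map stands in for
// whatever external database the crawler writes to.
type Article = { url: string; title: string };

const db = new Map<string, Article>();

export function saveArticle(article: Article): boolean {
  const isNew = !db.has(article.url);
  db.set(article.url, article); // insert, or update in place
  return isNew; // true only the first time this URL is seen
}
```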
Thank you.