living-lavender
Apify & Crawlee • 4y ago
5 replies

How do I disable the duplicate-request check?

import { HttpCrawler, log, LogLevel } from 'crawlee';

log.setLevel(LogLevel.DEBUG);

const crawler = new HttpCrawler({
    useSessionPool: false,
    persistCookiesPerSession: false,
    minConcurrency: 1,
    maxConcurrency: 5,
    maxRequestRetries: 1,
    requestHandlerTimeoutSecs: 30,
    maxRequestsPerCrawl: 10,
    async requestHandler({ request, body }) {
        log.debug(`Processing ${request.url}...`);
        log.debug(`${body}`);
    },
    failedRequestHandler({ request }) {
        log.debug(`Request ${request.url} failed twice.`);
    },
});
await crawler.run([
    // the second URL is deduplicated by the request queue and never processed
    'https://httpbin.org/ip', 'https://httpbin.org/ip',
]);
log.debug('Crawler finished.');


This is my current code.