cryptorex
Apify & Crawlee • 2w ago
4 replies

Skipping URLs in the handler for https://domain.com/page.html due to the enqueueLinks limit of 0.

🎭 PlaywrightCrawler
As the title says, I'm not sure why we're receiving this message now. I don't want ANY URLs skipped, and there doesn't seem to be a way to make sure they get re-queued. Is this benign internal tracking messaging?

I upgraded from 3.13.0 -> 3.16.0 and noticed this behavior. From my research it seems to be tied to
maxRequestsPerCrawl
but that setting is meant to prevent infinite looping on certain sites. What am I missing?

Current setting is
maxRequestsPerCrawl: 20000
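For context, here is a minimal sketch of the kind of setup involved (the handler body and start URL are illustrative placeholders, not my actual code):

```typescript
import { PlaywrightCrawler } from 'crawlee';

const crawler = new PlaywrightCrawler({
    // Once this many requests have been handled, enqueueLinks() is given a
    // remaining-capacity limit of 0 and logs the "Skipping URLs" message.
    maxRequestsPerCrawl: 20000,
    async requestHandler({ request, enqueueLinks, log }) {
        log.info(`Processing ${request.url}`);
        // Links skipped due to the limit are dropped, not re-queued later.
        await enqueueLinks();
    },
});

await crawler.run(['https://domain.com/page.html']);
```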


Thanks.