fascinating-indigo•3y ago
Is there a way to close the browser in PuppeteerCrawler?
My crawler gets stuck with request timeouts at a concurrency of 20. If I could close the browser on a request timeout, that might solve the issue.

2 Replies
This is supposed to be handled by higher-level logic: either manage sessions, max pages, and retries in the crawler, or run the crawler with batches of URLs. It sounds like your crawler is being blocked per site visit.
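As a rough sketch of that higher-level approach, this is the kind of options object you could pass to `new PuppeteerCrawler(...)` in Crawlee. The option names here assume Crawlee v3 (`maxRequestRetries`, `sessionPoolOptions`, `browserPoolOptions.retireBrowserAfterPageCount`); the specific numbers are just illustrative, so check them against your version's docs:

```javascript
// Sketch of PuppeteerCrawler options (Crawlee v3 names assumed).
// Instead of manually closing browsers, let the pool rotate them out.
const crawlerOptions = {
    maxConcurrency: 20,
    // Give up on a request after a few failed attempts.
    maxRequestRetries: 3,
    // Fail the request handler instead of letting it hang forever.
    requestHandlerTimeoutSecs: 60,
    // Tie requests to sessions so a bad session can be retired.
    useSessionPool: true,
    sessionPoolOptions: { maxPoolSize: 20 },
    browserPoolOptions: {
        // Close and replace each browser after it has served this
        // many pages, so a wedged browser does not live forever.
        retireBrowserAfterPageCount: 50,
    },
};
```

With settings like these, browsers get recycled automatically and a timed-out request is retried in a fresh context rather than blocking the whole run.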
You can simply throw an error or return from `requestHandler`. I would also reduce concurrency to 1 and try to debug where it gets stuck.