fascinating-indigo · 3y ago

Is there a way to close the browser in PuppeteerCrawler?

My crawler gets stuck with request timeouts at a concurrency of 20. If I could close the browser on a request timeout, that might solve the issue.
2 Replies
Alexey Udovydchenko
This is supposed to be handled by higher-level logic: either manage sessions, max pages per browser, and retries in the crawler, or run the crawler in batches of URLs. It sounds like your crawler is being blocked on a per-visitor basis.
Lukas Krivka · 3y ago
You can simply throw an error or return from the requestHandler. I would reduce the concurrency to 1 and try to debug where it gets stuck.
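To illustrate the advice above, here is a minimal configuration sketch assuming Crawlee's PuppeteerCrawler. The option names (`retireBrowserAfterPageCount`, `requestHandlerTimeoutSecs`, `maxRequestRetries`) exist in recent Crawlee versions, but check them against the version you run; the concrete numbers are placeholders.

```javascript
import { PuppeteerCrawler } from 'crawlee';

const crawler = new PuppeteerCrawler({
    maxConcurrency: 20,
    maxRequestRetries: 3,
    // Abort a hanging requestHandler instead of letting it block a slot.
    requestHandlerTimeoutSecs: 60,
    browserPoolOptions: {
        // Close each browser after it has served this many pages,
        // so a wedged browser instance does not live forever.
        retireBrowserAfterPageCount: 50,
    },
    useSessionPool: true,
    async requestHandler({ page, session }) {
        // ... scrape the page ...
        // On signs of blocking, retire the session so Crawlee
        // rotates to a fresh one for subsequent requests:
        // if (blocked) session.retire();
    },
});

await crawler.run(['https://example.com']);
```

With this setup, a timed-out request is retried up to `maxRequestRetries` times, and browsers are recycled periodically instead of being closed manually from inside the handler.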