Apify Discord Mirror

Updated 2 years ago

PlaywrightCrawler.requestHandler: Error: mouse.move: Target page, context or browser has been closed

At a glance

The community member is experiencing issues with their Playwright-based web crawler, where they sometimes encounter errors related to the page or browser being closed when trying to move the mouse randomly. The community members discuss potential causes, such as the website detecting the bot and closing the page, or the system resources being overloaded. Some suggestions include trying Playwright as it may be less detectable, reducing the concurrency, and checking for RAM or CPU issues. However, there is no explicitly marked answer in the comments.

In the PlaywrightCrawler.requestHandler I'm calling page.mouse.move, and
sometimes I get this error: mouse.move: Target page, context or browser has been closed

Here's the sequence of calls:

```javascript
async requestHandler( {request, response, page, enqueueLinks, log, proxyInfo} )
{
    ...
    await sleep( interval );
    await page.mouse.move( rnd(100,400), rnd(40,300) );
    await sleep( interval );
    ...
    content = await page.content();
}
```


If I catch the exception thrown by page.mouse.move and continue, then I get
almost the same thing when calling page.content():
page.content: Target page, context or browser has been closed

I would like to move the mouse randomly; I think that makes my scraper more "human-like".
But something is going wrong here and I cannot figure out what.
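One common way to make this resilient is to check Playwright's page.isClosed() before each interaction and to treat a "has been closed" error as a signal to abort the handler (so Crawlee can retry the request) rather than to continue with more page calls. A minimal sketch, assuming the rnd() helper from the snippet above and a hypothetical ifPageOpen() guard (neither is part of Crawlee's or Playwright's API):

```javascript
// Random integer in [min, max] -- the rnd() used in the handler above.
function rnd(min, max) {
    return Math.floor(Math.random() * (max - min + 1)) + min;
}

// Run a page interaction only while the page is still open.
// Returns false if the page was already closed, or closed mid-call.
async function ifPageOpen(page, action) {
    if (page.isClosed()) return false; // page.isClosed() is a real Playwright API
    try {
        await action();
        return true;
    } catch (err) {
        // The page can still crash between the check and the call;
        // that error surfaces here.
        if (/has been closed/.test(String(err))) return false;
        throw err; // unrelated errors should still fail the request
    }
}
```

Inside requestHandler you would then write something like `const moved = await ifPageOpen(page, () => page.mouse.move(rnd(100, 400), rnd(40, 300)));` and bail out of the handler when `moved` is false instead of calling page.content() on a dead page.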

Sometimes this code works and sometimes I see these errors!
Pls help!

UPDATE:
and sometimes the error message is:
ERROR requestHandler: Request failed and reached maximum retries. page.goto: Navigation failed because page was closed!
I think these error messages are somehow related...
4 comments
The page might detect the bot and close itself. As a quick solution, try Playwright (it's less detectable).
Page close usually means it just crashed because your CPU or memory was overloaded. Try to decrease concurrency or increase system resources.
I do not think this is my case (minConcurrency=maxConcurrency=4; I do not think this is a lot), but I will check this again.

CPU or memory was overloaded.
With CPU, this is clear.
With RAM: do you think this is "not enough RAM for one more browser" or "not enough RAM for the Node process"? What is the best way to detect a shortage of RAM?
I mean the browser; it depends a lot on how big the page is. Just try reducing concurrency to 1 and see if you still see the crashes. Also, don't set minConcurrency, because the pool cannot downscale then; use desiredConcurrency of autoscaledPoolOptions instead.
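The suggestion above maps onto Crawlee options roughly like this (a configuration sketch; the option names come from Crawlee's PlaywrightCrawler and AutoscaledPool documentation, and the values are examples, not recommendations):

```javascript
import { PlaywrightCrawler } from 'crawlee';

const crawler = new PlaywrightCrawler({
    // Don't pin minConcurrency: it prevents the autoscaled pool from
    // scaling down when memory or CPU gets tight.
    maxConcurrency: 4,
    autoscaledPoolOptions: {
        desiredConcurrency: 1, // start low; the pool scales up if resources allow
    },
    async requestHandler({ page, log }) {
        log.info(`Crawling ${page.url()}`);
    },
});
```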