Apify RequestQueueClientAsync.update_request - 413 - Payload too large

Hi guys, I'm using the Python SDK of Apify.
I have a simple scraper that iterates through a pagination to collect some URLs.
At its current state, the scraper has executed 55 requests and collected 540 results.

But on the 55th page, I get an error from the SDK.
I set LOG_LEVEL to "debug" and got the following output:

```
2024-02-05T08:37:42.905Z [apify_client] DEBUG Request unsuccessful ({"status_code": 413, "client_method": "RequestQueueClientAsync.update_request", "resource_id": "fs", "method": "PUT", "url": "http://10.0.91.16:8010/v2/request-queues/fs/requests/fs**", "attempt": 1})
2024-02-05T08:37:42.907Z [apify_client] DEBUG Status code is not retryable ({"status_code": 413, "client_method": "RequestQueueClientAsync.update_request", "resource_id": "fs", "method": "PUT", "url": "http://10.0.91.16:8010/v2/request-queues/fsrequests/fs*", "attempt": 1})
```
(URLs masked so as not to share user data)

I can't imagine what the problem is: my script simply extracts 10 URLs from each page and then pushes one new request to the queue, that's it.
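For context, the logic is essentially the following (a simplified, stand-alone sketch, not my actual spider; the HTML structure, URLs, and helper names here are made up for illustration):

```python
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    """Collects href attributes from anchor tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)


def parse_page(html, page_number):
    """Extract item URLs from one pagination page and build the next request."""
    parser = LinkExtractor()
    parser.feed(html)
    results = [link for link in parser.links if "/item/" in link]
    # One follow-up request for the next pagination page.
    next_request = {"url": f"https://example.com/list?page={page_number + 1}"}
    return results, next_request


# Ten item links per page, plus one request pushed for the next page.
html = "".join(f'<a href="/item/{i}">item</a>' for i in range(10))
results, next_request = parse_page(html, page_number=55)
```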

Does anybody have an idea? Since I'm not executing the request myself, I can't change the payload.
Does anybody know what's happening in Apify's logic? The only big payload I can see in the logs is the 'userData' attribute added by Apify, which shows up in the "ApifyHttpProxyMiddleware.process_request: updated request.meta" calls.
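To illustrate my suspicion: if userData (carried along in request.meta) keeps accumulating state on every hop, then the serialized request that gets PUT to the request queue grows with each page, and some server-side size limit will eventually be hit no matter how small each page's contribution is. A rough way to see the growth (plain Python; the field names and sizes are hypothetical, not Apify's actual schema):

```python
import json


def payload_size(depth, per_page_urls=10, url_length=200):
    """Approximate JSON size (in bytes) of a request whose userData
    accumulates everything collected so far (hypothetical structure)."""
    user_data = {
        "depth": depth,
        # Suppose every URL extracted so far is dragged along in userData.
        "collected": ["x" * url_length for _ in range(depth * per_page_urls)],
    }
    request = {"url": f"https://example.com/list?page={depth}", "userData": user_data}
    return len(json.dumps(request).encode("utf-8"))


# The payload grows roughly linearly with depth.
sizes = [payload_size(d) for d in (1, 10, 55)]
```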

Btw, the depth is also at 55; maybe that's already too big?

Looking forward to your answers!

Thank you very much in advance 🙂