Getting many 429 status codes when crawling the target site, even with proxies. How can I optimise my code?
At a glance
A community member is a Python coder who is not proficient in Node.js. They built a crawler with the Crawlee library, using options such as useSessionPool: true and useIncognitoPages: true together with residential proxies. However, they found that some pages were fetched from the same IP address as others, and they want to launch a different browser context for each target URL but don't know how to do that.
In the comments, another community member suggests setting maxErrorScore to 1 in the SessionOptions interface, so that a new proxy from the proxy configuration is assigned after each error.
The original poster thanks the community member for the suggestion and says they will test the method.
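The suggestion above can be sketched as a PlaywrightCrawler configuration. This is a minimal illustration, not the poster's actual code; the proxy URL is a placeholder, and the handler body is assumed:

```typescript
import { PlaywrightCrawler, ProxyConfiguration } from 'crawlee';

// Placeholder residential proxy endpoint -- substitute your provider's URLs.
const proxyConfiguration = new ProxyConfiguration({
    proxyUrls: ['http://user:pass@proxy.example.com:8000'],
});

const crawler = new PlaywrightCrawler({
    proxyConfiguration,
    useSessionPool: true,
    launchContext: {
        // Each page opens in its own incognito browser context.
        useIncognitoPages: true,
    },
    sessionPoolOptions: {
        sessionOptions: {
            // Retire a session after a single error, so a 429 response
            // immediately rotates to a fresh session and proxy.
            maxErrorScore: 1,
        },
    },
    requestHandler: async ({ page, request }) => {
        // ... extract data from the page here
    },
});
```

With maxErrorScore: 1, any error (including a 429) marks the session as retired, and the next request picks a new session with a different proxy from the pool.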
Hi guys. I am a Python coder but not good at Node.js. I made a crawler with Crawlee to bulk-check information.
These are my options: useSessionPool: true, useIncognitoPages: true,
and I am using residential proxies. However, I found that the IP of some pages is the same as others. I want to launch a different browser context for each target URL, but I don't know how to do that. Could anybody help me?
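One way to approach the per-URL isolation asked about here is sketched below, assuming a PlaywrightCrawler (not confirmed as the poster's exact setup). useIncognitoPages already gives each page its own browser context, but the proxy is tied to the session, so limiting each session to a single use forces a fresh proxy for every request:

```typescript
import { PlaywrightCrawler } from 'crawlee';

const crawler = new PlaywrightCrawler({
    useSessionPool: true,
    launchContext: {
        // A new incognito context per page: no shared cookies or cache.
        useIncognitoPages: true,
    },
    sessionPoolOptions: {
        sessionOptions: {
            // Each session is discarded after one use, so every request
            // is paired with a freshly picked proxy from the configuration.
            maxUsageCount: 1,
        },
    },
    browserPoolOptions: {
        // Optionally retire the whole browser after each page as well,
        // for the strongest isolation (at a performance cost).
        retireBrowserAfterPageCount: 1,
    },
    requestHandler: async ({ page }) => {
        // ... per-URL scraping logic
    },
});
```

Retiring the browser after every page is expensive; in many cases maxUsageCount: 1 with incognito pages is enough to get a distinct proxy and context per target URL.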