Put the proxy at the start, or supply a function that returns a new proxy every time it's called. Either goes into the constructor of the Crawlee scraper.
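A minimal sketch of both options, assuming Crawlee's ProxyConfiguration with a PlaywrightCrawler; fetchProxyFor and the proxy URLs are hypothetical placeholders:

```ts
import { PlaywrightCrawler, ProxyConfiguration } from 'crawlee';

// Option A: a static list of proxies, rotated by Crawlee.
const proxyConfiguration = new ProxyConfiguration({
    proxyUrls: ['http://proxy-1.example.com:8000', 'http://proxy-2.example.com:8000'],
});

// Option B: a function that returns a fresh proxy URL on every call.
// const proxyConfiguration = new ProxyConfiguration({
//     newUrlFunction: async (sessionId) => fetchProxyFor(sessionId), // hypothetical helper
// });

const crawler = new PlaywrightCrawler({
    proxyConfiguration, // passed in the crawler's constructor
    async requestHandler({ page }) {
        // scraping logic goes here
    },
});
```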
For session:
Set the session pool config so a session is invalidated after a single error (see the sketch after this list).
Use a preNavigation hook. In it, check whether the session's userData shows the user is already signed in. If not, update the session with new user data and the other patterns associated with that user, sign the user in, and attach the cookies to the context. (If there is user data, it means the user is already signed in.)
Initialize the session pool size to match the number of accounts, so one user maps to one session.
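A minimal sketch of that session setup, assuming a PlaywrightCrawler; loadAccounts, pickFreeAccount, and signIn are hypothetical helpers, and the cookie URL is a placeholder:

```ts
import { PlaywrightCrawler } from 'crawlee';

const accounts = loadAccounts(); // hypothetical: one entry per account

const crawler = new PlaywrightCrawler({
    useSessionPool: true,
    persistCookiesPerSession: true,
    sessionPoolOptions: {
        maxPoolSize: accounts.length, // one session per account
        sessionOptions: {
            maxErrorScore: 1, // a single error retires the session
        },
    },
    preNavigationHooks: [
        async ({ session, page }) => {
            if (!session) return;
            // No account on userData means this session was never signed in.
            if (!session.userData.account) {
                const account = pickFreeAccount(accounts); // hypothetical helper
                session.userData.account = account;
                // signIn is a hypothetical helper that performs the login
                // and returns the resulting cookies.
                const cookies = await signIn(account, page);
                session.setCookies(cookies, 'https://example.com');
            }
            // With persistCookiesPerSession enabled, cookies stored on the
            // session are applied to the browser context before navigation.
        },
    ],
    async requestHandler({ page }) {
        // scraping logic goes here
    },
});
```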
For behaviour:
Manually customize user behaviour based on the context attached to the session's userData (a sketch follows).
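A sketch of what that could look like; typingDelayMs and scrollSteps are hypothetical per-user traits assumed to have been stored on the session's userData at sign-in time:

```ts
import { PlaywrightCrawler } from 'crawlee';

const crawler = new PlaywrightCrawler({
    async requestHandler({ page, session }) {
        // Fall back to defaults if this session carries no stored traits.
        const { typingDelayMs = 120, scrollSteps = 5 } = session?.userData ?? {};

        // Scroll in small, human-like increments with jittered pauses,
        // so each account behaves consistently but differently from others.
        for (let i = 0; i < scrollSteps; i++) {
            await page.mouse.wheel(0, 400);
            await page.waitForTimeout(typingDelayMs * (1 + Math.random()));
        }
    },
});
```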
Tried running it many times against a single URL, each request with a different uniqueKey. Even with maxUsageCount: 1 on the session options, a single session keeps being reused many times.
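For reference, a minimal repro along those lines (assuming a PlaywrightCrawler; example.com is a placeholder), logging session.id per request to observe the reuse:

```ts
import { PlaywrightCrawler } from 'crawlee';

const crawler = new PlaywrightCrawler({
    useSessionPool: true,
    sessionPoolOptions: {
        sessionOptions: { maxUsageCount: 1 }, // each session should be used only once
    },
    async requestHandler({ request, session }) {
        // If rotation worked as expected, each request would log a different id.
        console.log(request.uniqueKey, session?.id);
    },
});

// The same URL enqueued many times; deduplication is keyed on uniqueKey.
await crawler.run(
    Array.from({ length: 10 }, (_, i) => ({
        url: 'https://example.com/',
        uniqueKey: `run-${i}`,
    })),
);
```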