Saving working configurations & Sessions for each site
I'm new to Crawlee and super excited to migrate my scraping architecture to it, but I can't find how to achieve the following.
My use case:
I'm scraping 100 websites multiple times a day. I'd like to save the working configurations (cookies, headers, proxy) for each site.
From what I understand, Sessions are made for this.
However, I'd like to keep the working Sessions in my database: that way they persist even if the script shuts down.
Also, saving the working configurations in a database would be useful when scaling Crawlee to multiple server instances.
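To make the idea concrete, here is a minimal sketch of what I mean by persisting session state externally. Everything in it (`SessionState`, `saveSessionState`, the in-memory `Map` standing in for a real database table) is my own hypothetical illustration, not Crawlee's actual Session API:

```typescript
// Hypothetical shape of the per-site session state I'd like to persist.
// (Illustrative only -- not Crawlee's Session class.)
interface SessionState {
  siteId: string;
  cookies: Record<string, string>;
  headers: Record<string, string>;
  proxyUrl?: string;
  lastUsedAt: string; // ISO timestamp
}

// Stand-in for a real database table keyed by site id.
const db = new Map<string, string>();

// Serialize a working session so it survives a script shutdown.
function saveSessionState(state: SessionState): void {
  db.set(state.siteId, JSON.stringify(state));
}

// Restore the last known working session for a site, if any.
function loadSessionState(siteId: string): SessionState | undefined {
  const raw = db.get(siteId);
  return raw ? (JSON.parse(raw) as SessionState) : undefined;
}

// Usage: persist after a successful crawl, restore on the next run.
saveSessionState({
  siteId: 'example.com',
  cookies: { session_id: 'abc123' },
  headers: { 'user-agent': 'Mozilla/5.0' },
  proxyUrl: 'http://proxy.example:8000',
  lastUsedAt: new Date().toISOString(),
});

const restored = loadSessionState('example.com');
console.log(restored?.cookies.session_id); // → abc123
```

With the state stored centrally like this, several server instances could share the same pool of known-good sessions instead of each rebuilding them from scratch.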
My ideal scenario would be to save all the configuration for each site: the type of crawler used (Cheerio, got, Playwright), CSS selectors, proxy needs, headers, cookies, and so on.
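Roughly, I'm imagining a per-site record along these lines. The field names and the `SiteConfig` type are just my guess at a schema, not anything Crawlee defines:

```typescript
// Hypothetical per-site configuration record (my own schema sketch,
// not something Crawlee provides out of the box).
type CrawlerKind = 'cheerio' | 'got' | 'playwright';

interface SiteConfig {
  siteId: string;
  crawler: CrawlerKind;              // which crawler type to use for this site
  selectors: Record<string, string>; // named CSS selectors for extraction
  needsProxy: boolean;
  headers: Record<string, string>;
  cookies: Record<string, string>;
}

const exampleConfig: SiteConfig = {
  siteId: 'example.com',
  crawler: 'playwright',
  selectors: { title: 'h1.product-title', price: 'span.price' },
  needsProxy: true,
  headers: { 'accept-language': 'en-US' },
  cookies: { consent: 'true' },
};

// A dispatcher would read this record and decide which crawler to build.
function pickCrawlerLabel(config: SiteConfig): string {
  return `${config.crawler}:${config.siteId}`;
}

console.log(pickCrawlerLabel(exampleConfig)); // → playwright:example.com
```

The idea is that a scheduler could load these records from the database and construct the right crawler per site, instead of hard-coding the setup in the script.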
Thanks a lot for your help!
