Session Cookies
When I use the BeautifulSoup crawler, I would expect the session to store received cookies and reuse them in subsequent requests, but the session cookies are empty. Any idea why, and how to make it work?
Session cookies are tied to a specific `Session`. By default, each request can be handled by a different `Session` from the pool, so when you inspect `session.cookies` in the follow-up handler you may be looking at another `Session` whose cookie jar is empty. To reuse the cookies, bind the follow-up request to the same session by passing `session_id`:

```python
import asyncio

from crawlee import Request
from crawlee.crawlers import BeautifulSoupCrawler, BeautifulSoupCrawlingContext


async def test() -> None:
    crawler = BeautifulSoupCrawler()

    @crawler.router.default_handler
    async def set_cookies(ctx: BeautifulSoupCrawlingContext) -> None:
        ctx.log.info(f'Cookies set to {ctx.session.id}')
        # Pin the follow-up request to the current session so its cookies are reused.
        await ctx.add_requests([
            Request.from_url(
                'https://httpbin.org/cookies',
                label='GET',
                session_id=ctx.session.id,
            )
        ])

    @crawler.router.handler('GET')
    async def get_cookies(ctx: BeautifulSoupCrawlingContext) -> None:
        ctx.log.info(f'Cookies retrieved from {ctx.session.id}')
        print(ctx.session.cookies)
        print(ctx.http_response.read())

    await crawler.run(['https://httpbin.org/cookies/set/a/1'])


asyncio.run(test())
```