Mikmokki
Apify & Crawlee · 8mo ago
Crawler always stops after exactly 300 seconds

I use Crawlee Python in Docker and it always stops after exactly 300 seconds. I checked that it gets `asyncio.CancelledError` in the `AutoscaledPool.run()` method, but I don't know what sends it. If I try a simple Python example, `keep_alive` works, but in my Dockerized system it always emits the final statistics after 300 seconds and stops. I confirmed it happens with multiple different crawler types.
Solution
Never mind, it was arq's `job_timeout`.
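For anyone hitting the same wall: arq cancels a job's asyncio task once it runs past the worker's `job_timeout`, which defaults to 300 seconds, and that cancellation surfaces inside the job as `asyncio.CancelledError`. A minimal sketch of raising the timeout on the worker settings class (`run_crawler` is a placeholder name for the job that starts the crawler):

```python
# arq worker settings sketch: raise job_timeout above arq's
# 300-second default so a long-running crawl isn't cancelled mid-run.

async def run_crawler(ctx):
    """Placeholder arq job that would start the Crawlee crawler."""
    ...


class WorkerSettings:
    functions = [run_crawler]
    job_timeout = 3600  # seconds; arq cancels the job's task after this
```

The worker is then started with something like `arq yourmodule.WorkerSettings`, so any crawl that may run longer than five minutes gets the larger budget.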