Mikmokki · 3mo ago

Crawler always stops after exactly 300 seconds

I use Crawlee Python in Docker, and the crawler always stops after exactly 300 seconds. I checked that `AutoscaledPool.run()` receives an `asyncio.CancelledError`, but I don't know what sends it. If I run a simple standalone Python example, `keep_alive` works, but in my dockerized system the crawler always emits its final statistics after 300 seconds and stops. I verified that it happens with multiple different crawler types.
Solution
Mikmokki · 3mo ago
Never mind, it was arq's `job_timeout`.
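For anyone hitting the same thing: arq's default `job_timeout` is 300 seconds, which matches the observed cutoff exactly. When the limit is hit, arq cancels the job's task, which surfaces as `asyncio.CancelledError` inside the crawler's `AutoscaledPool.run()`. A minimal sketch of the fix, assuming the crawler runs as an arq job (the function name and timeout value are illustrative):

```python
async def run_crawler(ctx):
    """arq job that starts the Crawlee crawler (crawler setup omitted)."""
    ...


class WorkerSettings:
    # arq reads these class attributes when the worker starts.
    functions = [run_crawler]
    # Default is 300 seconds -- the exact point where the crawler was
    # being cancelled. Raise it to cover the expected crawl duration.
    job_timeout = 3600
```

Alternatively, the timeout can be overridden per job when enqueuing, via the `_job_timeout` keyword of `enqueue_job`.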