StopIteration() after a few minutes of workers running jobs

About 20 minutes after starting either 4 workers with 4 processes per worker or 2 workers with 1 process per worker, Celery Flower shows 0 active jobs and the workers raise a StopIteration exception.

I’m using PyPy2 5.6 x64 on Debian 8. The relevant packages in my venv:

  • amqp (1.4.9)
  • billiard (3.3.0.23)
  • celery (3.1.25)
  • greenlet (0.4.10)
  • kombu (3.0.37)

How I run the Celery workers:

    #!/bin/bash
    
    celery worker -A project_name.celeryconf -Q default -n default@%h -Ofair --autoscale=4,4 --maxtasksperchild=8
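
For context, the app instance referenced by -A project_name.celeryconf is configured roughly like this (a simplified sketch; the broker URL and setting values here are placeholders, not my exact config):

    # project_name/celeryconf.py -- simplified sketch; values are placeholders
    from celery import Celery

    app = Celery('project_name')
    app.conf.update(
        BROKER_URL='amqp://guest:guest@rabbitmq:5672//',  # placeholder broker address
        CELERY_DEFAULT_QUEUE='default',                   # matches -Q default on the command line
    )
    app.autodiscover_tasks(['project_name'])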
    

The Celery workers run as microservices. I’m using Docker and Rancher to scale the workers; the environment has 10 nodes.

The full traceback of the StopIteration() exception:

    ERROR:kombu.async.hub:Error in timer: StopIteration()
    Traceback (most recent call last):
      File "/usr/local/site-packages/kombu/async/hub.py", line 140, in fire_timers
        entry()
      File "/usr/local/site-packages/kombu/async/timer.py", line 64, in __call__
        return self.fun(*self.args, **self.kwargs)
      File "/usr/local/site-packages/kombu/async/timer.py", line 132, in _reschedules
        return fun(*args, **kwargs)
      File "/usr/local/site-packages/billiard/pool.py", line 1308, in maintain_pool
        self._maintain_pool()
      File "/usr/local/site-packages/billiard/pool.py", line 1300, in _maintain_pool
        self._repopulate_pool(joined)
      File "/usr/local/site-packages/billiard/pool.py", line 1285, in _repopulate_pool
        self._create_worker_process(self._avail_index())
      File "/usr/local/site-packages/celery/concurrency/asynpool.py", line 415, in _create_worker_process
        return super(AsynPool, self)._create_worker_process
      File "/usr/local/site-packages/billiard/pool.py", line 1102, in _create_worker_process
        inq, outq, synq = self.get_process_queues()
      File "/usr/local/site-packages/celery/concurrency/asynpool.py", line 997, in get_process_queues
        return next(q for q, owner in items(self._queues)
    StopIteration
    

I can’t figure out what is causing this.
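
From the last frame, get_process_queues in celery/concurrency/asynpool.py takes the next entry from a generator over the pool's queues; when that generator yields nothing, next() raises a bare StopIteration, which matches the empty StopIteration() the timer logs. A minimal illustration of that failure shape (plain Python, not the actual Celery code):

    # Minimal illustration (not Celery code): next() over a generator that
    # yields no items raises a bare StopIteration, like the one logged above.
    queues = {}  # stand-in for asynpool's self._queues with no unowned entry

    try:
        q = next(q for q, owner in queues.items() if owner is None)
    except StopIteration as exc:
        print('raised: %r' % exc)  # -> raised: StopIteration()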
