celery multi celery.exceptions.TimeoutError: The operation timed out

iMath redstone-cold at 163.com
Fri Sep 18 04:07:29 EDT 2020


I followed the official [Celery guide](https://docs.celeryproject.org/en/stable/getting-started/next-steps.html#in-the-background) to learn. Everything works well when I start the worker with `celery -A proj worker -l info`, but if I run the worker in the background with `celery multi restart w1 -A proj -l info --logfile=./celery.log`, I get:


    >>> from proj.tasks import add
    >>> add.delay().get(timeout=19)
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
      File "/usr/local/lib/python3.8/dist-packages/celery/result.py", line 230, in get
        return self.backend.wait_for_pending(
      File "/usr/local/lib/python3.8/dist-packages/celery/backends/base.py", line 660, in wait_for_pending
        meta = self.wait_for(
      File "/usr/local/lib/python3.8/dist-packages/celery/backends/base.py", line 696, in wait_for
        raise TimeoutError('The operation timed out.')
    celery.exceptions.TimeoutError: The operation timed out.
    >>>
So what's wrong?

Test environment:

 - Python 3.8.2
 - Celery 4.4.7 (`celery multi v4.4.7`)
 - RabbitMQ 3.8.2 (`rabbitmqctl version`)

and project layout ([code files](https://github.com/celery/celery/files/5231502/proj.zip)):

    proj/__init__.py
        /celery.py
        /tasks.py

celery.py

    from __future__ import absolute_import, unicode_literals
    
    from celery import Celery
    
    app = Celery('proj',
                 broker='amqp://localhost',
                 # backend='db+sqlite:///results.sqlite',  # The backend argument; if you don't need results, it's better to disable them. Results can also be disabled per task with @task(ignore_result=True).
                 include=['proj.tasks'])  # The include argument lists modules to import when the worker starts, so the worker can find our tasks.
    app.conf.CELERY_RESULT_BACKEND = 'db+sqlite:///results.sqlite'
    
    if __name__ == '__main__':
        app.start()
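As an aside (not necessarily related to the timeout): `CELERY_RESULT_BACKEND` is the legacy uppercase setting name; Celery 4 prefers the lowercase form. A hedged equivalent of the line above, using the same SQLAlchemy backend URL, would be:

```python
# New-style (Celery 4) lowercase setting name; same backend URL as above.
app.conf.result_backend = 'db+sqlite:///results.sqlite'
```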

tasks.py

    from __future__ import absolute_import, unicode_literals
    from .celery import app
    import datetime
    @app.task
    def add():
        return datetime.datetime.now().strftime("%Y-%m-%d %H:%M:%S")
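For anyone reproducing this: Celery derives a task's default name from the module path plus the function name, which is why the worker log below refers to `proj.tasks.add`. A minimal sketch of that naming rule (not Celery's actual implementation, just an illustration):

```python
def default_task_name(module_name, func_name):
    # Default Celery task name: "<module path>.<function name>".
    return f"{module_name}.{func_name}"

print(default_task_name("proj.tasks", "add"))  # proj.tasks.add
```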


The log shows the following; I googled, but didn't find a working solution:
```
[2020-09-15 22:10:03,229: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2020-09-15 22:10:03,238: INFO/MainProcess] mingle: searching for neighbors
[2020-09-15 22:10:04,260: INFO/MainProcess] mingle: all alone
[2020-09-15 22:10:04,273: INFO/MainProcess] w1 at iZwz962a07bhoio77q4ewxZ ready.
[2020-09-15 22:10:10,625: ERROR/MainProcess] Received unregistered task of type 'proj.tasks.add'.
The message has been ignored and discarded.

Did you remember to import the module containing this task?
Or maybe you're using relative imports?

Please see
http://docs.celeryq.org/en/latest/internals/protocol.html
for more information.

The full contents of the message body was:
'[[], {}, {"callbacks": null, "errbacks": null, "chain": null, "chord": null}]' (77b)
Traceback (most recent call last):
  File "/usr/local/lib/python3.8/dist-packages/celery/worker/consumer/consumer.py", line 562, in on_task_received
    strategy = strategies[type_]
KeyError: 'proj.tasks.add'

```
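For what it's worth, the final `KeyError` matches what the traceback shows: the worker keeps a mapping from task names to handling strategies, and a name absent from that mapping raises `KeyError`. A toy illustration (the empty dict here is hypothetical, not the worker's real registry):

```python
# Hypothetical registry, as the worker would effectively have it if
# proj.tasks was never imported, so 'proj.tasks.add' is missing.
strategies = {}
try:
    strategy = strategies['proj.tasks.add']
except KeyError as exc:
    print(f"unregistered task: {exc}")  # unregistered task: 'proj.tasks.add'
```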

