Python Celery Flower no Monitor tab (no real-time graphics)

How can I see real-time task execution?
I'm a novice with Celery. I'm running it, but I don't see a Monitor tab:
http://127.0.0.1:5555/monitor returns "page not found".
Here is how I'm starting it.
Celery, in the script:
from celery import Celery

if __name__ == '__main__':
    REDIS_TASKS_BROKER = 'redis://localhost:6379/0'
    REDIS_TASKS_BACKEND = 'redis://localhost:6379/1'
    app = Celery(main=__name__, broker=REDIS_TASKS_BROKER, backend=REDIS_TASKS_BACKEND)
    worker = app.Worker()
    worker.start()
Flower, in the terminal (after running the script):
celery -A tasks --broker=redis://localhost:6379/0 flower --port=5555

See https://github.com/mher/flower/issues/1107. The Monitor tab was deprecated in Flower v1.0 and replaced with Prometheus stats. You can either downgrade to v0.9.7 (but beware that it's not compatible with Celery v5) or follow this guide to set up the Prometheus integration.
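As a quick sanity check (a minimal sketch, assuming Flower 1.x running at its default 127.0.0.1:5555), you can confirm that the Prometheus metrics endpoint which replaced the Monitor tab is responding:
import urllib.request

# Flower 1.x exposes Prometheus metrics at /metrics instead of the old Monitor tab
with urllib.request.urlopen('http://127.0.0.1:5555/metrics') as resp:
    print(resp.read().decode()[:300])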

Related

Purge Celery tasks on GCP K8s

I want to purge Celery tasks. Celery is running on GCP Kubernetes in my case. Is there a way to do it from the terminal, for example via kubectl?
The solution I found was to write to a file shared by both the API and Celery containers. In this file, whenever an interruption is captured, a flag is set to true. Inside the Celery containers I periodically check the contents of that file; if the flag is set to true, I gracefully clean things up and raise an error.
Does this solve your problem? See How can I properly kill a celery task in a kubernetes environment?
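A minimal sketch of that flag-file pattern (the path and flag name are hypothetical):
import json
import os

FLAG_FILE = '/shared/interrupt_flag.json'  # hypothetical path on a volume mounted into both containers

def interrupt_requested():
    # The API container writes {"interrupt": true} here when an interruption is captured
    if not os.path.exists(FLAG_FILE):
        return False
    with open(FLAG_FILE) as f:
        return json.load(f).get('interrupt', False)
A long-running task would call interrupt_requested() periodically and, once it returns true, clean up and raise an error.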
An alternative solution may be:
$ celery -A proj purge
or
from proj.celery import app
app.control.purge()
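Note that purge only removes messages still waiting in queues; it does not stop tasks a worker has already picked up. If the worker runs in a pod, the same command can presumably be run through kubectl, e.g. kubectl exec -it <worker-pod> -- celery -A proj purge. To purge one named queue instead of everything, here is a sketch using the underlying Kombu channel (assumes Celery 4+; the queue name 'foo' is a placeholder):
from proj.celery import app

# Purge a single queue via the transport channel; queue_purge returns the message count
with app.connection_for_write() as conn:
    purged = conn.default_channel.queue_purge('foo')
    print(f'removed {purged} messages')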

Set Celery worker logging level

In version 4.1.0 of Celery, there was a --loglevel flag which set the log level of the Celery worker.
This worked for things like celery -A myapp worker --loglevel INFO.
But, as of version 5.0.2, this flag has been removed from the documentation.
As of right now, if I Google "Celery worker set log level", I get links to the Celery source code and to this SO question, which assumes its existence.
So how do you set the log level of a Celery worker now?
Although this is no longer in the documentation, --loglevel is still a valid parameter to worker in Celery 5.0.2.
celery --app my_app worker --loglevel INFO
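If you start the worker from Python rather than from the CLI, the same flag can be passed programmatically (a minimal sketch; the app name and broker URL are assumptions, and I believe app.worker_main was only restored in Celery 5.0.3):
from celery import Celery

app = Celery('my_app', broker='redis://localhost:6379/0')  # broker URL is an assumption

if __name__ == '__main__':
    # Equivalent to: celery --app my_app worker --loglevel INFO
    app.worker_main(['worker', '--loglevel', 'INFO'])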

Celery Flower Broker Tab not populating with broker_api set for rabbitmq api

I'm trying to populate the Broker tab in Celery Flower, but when I pass a broker_api as in the following example:
python manage.py celery flower --broker_api=http://guest:guest@localhost:15672/api/
I get the following error:
state.py:108 (run) Failed to inspect the broker: 'list' object is not callable
I'm confident the credentials I'm using are correct and the RabbitMQ Management Plugin is enabled. I'm able to access the RabbitMQ monitoring page through the browser.
flower==0.6.0
RabbitMQ 3.2.1
Does anyone know how to fix this?
Try removing the slash after /api/:
python manage.py celery flower --broker_api=http://guest:guest@localhost:15672/api
I had the same issue on an Airflow setup with Celery 5.2.6 and Flower 1.0.0. The solution for me was to launch Flower using:
airflow celery flower --broker-api=http://guest:guest@rabbitmq:15672/api/
For non-Airflow readers, I believe the command should be:
celery flower --broker=amqp://guest:guest@rabbitmq:5672 --broker_api=http://guest:guest@rabbitmq:15672/api/
A few remarks:
The above assumes a shared Docker network. If that's not the case, every @rabbitmq should be replaced with e.g. @localhost.
--broker is not needed if running under Airflow's umbrella (it's passed from the Airflow config)
A good test to verify the API works is to access http://guest:guest@localhost:15672/api/index.html locally (the sketch below does the same check from Python).
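A rough Python equivalent of that check (guest/guest and localhost are taken from the question; urllib needs the credentials as a header rather than in the URL):
import base64
import urllib.request

# Hit the RabbitMQ management API the same way Flower's broker_api does
req = urllib.request.Request('http://localhost:15672/api/overview')
token = base64.b64encode(b'guest:guest').decode()
req.add_header('Authorization', 'Basic ' + token)
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # 200 means the management API is reachable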

cannot run celery subtasks if queues are specified

I am running Celery 3.0.19 with MongoDB as both backend and broker. I would like to use queues in subtasks, but it does not work. Here is how to reproduce the problem using the example add task.
Start a Celery worker with the command:
celery -A tasks worker --loglevel=info --queue=foo
Then create a task that never gets done, like this:
from tasks import add
sub_task = add.s(queue="foo")
sub_task_async_result = sub_task.apply_async((2,2))
Note that the following task will get executed normally:
async_result = add.apply_async((2,2), queue="foo")
What am I doing wrong?
Thanks!
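For what it's worth, a likely culprit (a sketch, not a confirmed answer): add.s(queue="foo") passes queue as a keyword argument to the task itself rather than as an execution option, so the message is routed to the default celery queue, which a worker started with --queue=foo never consumes. Execution options such as queue belong in .set() or in apply_async():
from tasks import add

# Attach queue as an execution option via .set(), not as a task kwarg via .s()
sub_task = add.s(2, 2).set(queue='foo')
sub_task_async_result = sub_task.apply_async()

# Equivalent one-shot form, which the question notes already works:
# add.apply_async((2, 2), queue='foo')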

django celery upgrade from 2.5 to 3.0.19 - tasks pending

I'm trying to upgrade our Celery installation from 2.5 to 3.0.19 (using django-celery), but I've hit a strange issue.
I updated the /etc/default files and used the new /etc/init.d/celeryd scripts, and I can see that the Celery workers (as well as celerybeat) are running fine.
However, the tasks I'm launching using delay seem to always stay in the PENDING state, and my Celery workers don't appear to receive any new tasks.
RabbitMQ is running, and I've updated the BROKER_URL and it looks correct. I'm not getting any errors otherwise.
Any ideas what to look for or how to debug this?
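One place to start debugging (a sketch; the import path is an assumption) is to ask the workers what they are actually consuming and which tasks they know about, since a task that stays PENDING forever was typically never received by any worker:
from myproj.celery import app  # hypothetical import path

insp = app.control.inspect()
print(insp.active_queues())  # which queues each worker consumes from
print(insp.registered())     # which tasks each worker has registered
print(insp.active())         # tasks currently executing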