Is it possible to configure Celery to send email alerts when tasks are failing?
For example I want Celery to notify me when more than 3 tasks fail or more than 10 tasks are being retried.
Is this possible using Celery or a utility (e.g. Flower), or do I have to write my own plugin?
Yes. All you need to do is set CELERY_SEND_TASK_ERROR_EMAILS = True, and whenever a task fails, Django will send a message with the traceback to all the addresses listed in the ADMINS setting.
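A minimal sketch of the relevant settings, assuming the pre-4.0 setting names (error emails were removed in Celery 4.0, so on newer versions you need another mechanism); the addresses are placeholders:

    # settings.py
    CELERY_SEND_TASK_ERROR_EMAILS = True

    # recipients of the error emails (placeholder addresses)
    ADMINS = (
        ('Ops Team', 'ops@example.com'),
    )

    # sender address used for the error emails
    SERVER_EMAIL = 'celery@example.com'

    # Django's normal email settings must also be configured
    EMAIL_HOST = 'localhost'
    EMAIL_PORT = 25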
As far as I know, it's not possible out of the box.
You could write a custom client on top of Celery or Flower, or access RabbitMQ directly.
What I would do (and am doing) is simply log failed tasks and then use something like Graylog2 to monitor the log files; this works for all of your infrastructure, not just Celery.
You can also use something like New Relic, which monitors your processes directly and offers many other features, although its email reporting on exceptions is somewhat limited.
A simple client/monitor is probably the quickest solution.
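For reference, a minimal sketch of such a monitor, built on Celery's real-time event receiver (workers must be started with task events enabled, e.g. celery worker -E); the broker URL, thresholds, and alert function are placeholders:

    from celery import Celery

    app = Celery(broker='amqp://guest@localhost//')  # adjust to your broker

    FAIL_THRESHOLD = 3    # alert when more than 3 tasks have failed
    RETRY_THRESHOLD = 10  # alert when more than 10 tasks are being retried

    counts = {'failed': 0, 'retried': 0}

    def alert(message):
        # placeholder: send an email, post to chat, etc.
        print('ALERT:', message)

    def on_task_failed(event):
        counts['failed'] += 1
        if counts['failed'] > FAIL_THRESHOLD:
            alert('%(failed)d tasks have failed' % counts)

    def on_task_retried(event):
        counts['retried'] += 1
        if counts['retried'] > RETRY_THRESHOLD:
            alert('%(retried)d tasks are being retried' % counts)

    if __name__ == '__main__':
        with app.connection() as connection:
            recv = app.events.Receiver(connection, handlers={
                'task-failed': on_task_failed,
                'task-retried': on_task_retried,
            })
            recv.capture(limit=None, timeout=None, wakeup=True)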
Related
I have django-celery up and running.
Do I need something else in order to activate https://docs.celeryproject.org/en/master/userguide/configuration.html#std:setting-result_expires, or does it work already?
I don't have celery beat installed.
No. You do not need Celery beat for that. Result expiry is handled internally by Celery and/or the backend you use in your project.
However, keep in mind this:
Note
For the moment this only works with the AMQP, database, cache, Couchbase, and Redis backends.
When using the database backend, celery beat must be running for the results to be expired.
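If you want a different expiry than the default (one day), a sketch of the setting, assuming the new-style lowercase setting names; the URLs are placeholders:

    from datetime import timedelta
    from celery import Celery

    app = Celery('proj', broker='amqp://guest@localhost//',
                 backend='redis://localhost/0')

    # results are deleted after one hour instead of the default one day
    app.conf.result_expires = timedelta(hours=1)  # or plain seconds: 3600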
We know we have some badly formed tasks in the Celery queue that are crashing our workers; is there an easy way to manually delete them?
We don't want to flush the whole thing as there might be some important emails to be sent...
Use Celery Flower to manage and monitor Celery tasks. Refer to https://flower.readthedocs.io/en/latest/
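If you can identify the offending task ids (Flower's task view shows them), one possibility is to revoke those ids: the messages stay in the queue, but workers discard them on receipt instead of executing them. A sketch, with a placeholder broker URL and id:

    from celery import Celery

    app = Celery(broker='amqp://guest@localhost//')  # adjust to your broker

    # ids of the badly formed tasks, e.g. copied from Flower
    bad_ids = ['d9078da5-9915-40a0-bfa1-392c7bde42ed']

    # workers remember revoked ids and skip those messages when received
    for task_id in bad_ids:
        app.control.revoke(task_id)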
Is there some Flower analog which works with SQS + Celery somehow? I found out that in the current Celery version SQS doesn't support Celery events. Maybe there is some other way to get information about how many tasks succeeded, how many crashed, etc.
My project use Django+Celery+SQS
CloudWatch can give you some monitoring information:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/sqs-metricscollected.html
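A sketch of pulling one of those SQS metrics with boto3; the region and queue name are placeholders:

    from datetime import datetime, timedelta
    import boto3

    cloudwatch = boto3.client('cloudwatch', region_name='us-east-1')

    resp = cloudwatch.get_metric_statistics(
        Namespace='AWS/SQS',
        MetricName='ApproximateNumberOfMessagesVisible',
        Dimensions=[{'Name': 'QueueName', 'Value': 'celery'}],
        StartTime=datetime.utcnow() - timedelta(hours=1),
        EndTime=datetime.utcnow(),
        Period=300,
        Statistics=['Average'],
    )

    for point in sorted(resp['Datapoints'], key=lambda p: p['Timestamp']):
        print(point['Timestamp'], point['Average'])

Note that CloudWatch only sees the queue itself; since SQS doesn't carry Celery events, success/failure counts have to be collected in the workers themselves, for example with the task_success and task_failure signals.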
I am continuing a Django project of someone who was using Celery along with Mandrill. There are daily reports which are sent to customers, and for some reason not a single mail was sent for three days; they accumulated and were all sent together after three days. Since I am new to Celery, I want to know how to debug Celery delays and errors. What are the popular commands and the execution path to follow?
Short tips:
Run the workers with a debug log level (e.g. celery worker -l debug); the logs will show you the registration and execution time of every task.
Install Flower, a popular tool for monitoring tasks.
Use Sentry for handy error tracking and aggregation.
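To see where tasks pile up, Celery's inspect API (also available as celery inspect on the command line) is handy; a sketch, assuming your app module is named proj:

    from celery import Celery

    app = Celery('proj', broker='amqp://guest@localhost//')  # placeholder URL

    insp = app.control.inspect()
    print(insp.active())     # tasks currently executing on each worker
    print(insp.reserved())   # tasks prefetched but not yet started
    print(insp.scheduled())  # tasks waiting on an ETA/countdown
    print(insp.stats())      # per-worker statistics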
Happy debugging ;)
I know that it's currently possible to use Django Celery to schedule tasks using Django's built-in ORM, but is there a way to use MongoDB in this regard?
(I'm not asking about brokers or result backends, as I know that Celery supports those, I'm specifically asking about scheduling.)
I think what you're looking for is celerybeat-mongo
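It stores the schedule in MongoDB, and as far as I can tell from its README it is enabled by pointing beat at its scheduler class, something like:

    celery beat -A proj --scheduler celerybeatmongo.schedulers.MongoScheduler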
Yes, it's possible. I just use the -B argument for my workers, which runs the beat scheduler embedded in the worker process.
I'm not sure whether the Celery scheduler puts these tasks into the queue (MongoDB in this case), because it is already running in the background. But you can always trigger (delay) a task from inside a scheduled task.
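A sketch of that pattern, assuming new-style (Celery 4+) setting names; the broker URL, schedule, and the get_customer_ids helper are all placeholders:

    from celery import Celery
    from celery.schedules import crontab

    app = Celery('proj', broker='mongodb://localhost:27017/celery')

    app.conf.beat_schedule = {
        'dispatch-nightly-reports': {
            'task': 'proj.dispatch_reports',
            'schedule': crontab(hour=2, minute=0),  # every night at 02:00
        },
    }

    @app.task(name='proj.dispatch_reports')
    def dispatch_reports():
        # the scheduled task itself only fans out the real work
        for customer_id in get_customer_ids():  # hypothetical helper
            send_report.delay(customer_id)

    @app.task
    def send_report(customer_id):
        ...  # build and send one report

Running the worker as celery -A proj worker -B starts the embedded beat scheduler alongside it, as described above.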