How to see the execution output of pt-online-schema-change run from a Celery task

I use Celery tasks to run pt-online-schema-change, pt-archiver, and other tools.
These programs print many progress messages while running. How can I see those messages from the Celery task?
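One approach (a sketch, not from the original question; the task name run_pt_tool, the broker URL, and the invocation below are illustrative assumptions) is to launch the tool with subprocess and stream each stdout line into the worker log and the task's state:

    import subprocess

    from celery import Celery
    from celery.utils.log import get_task_logger

    app = Celery("tasks", broker="redis://localhost:6379/0",
                 backend="redis://localhost:6379/1")
    logger = get_task_logger(__name__)

    @app.task(bind=True)
    def run_pt_tool(self, cmd):
        """Run an external tool (e.g. pt-online-schema-change) and stream its output."""
        proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                                stderr=subprocess.STDOUT, text=True)
        for line in proc.stdout:
            line = line.rstrip()
            logger.info(line)                     # appears in the worker's log
            self.update_state(state="PROGRESS",   # readable via AsyncResult.info
                              meta={"last_line": line})
        return proc.wait()

You can then follow the output in the worker log, or poll AsyncResult(task_id).state / .info from another process, e.g. after run_pt_tool.delay(["pt-archiver", "--help"]).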

Related

What does celery daemonization mean?

Can someone explain what celery daemonization means? Also, I would like to start both the celery worker and celery beat with a single command. Is there any way to do it? (One way I can think of is using supervisor for the worker and beat, writing the start scripts for them in a separate .sh file, and running that script... any other way?)
In other words, I could even start the worker and beat processes as background processes manually, right? So does daemonization in celery just mean running the processes as background processes, or is there anything else to it?
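Daemonization just means running the worker (and beat) as managed background services, with an init system, systemd, or supervisord handling start, restart, pidfiles, and log files, instead of processes tied to your shell session. For the single-command part, the worker's -B option embeds the beat scheduler inside the worker process. A minimal sketch using the programmatic entry point (module name and broker URL are assumptions):

    from celery import Celery

    app = Celery("proj", broker="redis://localhost:6379/0")

    if __name__ == "__main__":
        # -B embeds the beat scheduler in the worker process, so one
        # command starts both. Avoid -B when running several workers,
        # since each embedded beat would schedule the same tasks.
        app.worker_main(["worker", "-B", "--loglevel=info"])

The command-line equivalent is celery -A proj worker -B.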

Celery Beat runs duplicate tasks

I have one celery beat task that runs other scraping tasks.
When those tasks are not processed, the queue starts to grow.
I know Celery uses a backend database, but it only stores: id, task_id, status, result, date_done, traceback.
My idea is to switch from celery beat to tasks that reschedule themselves, but some tasks are independent or could get lost, so celery beat is still useful in those cases.
My second idea is to add my own log, e.g. a table where I save the task id and task context, so I can find out whether a task already exists.
Maybe you have a better approach? Thanks
Celery tasks can be given an expires argument, so that messages which sit in the queue too long are discarded instead of executed:
http://docs.celeryproject.org/en/latest/userguide/calling.html#expiration
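A minimal sketch of that suggestion (the task name scrape_page and the 300-second value are illustrative): a message that waits in the queue longer than expires is revoked instead of executed, so an unprocessed backlog cannot turn into duplicate work:

    from datetime import datetime, timedelta

    # Discard the message if no worker picks it up within 5 minutes.
    scrape_page.apply_async(args=[url], expires=300)

    # expires also accepts an absolute datetime.
    scrape_page.apply_async(args=[url],
                            expires=datetime.utcnow() + timedelta(minutes=5))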

celery beat running task on cleanup, how to stop it

I have a bunch of celery beat tasks running at different times of the day, but there is one particular task, at 8:00AM, that sends birthday messages, and it also gets executed when beat's cleanup happens at 4:00AM, so my task runs twice a day. I noticed this happens when I restart celery beat the previous day. How do I get around this and tell celery not to execute it at 4:00AM?
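One thing worth checking, as an assumption since the schedule definition isn't shown: interval schedules (a number of seconds or a timedelta) are computed relative to beat's stored state, so a restart or a cleared schedule file can shift or re-trigger runs, while a crontab entry only fires at the given wall-clock time. A sketch (task path assumed):

    from celery.schedules import crontab

    app.conf.beat_schedule = {
        "send-birthday-messages": {
            "task": "tasks.send_birthday_messages",  # assumed task path
            "schedule": crontab(hour=8, minute=0),   # fires at 08:00 only
        },
    }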

Celery beat fails silently

I'm having issues with a celery beat worker not sending out tasks to Celery. Celery runs on three servers with a RabbitMQ cluster behind HAProxy as the backend.
Celery beat is used to schedule a task every day at 9AM. When I start the worker, the first task usually succeeds, but after that the following tasks never seem to be sent to RabbitMQ. In the celery beat log file (celery beat is run with the -l debug option), I see messages such as: Scheduler: Sending due task my-task (tasks.myTask), but no sign of the task being received by any Celery worker.
I also tried tracing messages in RabbitMQ via the rabbitmq_tracing plugin, which only confirmed that the task never reached RabbitMQ.
Any idea what could be happening? Thanks!
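Nothing in the question pins down the cause, but with a broker behind HAProxy one frequent culprit is the proxy silently cutting idle AMQP connections, after which beat's publishes go nowhere. A sketch of settings worth trying (Celery 4+ lowercase names; values are illustrative):

    app.conf.update(
        broker_heartbeat=30,       # AMQP heartbeats stop HAProxy from idling out the connection
        task_publish_retry=True,   # retry sending a task if the connection was dropped
        task_publish_retry_policy={
            "max_retries": 3,
            "interval_start": 1,   # seconds before the first retry
            "interval_step": 2,
            "interval_max": 10,
        },
    )

Raising HAProxy's timeout client / timeout server for the AMQP frontend would be the complementary fix on the proxy side.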

Number of celery tasks executed at a given point of time

I am creating a bunch of Celery tasks asynchronously on the fly. Say I start 1000 tasks asynchronously and have only one celeryd process running to execute them. How many threads will Celery create to handle these tasks?
If Celery automatically starts multiple threads to process the task queue, how do I limit it to executing only 100 at a given point in time?
Thanks.
It starts as many as you specify with the worker's concurrency setting (the --concurrency option, typically passed via CELERYD_OPTS).
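Concretely, a sketch using the newer lowercase setting name (note that the default prefork pool uses processes rather than threads, unless you select a different pool):

    app.conf.worker_concurrency = 100   # at most 100 pool processes at once

    # Equivalent on the command line:
    #   celery -A proj worker --concurrency=100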