I can schedule an hourly task in my Django app using celery beat in settings.py like so:
from datetime import timedelta

CELERYBEAT_SCHEDULE = {
    'tasks.my_task': {
        'task': 'tasks.my_task',
        'schedule': timedelta(seconds=60 * 60),  # run every hour
        'args': (),
    },
}
But is there a way to schedule a task so that it is queued and executed immediately, and then follows the configured schedule from then on? In other words, something like running a selected task once as soon as Celery launches. What configuration do I need for that?
Add the following to tasks.py:
# Look up the task by its function name in the module namespace
obj = locals()['task_function_name']
# Execute the task body once, synchronously, as the module is imported
obj.run()
This ensures the specified task is run once when Celery starts, since tasks.py is imported at startup. Thereafter, it executes according to the schedule.
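A minimal, self-contained sketch of that approach, assuming a hypothetical task called my_task defined in tasks.py:

from celery import shared_task

@shared_task
def my_task():
    # the periodic work goes here
    pass

# Look the task up by name and run it once at import time, i.e. when the
# worker process loads tasks.py; after that the beat schedule takes over.
obj = locals()['my_task']
obj.run()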
I am using python + flask + SQS and I'm also using celery beat to execute some scheduled tasks.
Recently I went from having one single default "celery" queue to execute all my tasks to having dedicated queues/workers for each task. This includes tasks scheduled by celery beat which now all go to a queue named "scheduler".
Before dropping the "celery" queue, I monitored it to see if any tasks would wind up in that queue. To my surprise, they did.
Since I had no worker consuming from that queue, I could easily inspect the messages that piled up using the AWS console. What I saw was that all of the tasks were celery.backend_cleanup!
I cannot find anything in the celery docs about how to prevent celery.backend_cleanup from being tossed into the default "celery" queue, which I want to get rid of. The docs on beat do not show an option for passing a queue name either. So how do I do this?
This is how I am starting celery beat:
/venv/bin/celery -A backend.app.celery beat -l info --pidfile=
And this is how I am starting the worker
/venv/bin/celery -A backend.app.celery worker -l info -c 2 -Ofair -Q scheduler
Keep in mind, I don't want to stop backend_cleanup from executing, I just want it to go in whatever queue I specify.
Thanks in advance for the assistance!
You can override this in the beat schedule entry for the task. You could also change the scheduled run time here if you wanted to.
app.conf.beat_schedule = {
    'backend_cleanup': {
        'task': 'celery.backend_cleanup',
        'options': {
            'queue': '<name>',
            'exchange': '<name>',
            'routing_key': '<name>',
        },
    },
}
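If it is easier to manage in one place, routing by task name should also work via Celery's task_routes setting; a sketch assuming the "scheduler" queue from the question:

# Route the built-in cleanup task to the dedicated queue ('scheduler' is the
# queue name used in the question; substitute your own).
app.conf.task_routes = {
    'celery.backend_cleanup': {'queue': 'scheduler'},
}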
I have a periodic task that uses a crontab to run every day at 1:01 AM using
run_every = crontab(hour=1, minute=1)
Once I get my server up and running, is that enough to trigger the task to run once a day? Or do I also need to use a database scheduler?
Yes, that should be enough. Celery beat keeps its own state file (celerybeat-schedule by default), so it can run your schedule without the database scheduler.
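For reference, the equivalent entry in a plain beat_schedule looks something like this (a sketch; the task path is hypothetical):

from celery.schedules import crontab

# assuming `app` is your Celery application instance
app.conf.beat_schedule = {
    'daily-job': {
        'task': 'myapp.tasks.my_daily_task',    # hypothetical task name
        'schedule': crontab(hour=1, minute=1),  # every day at 01:01
    },
}

Then keep celery beat running alongside the worker, e.g. celery -A myapp beat -l info.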
I want to dynamically add tasks that run at a particular time (clocked tasks). I am using django-celery-beat. The problem I am facing is that celery only executes one task and ignores the rest.
I have tried the following code, and found in the library source that django-celery-beat disables the schedule once it has executed a clocked task. This might be the reason the other/next task is not running.
What am I doing wrong? And
what is an alternative way to schedule multiple tasks to run at the same time?
import json

from django_celery_beat.models import ClockedSchedule, PeriodicTask

# Both tasks share the same ClockedSchedule (same clocked_time)
clocked, _ = ClockedSchedule.objects.get_or_create(
    clocked_time=next_run_time
)
PeriodicTask.objects.create(
    clocked=clocked,
    name=guid1,
    one_off=True,
    task="schedulerapp.jobscheduler.runEvent",
    args=json.dumps([guid1]),
)
PeriodicTask.objects.create(
    clocked=clocked,
    name=guid2,
    one_off=True,
    task="schedulerapp.jobscheduler.runEvent",
    args=json.dumps([guid2]),
)
This should work.
import json

from django_celery_beat.models import PeriodicTask, IntervalSchedule

schedule = IntervalSchedule.objects.create(every=10, period=IntervalSchedule.SECONDS)
task = PeriodicTask.objects.create(
    interval=schedule,
    name=guid1,
    task='schedulerapp.jobscheduler.runEvent',
    args=json.dumps([guid1]),
)
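If the second event from the question is needed as well, the same pattern can be repeated with a second, uniquely named entry that shares the schedule (a sketch using the question's guid2):

task2 = PeriodicTask.objects.create(
    interval=schedule,
    name=guid2,  # PeriodicTask names must be unique
    task='schedulerapp.jobscheduler.runEvent',
    args=json.dumps([guid2]),
)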
We're using Celery 4.2.1 and Redis with global soft and hard timeouts set for our tasks. All of our custom tasks are designed to stay under the limits, but every day the built-in backend_cleanup task ends up forcibly killed by the timeouts.
I'd rather not have to raise our global timeout just to accommodate builtin Celery tasks. Is there a way to set the timeout of these builtin tasks directly?
I've had trouble finding any documentation on this or even anyone hitting the same problem.
Relevant source from celery/app/builtins.py:
@connect_on_app_finalize
def add_backend_cleanup_task(app):
    """Task used to clean up expired results.

    If the configured backend requires periodic cleanup this task is also
    automatically configured to run every day at 4am (requires
    :program:`celery beat` to be running).
    """
    @app.task(name='celery.backend_cleanup', shared=False, lazy=False)
    def backend_cleanup():
        app.backend.cleanup()
    return backend_cleanup
You can set the backend cleanup schedule directly in celery.py:
app.conf.beat_schedule = {
    'backend_cleanup': {
        'task': 'celery.backend_cleanup',
        'schedule': 600,  # 10 minutes
    },
}
And then run the celery beat process:
celery -A YOUR_APP_NAME beat -l info --detach
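Since the original concern is the time limits rather than the schedule, note that anything under an entry's 'options' key is passed through to apply_async, which accepts time_limit and soft_time_limit. A hedged sketch (the limit values are placeholders):

app.conf.beat_schedule = {
    'backend_cleanup': {
        'task': 'celery.backend_cleanup',
        'schedule': 600,  # 10 minutes
        'options': {
            'soft_time_limit': 300,  # seconds; placeholder value
            'time_limit': 360,       # seconds; placeholder value
        },
    },
}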
I am running celery 3.0.19 with MongoDB as both backend and broker. I would like to use queues in sub-tasks, but it does not work. Here is how to reproduce the problem using the example add task.
Start a celery worker with the command
celery -A tasks worker --loglevel=info --queue=foo
Then create a task that never gets executed, like this:
from tasks import add

sub_task = add.s(queue="foo")
sub_task_async_result = sub_task.apply_async((2, 2))  # this one never gets executed
Note that the following call gets executed normally:
async_result = add.apply_async((2, 2), queue="foo")
What am I doing wrong?
Thanks!