celery ImportError: No module named 'tasks' - celery

I am trying to learn how to use Celery so I can later integrate it into my Flask app. I am just trying to execute the basic example found in the Celery docs. I have created a file called task.py, and from the folder where task.py lives I am running celery -A tasks worker --loglevel=info, but it gives an error. I can't figure out what is wrong.
from celery import Celery

app = Celery('tasks', broker='amqp://localhost')

@app.task
def add(x, y):
    return x + y

The error I am seeing:
celery -A tasks worker --loglevel=info
ImportError: No module named 'tasks'

Try executing the command from the application folder level. If your tasks.py is at flask_app/configs/tasks.py, then run the following command from inside the flask_app folder:
celery worker --app=configs.tasks:app --loglevel=info
If you want to daemonize Celery, use the following command:
celery multi start worker --app=configs.tasks:app --loglevel=info
(multi start daemonizes Celery.)
Also, be sure to activate the virtualenv before running the command if the application runs inside one.
I am successfully running Celery in Django with django-celery, and had faced the same issue.


Check celery config with command line

I have celery running in a docker container and I want to check that the option CELERY_TASK_RESULT_EXPIRES = '3600' has been applied.
I tried using celery inspect conf and celery inspect stats but the commands never end. Other than that celery is running fine and doing its work.
You can get that from celery inspect. Try this
celery -A app inspect report --timeout 10
I found Flower. It is installed and started with:
pip install flower
flower -A celery-app-name --port=5555
Celery can then be queried via a REST API. The following returns the workers' config:
curl -w "\n" http://localhost:5555/api/workers
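If you would rather script the check, something like the helper below could work against Flower's /api/workers response. Note the helper name and the exact payload shape (a 'conf' dict per worker containing CELERY_TASK_RESULT_EXPIRES) are assumptions, so verify them against your Flower version:

```python
import json

def result_expires(workers_json: str) -> dict:
    """Pull CELERY_TASK_RESULT_EXPIRES from a Flower /api/workers
    payload. Assumed shape: {worker_name: {"conf": {...}, ...}, ...}.
    Returns {worker_name: value-or-None}.
    """
    workers = json.loads(workers_json)
    return {name: info.get('conf', {}).get('CELERY_TASK_RESULT_EXPIRES')
            for name, info in workers.items()}

# A canned response, just for illustration:
sample = '{"celery@host": {"conf": {"CELERY_TASK_RESULT_EXPIRES": "3600"}}}'
print(result_expires(sample))  # {'celery@host': '3600'}
```

In practice you would feed it the body of `curl http://localhost:5555/api/workers` instead of the canned string.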

Error: Unable to load celery application. The module main was not found. Supervisor + celery

I can't get Supervisor to start Celery, because Celery does not see my main module.
/etc/supervisor/conf.d/celery.conf
[program:celery]
command=/home/ubuntu/django/.env/bin/celery -A main worker --app=main --loglevel=info
user=root
stdout_logfile=/home/ubuntu/django/deployment/logs/celery.log
stderr_logfile=/home/ubuntu/django/deployment/logs/celery_main.log
autostart=true
autorestart=true
startsecs=10
stopwaitsecs = 600
django/
  .env/
  main/
    settings.py
    celery.py
    ...
  orders/
  shop/
If I run this command in the virtual environment from my project directory, everything works fine. But when Supervisor launches it, it fails. Why? In my logs Celery says: Error: Unable to load celery application. The module main was not found.
What I don't see in your configuration file is the working directory; that could explain why the celery command cannot find the module, even though it works when you run it manually.
Try adding:
directory=/home/ubuntu/django
to your configuration file and see if this will fix the error.
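Putting it together, the program block might look like this (paths copied from the question; directory= is the one addition):

```ini
[program:celery]
directory=/home/ubuntu/django
command=/home/ubuntu/django/.env/bin/celery -A main worker --loglevel=info
user=root
stdout_logfile=/home/ubuntu/django/deployment/logs/celery.log
stderr_logfile=/home/ubuntu/django/deployment/logs/celery_main.log
autostart=true
autorestart=true
startsecs=10
stopwaitsecs=600
```

After editing, reread and restart with supervisorctl so the new working directory takes effect.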

Limit number of processes in Celery with supervisor

I'm running Celery in a small instance in AWS Elastic Beanstalk.
However, when I do top, I see there are 3 celery processes running. I want to have only one.
I'm running this using supervisor and in my config file I have (only showing relevant lines):
[program:celeryd]
directory=/opt/python/current/app/src
command=/opt/python/run/venv/bin/celery worker -A ...
user=celery
numprocs=1
killasgroup=true
I've also followed the suggestion in this answer and created a file /etc/default/celeryd with this content:
# Extra arguments to celeryd
CELERYD_OPTS="--concurrency=1"
After restarting Celery (with supervisorctl -c config-file-path.conf restart celeryd), I see the 3 processes again. Any ideas? Thanks!
You are starting the worker with the celery command, so changing /etc/default/celeryd won't have any effect on it. Moreover, celeryd is deprecated.
When a worker is started, Celery launches a parent process and n (concurrency) child processes.
You can start the worker with
[program:celery]
command=/opt/python/run/venv/bin/celery worker -c 1 -A foo
This will start a worker with a concurrency of 1, so there will be two processes.
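That parent-plus-children layout mirrors Python's own prefork model; a rough stdlib analogy of why -c 1 still means two OS processes:

```python
import multiprocessing as mp

if __name__ == '__main__':
    # A pool of 1: this parent process plus one child -- two OS
    # processes in total, mirroring `celery worker -c 1`.
    pool = mp.Pool(processes=1)
    print(pool.map(abs, [-1, 2]))          # work runs in the child
    print(len(mp.active_children()) + 1)   # 2: one child + the parent
    pool.close()
    pool.join()
```

So the absolute minimum with concurrency 1 is two processes; seeing three usually means a second worker instance (or a beat process) is also running.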

Supervisord can't stop celery, how to do the same using Monit

I can't stop my celery worker using Supervisord, in the config file, it looks like this:
command=/usr/local/myapp/src/manage.py celery worker --concurrency=1 --loglevel=INFO
and when I try to stop it using the following command:
sudo service supervisord stop
It reports that the worker has stopped, while it actually has not.
One more problem: when you restart a program outside supervisord's scope, supervisord totally loses control over that program, because of the parent-child relationship between supervisord and its child processes.
My question is: how to run celery workers using Monit?
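No answer was posted for the Monit part. As a starting point only, a Monit check might look roughly like this, assuming the worker writes a pidfile; all paths here are hypothetical, and the --detach/--pidfile flags should be verified against your Celery version:

```
check process celery_worker with pidfile /var/run/celery/worker.pid
  start program = "/usr/local/myapp/src/manage.py celery worker --concurrency=1 --loglevel=INFO --detach --pidfile=/var/run/celery/worker.pid"
  stop program = "/bin/sh -c 'kill -TERM $(cat /var/run/celery/worker.pid)'"
```

Monit tracks the process by its pidfile rather than by parentage, which sidesteps the supervisord parent-child issue described above.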

Working example of celery with mongo DB

I'm new to celery, and am working on running asynchronous tasks using Celery.
I want to save the results of my tasks to MongoDB.
I want to use the AMQP broker.
Celery project examples didn't help me much. Can anyone point me to some working examples?
To use MongoDB as your backend store you have to explicitly configure Celery to use MongoDB as the backend.
http://docs.celeryproject.org/en/latest/getting-started/brokers/mongodb.html#broker-mongodb
As you said, the documentation does not show a complete working example. I just started playing with Celery but have been using MongoDB, so I created a short working tutorial using MongoDB and Celery: http://skillachie.com/?p=953
However these snippets should contain all you need to get a hello world going with Celery and MongoDB
celeryconfig.py
from celery.schedules import crontab

CELERY_RESULT_BACKEND = "mongodb"
CELERY_MONGODB_BACKEND_SETTINGS = {
    "host": "127.0.0.1",
    "port": 27017,
    "database": "jobs",
    "taskmeta_collection": "stock_taskmeta_collection",
}

# Used to schedule tasks periodically, passing optional arguments.
# Can be very useful: Celery does not seem to support one-off
# scheduled tasks, only periodic ones.
CELERYBEAT_SCHEDULE = {
    'every-minute': {
        'task': 'tasks.add',
        'schedule': crontab(minute='*/1'),
        'args': (1, 2),
    },
}
tasks.py
from celery import Celery
import time

# Specify the MongoDB host and database to connect to
BROKER_URL = 'mongodb://localhost:27017/jobs'
celery = Celery('EOD_TASKS', broker=BROKER_URL)

# Load settings for the backend that stores job results
celery.config_from_object('celeryconfig')

@celery.task
def add(x, y):
    time.sleep(30)
    return x + y
I have been testing RabbitMQ as a broker and MongoDB as backend, and MongoDB as both broker and backend. These are my findings. I hope they help someone out there.
Assumption: you have MongoDB running with default settings (localhost:27017).
Setting Environment using conda(you can use whatever package manager)
conda update -n base conda -c anaconda
conda create -n apps python=3.6 pymongo
conda install -n apps -c conda-forge celery
conda activate apps
This updates conda, creates an environment called apps, and installs pymongo and celery into it.
RabbitMQ as a broker and MongoDB as backend
sudo apt install rabbitmq-server
sudo service rabbitmq-server restart
sudo rabbitmqctl status
If there are no errors, RabbitMQ ought to be running. Let's create tasks in executor.py and call them in runner.py.
# executor.py
import time
from celery import Celery

BROKER_URL = 'amqp://localhost//'
BACKEND_URL = 'mongodb://localhost:27017/from_celery'
app = Celery('executor', broker=BROKER_URL, backend=BACKEND_URL)

@app.task
def pizza_bot(string: str, snooze=10):
    '''Return a dictionary with 'bot' and
    the lower-cased string input.
    '''
    print(f'Pretending to be working {snooze} seconds')
    time.sleep(snooze)
    return {'bot': string.lower()}
and we call them in runner.py
# runner.py
import time
from datetime import datetime

from executor import pizza_bot

def run_pizza(msg: str, use_celery: bool = True):
    start_time = datetime.now()
    if use_celery:  # Using celery
        response = pizza_bot.delay(msg)
    else:  # Not using celery
        response = pizza_bot(msg)
    print(f'It took {datetime.now() - start_time}'
          ' to run')
    print(f'response: {response}')
    return response

if __name__ == '__main__':
    # Call using celery
    response = run_pizza('This finishes extra fast')
    while not response.ready():
        print(f'[Waiting] It is {response.ready()} that we have results')
        time.sleep(2)  # sleep two seconds

    print('\nWe got results:')
    print(response.result)
Run celery on terminal A:
cd path_to_our_python_files
celery -A executor.app worker --loglevel=info
This is done in development only, so I can see what is happening in the background. In production, run the worker daemonized.
Run runner.py on terminal B:
cd path_to_our_python_files
conda activate apps
python runner.py
In terminal A, you will see that the task is received and in snooze seconds it will be completed. On your MongoDB, you will see a new collection called from_celery, with the message and results.
MongoDB as both broker and backend
A simple modification was needed to set this up. As mentioned, I had to create a config file for the MongoDB backend settings.
# mongo_config.py
# Backend settings
CELERY_RESULT_BACKEND = "mongodb"
CELERY_MONGODB_BACKEND_SETTINGS = {
    "host": "localhost",
    "port": 27017,
    "database": "celery",
    "taskmeta_collection": "pizza_collection",
}
Let's create executor_updated.py, which is pretty much the same as executor.py, except that the broker is now MongoDB and the backend is added via config_from_object:
# executor_updated.py
import time
from celery import Celery

BROKER_URL = 'mongodb://localhost:27017/celery'
app = Celery('executor_updated', broker=BROKER_URL)

# Load backend settings
app.config_from_object('mongo_config')

@app.task
def pizza_bot(string: str, snooze=10):
    '''Return a dictionary with 'bot' and
    the lower-cased string input.
    '''
    print(f'Pretending to be working {snooze} seconds')
    time.sleep(snooze)
    return {'bot': string.lower()}
Run celery on terminal C:
cd path_to_our_python_files
celery -A executor_updated.app worker --loglevel=info
Run runner.py on terminal D:
cd path_to_our_python_files
conda activate apps
python runner.py
Now we have MongoDB as both broker and backend. In MongoDB, you will see a database called celery with a collection called pizza_collection.
Hope this helps in getting you started with these awesome tools.