Hello Linux: Celery & Supervisor
By Justin

This is the eighth post of a multi-part series.
Celery is a Python package that integrates with Redis (or RabbitMQ) to offload various functions from your core Python app. This is very useful for things like generating reports, processing data, and even scheduling periodic tasks.
Supervisor is a process manager (as you likely know) that will ensure Celery keeps running alongside your app.
Requirements
Be sure to complete
- Git Push Local Code to Live Linux Server
- Virtual Environment in Working Directory
- PostgreSQL on Live Linux Server
- Setup Gunicorn & Supervisor
- Nginx & UFW Firewall
- Custom Domain & Https with Let's Encrypt
- Install Redis
Our Live Server
- Ubuntu 18.04
1. Celery + Redis + Django
We've covered this in detail before in this guide. Naturally, ignore the Heroku portion if you intend to do the rest of this guide.
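If you need a refresher, the heart of that setup is a celery.py module next to your Django settings. Here's a minimal sketch; the project package name myproject, settings path, and broker URL are assumptions, so adjust them to your own project:

```python
# myproject/celery.py -- minimal sketch; module and project names are assumptions
import os

from celery import Celery

# point Celery at your Django settings module (assumed path)
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")

app = Celery("myproject")
# pull CELERY_* settings (e.g. CELERY_BROKER_URL = "redis://localhost:6379/0")
# from Django's settings
app.config_from_object("django.conf:settings", namespace="CELERY")
# find tasks.py modules in your installed Django apps
app.autodiscover_tasks()
```

The -A myproject flag used in the worker command later in this post refers to this app instance.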
2. SSH into your server
Replace the user and IP address with your own, e.g.:
ssh root@your.server.ip
3. Install & Start Supervisor
Assuming you did this post recently, you shouldn't need to do this.
sudo apt-get update -y
sudo apt-get install supervisor -y
sudo service supervisor start
4. Install Celery in our Working Directory's Virtualenv
Find the Virtualenv bin that you created in this post
Mine is /var/www/myproject/bin/
/var/www/myproject/bin/python -m pip install "celery[redis]" redis
Fun note: if you already installed redis/celery locally, updated requirements.txt, and did a push, there's a good chance celery is already installed.
5. Verify Celery works
/var/www/myproject/bin/celery -A
You should see
usage: celery [-h] [-A APP] [-b BROKER] [--result-backend RESULT_BACKEND]
[--loader LOADER] [--config CONFIG] [--workdir WORKDIR]
[--no-color] [--quiet]
celery: error: argument -A/--app: expected one argument
You should not see:
-bash: /var/www/myproject/bin/celery: No such file or directory
6. Create a Supervisor Process
Almost the same exact instructions from here
With supervisor, we can run step 4 automatically, restart it if it fails, create logs for it, and start/stop it easily.
Basically, this ensures that our web server will continue to run if you push new code, server reboots/restarts/goes down and back up, etc.
Of course, if a catastrophic error occurs (or if bad code is in your Django project) then this process might fail as well.
All supervisor processes go in:
/etc/supervisor/conf.d/
So, if you ever need to add a new process, you'll just add it there.
Let's create our project's celery configuration file for supervisor.
touch /etc/supervisor/conf.d/myproject-celery.conf
Now you should see:
$ ls -al /etc/supervisor/conf.d/
myproject-celery.conf
Now, let's add the base settings:
[program:myproject_celery]
user=root
directory=/var/www/myproject/src/
command=/var/www/myproject/bin/celery -A myproject worker -l info
autostart=true
autorestart=true
stdout_logfile=/var/log/myproject/celery.log
stderr_logfile=/var/log/myproject/celery.err.log
Let's break it down line by line.
[program:myproject_celery] - names the process; this is what you'll reference in supervisorctl commands like:
- sudo supervisorctl status myproject_celery
- sudo supervisorctl start myproject_celery
- sudo supervisorctl stop myproject_celery
- sudo supervisorctl restart myproject_celery
user=root - the user the process runs as.
directory=/var/www/myproject/src/ - the working directory for the command.
command=... - the command supervisor runs and keeps alive; here, our Celery worker from step 5.
autostart - start this process automatically when supervisor starts.
autorestart - restart the process automatically if it exits.
stdout_logfile / stderr_logfile - where standard output and errors are logged.
7. Update Supervisor
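Before asking supervisor to reload, you can sanity-check the INI syntax of your config with Python's standard library. A quick sketch; the config text below mirrors the file we just created:

```python
# sanity-check the INI syntax of our supervisor config with the stdlib
import configparser

CONF = """\
[program:myproject_celery]
user=root
directory=/var/www/myproject/src/
command=/var/www/myproject/bin/celery -A myproject worker -l info
autostart=true
autorestart=true
stdout_logfile=/var/log/myproject/celery.log
stderr_logfile=/var/log/myproject/celery.err.log
"""

parser = configparser.ConfigParser()
parser.read_string(CONF)  # raises configparser.Error on a malformed file

section = parser["program:myproject_celery"]
print(section["command"])               # the command supervisor keeps alive
print(section.getboolean("autostart"))  # True
```

If parsing raises an error, fix the file before running the reload commands below.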
sudo supervisorctl reread
sudo supervisorctl update
8. Check Our Supervisor Program/Process Status
As we mentioned when we created the myproject_gunicorn supervisor program, we can now do:
sudo supervisorctl status myproject_celery
A few other useful commands (again):
- sudo supervisorctl start myproject_celery
- sudo supervisorctl stop myproject_celery
- sudo supervisorctl restart myproject_celery
Wrap Up
We now have Celery running as a supervisor-managed process alongside our web app: it starts on boot, restarts on failure, and logs to /var/log/myproject/.