celery
Install celery
pip install "celery[redis]"
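To confirm the package (and the Redis client pulled in by the extra) installed correctly, a quick check; exact output differs by version:
celery --version
python -c "import celery, redis; print(celery.__version__, redis.__version__)"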
Add Celery settings to the Django project
In config/celery.py
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
# Must match your project's settings module; the project package here is `config`.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings")
app = Celery('config')
# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
# should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')
# Load task modules from all registered Django app configs.
app.autodiscover_tasks()
# app.conf.broker_url = 'redis://localhost:6379/0'
@app.task(bind=True)
def debug_task(self):
print('Request: {0!r}'.format(self.request))
In config/__init__.py
from __future__ import absolute_import, unicode_literals
# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ('celery_app',)
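With autodiscover_tasks() above, Celery picks up a tasks.py module in each installed Django app. A minimal sketch of such a module (the app name myapp and the add task are placeholders, not part of this project):
# myapp/tasks.py
from celery import shared_task

@shared_task
def add(x, y):
    # Executed by a worker process; the return value goes to the result backend.
    return x + y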
In settings.py
CELERY_BROKER_URL = 'redis://localhost:6379'
CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
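Because of namespace='CELERY' in celery.py, these CELERY_-prefixed settings map onto Celery's own option names (broker_url, result_backend, and so on). A quick way to confirm they are picked up, from the Django shell:
python manage.py shell
>>> from config.celery import app
>>> app.conf.broker_url
'redis://localhost:6379'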
Run Redis with Docker
sudo docker run -d -p 6379:6379 redis
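Before daemonizing, it is worth checking the pieces by hand. This sketch assumes the placeholder myapp.tasks.add from above; <container_id> is whatever `docker ps` shows for the Redis container:
# Is Redis reachable?
sudo docker exec -it <container_id> redis-cli ping   # expect PONG
# Run a worker in the foreground from the project root
celery -A config worker --loglevel=info
# In another shell, queue a task and wait for the result
python manage.py shell
>>> from myapp.tasks import add
>>> add.delay(2, 3).get(timeout=10)
5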
Daemonize
https://pythad.github.io/articles/2016-12/how-to-run-celery-as-a-daemon-in-production
create /etc/init.d/celeryd
sudo vim /etc/init.d/celeryd
Copy and paste the generic celeryd init script from the Celery repository (extra/generic-init.d/celeryd)
sudo chmod 755 /etc/init.d/celeryd
sudo chown root:root /etc/init.d/celeryd
create /etc/default/celeryd
sudo vim /etc/default/celeryd
Add the example below, changing these values for your project:
- CELERY_BIN: the celery binary inside your virtualenv
- CELERY_APP: the package that contains celery.py (here, config)
- CELERYD_CHDIR: your project directory
- CELERYD_USER, CELERYD_GROUP: your user and group
- Add any environment variables your project needs (see the export line at the end)
# Absolute path to the celery binary inside the project virtualenv
CELERY_BIN="/home/username/project/venv/bin/celery"
# App instance to use
CELERY_APP="config"
# Where to chdir at start.
CELERYD_CHDIR="/home/username/project/"
# Extra command-line arguments to the worker
CELERYD_OPTS="--time-limit=300 --concurrency=8"
# %n will be replaced with the first part of the nodename.
CELERYD_LOG_FILE="/var/log/celery/%n%I.log"
CELERYD_PID_FILE="/var/run/celery/%n.pid"
# Workers should run as an unprivileged user.
# You need to create this user manually (or you can choose
# a user/group combination that already exists, e.g. nobody).
CELERYD_USER="username"
CELERYD_GROUP="username"
# If enabled pid and log directories will be created if missing,
# and owned by the userid/group configured.
CELERY_CREATE_DIRS=1
# Add all env for your project
export SECRET_KEY="foobar"
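Variables exported here are only visible to the worker processes started by this script; if settings.py expects them, read them from the environment. A minimal sketch (SECRET_KEY matches the export above; adapt to however your project actually loads configuration):
# settings.py (sketch)
import os

# Fails fast at startup if the variable is missing.
SECRET_KEY = os.environ["SECRET_KEY"]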
Run celery
sudo /etc/init.d/celeryd start
sudo /etc/init.d/celeryd status
sudo /etc/init.d/celeryd stop
# see log
sudo vim /var/log/celery/worker1.log
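To confirm the daemonized workers are actually connected to the broker, you can also ping them with celery status from the project directory, using the virtualenv's celery binary (paths follow the example config above):
cd /home/username/project/
venv/bin/celery -A config status
# expect something like: celery@hostname: OK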