Environment
1. Dependencies
python 2.7
Django==1.8.16
celery==3.1.25
django-celery==3.1.17
djangorestframework==3.5.3
django-filter==1.0.0
django-crispy-forms
2. Settings
In practice we keep the configuration separate: the Celery setup and initialization live apart from the tasks.
First, configure Django's django.conf, i.e. settings.py:
djcelery.setup_loader()
# BROKER_URL = 'django://'  # using Django itself as the broker is not recommended in production; prefer Redis or RabbitMQ
BROKER_URL = 'redis://10.xx.xx.xx:6379/0'  # use Redis as the broker
# accepted content types
CELERY_ACCEPT_CONTENT = ['pickle', 'json', 'yaml']
# CELERY_TASK_SERIALIZER = 'json'
# CELERY_RESULT_SERIALIZER = 'json'
# periodic tasks
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
CELERY_RESULT_BACKEND = 'redis://10.xx.xx.xx:6379/1'
# do not use UTC
CELERY_ENABLE_UTC = False
CELERY_TIMEZONE = 'Asia/Shanghai'
# task result expiry in seconds (the default is one day)
CELERY_TASK_RESULT_EXPIRES = 10
# log path
CELERYD_LOG_FILE = BASE_DIR + "/logs/celery/celery.log"
# beat log path
CELERYBEAT_LOG_FILE = BASE_DIR + "/logs/celery/beat.log"
The above is what I use in practice; for the full list of options see the Celery documentation.
Create celery.py
Create celery.py in the Django project package (in the same directory as settings.py):
# -*- coding: UTF-8 -*-
import os
from celery import Celery
# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'syslog.settings')
from django.conf import settings # noqa
app = Celery('xxx')
# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
With this setup Celery automatically discovers the tasks module of every installed app, which suits mid-to-large projects where several people develop multiple apps in parallel.
For details see: Using Celery with Django
Create the actual tasks in tasks.py
First create a Django app, then create a tasks.py file in the app directory.
We start with a simple addition task; it is enough to illustrate the examples that follow:
from xxx.celery import app
@app.task
def add(x, y):
return x + y
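A task defined this way is still an ordinary function: calling add(2, 3) runs it synchronously in-process, while add.delay(2, 3) hands it to a worker through the broker. The decorator below is a hypothetical stand-in for app.task, used here only so the two calling styles can be shown without a running broker:

```python
# Hypothetical stand-in for Celery's app.task decorator; with real Celery,
# .delay() sends a message to the broker and returns an AsyncResult
# instead of the computed value.
def task(fn):
    fn.delay = fn  # inline call in this sketch; asynchronous with real Celery
    return fn

@task
def add(x, y):
    return x + y

print(add(2, 3))        # direct call: runs synchronously, prints 5
print(add.delay(2, 3))  # with real Celery this would return an AsyncResult
```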
Create a view that uses the djcelery model API
from django.http import JsonResponse
from djcelery import models as celery_models


def _create_task(name, task, task_args, crontab_time):
    '''
    name          # task name
    task          # task to run, e.g. "myapp.tasks.add"
    task_args     # task arguments, e.g. {"x": 1, "y": 1}
    crontab_time  # schedule, in the form:
        {
            'month_of_year': '*',  # month
            'day_of_month': '*',   # day of month
            'hour': '*',           # hour
            'minute': '*/2',       # minute
        }
    '''
    # task is the PeriodicTask instance; created says whether it was newly created
    task, created = celery_models.PeriodicTask.objects.get_or_create(name=name, task=task)
    # look up an existing crontab
    crontab = celery_models.CrontabSchedule.objects.filter(**crontab_time).first()
    if crontab is None:
        # create one if none exists; otherwise reuse the existing crontab
        crontab = celery_models.CrontabSchedule.objects.create(**crontab_time)
    task.crontab = crontab  # attach the crontab
    task.enabled = True     # enable the task
    # task.args = [int(x) for x in json.loads(task_args)]  # variant for a JSON-string argument
    task.args = [int(x) for x in task_args]  # pass the task arguments
    task.save()
    return True
def create(self, request, *args, **kwargs):
    tasks = ['event_alarm', 'continue_event_alarm', 'condition_alarm']
    blog_item_id = request.data['blogitem_id']
    event_type = request.data['event_type']
    intval = request.data['intval']
    name = str(blog_item_id)
    # event_type 0/1/2 selects the task; the dotted path is built the same way in every case
    task = '.'.join(['syslogalery.tasks', tasks[int(event_type)]])
    task_args = [blog_item_id, intval]
    crontab_time = {
        'month_of_year': '*',          # month
        'day_of_month': '*',           # day of month
        'hour': '*',                   # hour
        'minute': '*/' + str(intval),  # minute
    }
    if _create_task(name, task, task_args, crontab_time):
        return JsonResponse({'code': 200, 'msg': 'create success'})
    else:
        return JsonResponse({'code': 400, 'msg': 'failed'})
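The dotted task path and the crontab dict that this view hands to _create_task can be exercised in isolation. A pure-Python sketch (the module path syslogalery.tasks and the task names come from the view above; build_task is an illustrative helper, not part of the original code):

```python
# Pure-Python sketch of what the view builds for _create_task.
TASKS = ['event_alarm', 'continue_event_alarm', 'condition_alarm']

def build_task(event_type, intval):
    # dotted path into the app's tasks module, e.g. syslogalery.tasks.event_alarm
    task = '.'.join(['syslogalery.tasks', TASKS[int(event_type)]])
    # run every `intval` minutes
    crontab_time = {
        'month_of_year': '*',
        'day_of_month': '*',
        'hour': '*',
        'minute': '*/' + str(intval),
    }
    return task, crontab_time

task, crontab_time = build_task(1, 5)
print(task)                    # syslogalery.tasks.continue_event_alarm
print(crontab_time['minute'])  # */5
```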
Problems encountered
During the development above, starting with celery -A xxx worker -l info -B made the custom schedules take no effect: every task ran every 5 seconds, and at the time I did not understand why.
Cause: see the 2017.2.24 update
Workaround:
start with the following command instead:
python2.7 manage.py celery -A syslog worker -B -l info
2017.2.24 update
Fix for schedules not taking effect
By chance I redeployed the periodic-scheduling program described above and the problem reappeared: the schedules were ignored and everything ran every 5 seconds. This time I had the time to search properly, found that many people hit the same issue, and verified the fix:
it is the UTC setting: CELERY_ENABLE_UTC = True
celery issue #943: Celerybeat runs periodic tasks every 5 seconds regardless of interval
There’s definitely a bug somewhere in celery.schedules.crontab.remaining_delta() or surrounding it and it is related to timezones. I just have no idea what it is exactly.
The bug does not seem to reproduce when I specify CELERY_ENABLE_UTC=True so there’s might be a workaround for this bug.
Always make timezones aware in the schedule even if UTC is disabled #2666
Fixes #943.
@monax Please verify that this works for you.
In any event, setting CELERY_UTC_ENABLE to true is a good idea if you are using a Django version newer than 1.4.
After the change above, starting with celery -A xxx worker -l info -B works as well.
Notes on deploying Celery under supervisor
The recommended command is /usr/bin/celery -A xxx worker -B --logfile=/opt/logfilter/syslog/logs/celery/celery.log --loglevel=INFO
The short -l form is no longer recommended and is to be disabled in Celery 4.x. Separately, if you hit the following error and Celery fails to start:
Running a worker with superuser privileges when the
worker accepts messages serialized with pickle is a very bad idea!
Fix: start Celery as a non-root user.
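Putting both notes together (long-form flags, non-root user), a supervisor program section might look like the sketch below; the user name and working directory are assumptions for illustration:

```ini
[program:celery]
; run the worker as an unprivileged user so the pickle/superuser refusal above does not trigger
user=celery
command=/usr/bin/celery -A syslog worker -B --logfile=/opt/logfilter/syslog/logs/celery/celery.log --loglevel=INFO
directory=/opt/logfilter/syslog
autostart=true
autorestart=true
```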
During deployment I hit django.core.exceptions.ImproperlyConfigured: The SECRET_KEY must not be empty.
- Cause: Celery looks for settings.py by default and did not find it
- Fix: during development you often generate several settings files for different situations, but be sure to restore settings.py before deploying