
Django 1.9 + Celery + django-celery: implementing periodic tasks



Celery is best known for asynchronous task processing, but another common Celery pattern is running periodic tasks. Periodic tasks are handled by the celery beat process: celerybeat keeps running, and as soon as a periodic task is due, it pushes that task onto the queue.
Configuration

 

 

So how do we use it together with Django?

 

It's actually quite simple; let's walk through it with a sample project.

Project overview

celery==3.1.23

Django==1.9

django-celery==3.1.17

flower==0.9.2

Install exactly these versions. Version mismatches cause failures that are hard to trace, and you will have to debug them on your own.

Create a virtual environment

virtualenv my-celery_v1 -p python2.7
cd my-celery_v1/
source bin/activate
pip install django==1.9 django-celery==3.1.17 celery==3.1.23 flower==0.9.2 redis
django-admin startproject proj
cd proj
python manage.py startapp demoapp

 

Edit the settings file settings.py:


INSTALLED_APPS = [
    'django.contrib.admin',
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'djcelery',
    'demoapp',
]

import djcelery
djcelery.setup_loader()
BROKER_URL = 'redis://localhost:6379'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'  # periodic tasks
CELERY_RESULT_BACKEND = 'djcelery.backends.database:DatabaseBackend'
# CELERY_RESULT_BACKEND = 'redis://localhost:6379'
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = 'Asia/Shanghai'
CELERY_LOG_FILE = os.path.join(BASE_DIR, 'logs', 'celery', 'celery.log')
CELERYBEAT_LOG_FILE = os.path.join(BASE_DIR, 'logs', 'celery', 'beat.log')
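With the DatabaseScheduler configured above, the beat schedule lives in the database and can be edited through the Django admin. If you would rather keep the schedule in code, a minimal sketch (assuming the demoapp.tasks.add and demoapp.tasks.mul tasks defined later in this post) looks like this:

# Optional: a static schedule kept in settings.py; only relevant if you
# do not want to manage periodic tasks through the database/admin.
from datetime import timedelta
from celery.schedules import crontab

CELERYBEAT_SCHEDULE = {
    'add-every-30-seconds': {
        'task': 'demoapp.tasks.add',
        'schedule': timedelta(seconds=30),
        'args': (2, 3),
    },
    'mul-every-morning': {
        'task': 'demoapp.tasks.mul',
        'schedule': crontab(hour=7, minute=30),  # 07:30 every day
        'args': (4, 5),
    },
}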

 

Create proj/celery.py:

#!/bin/python
# -*- coding:utf-8 -*-

from __future__ import absolute_import

import os

from celery import Celery, platforms

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')
# Specifying the settings here means the celery command line program will know where your Django project is.
# This statement must always appear before the app instance is created, which is what we do next:
from django.conf import settings

app = Celery('proj')

app.config_from_object('django.conf:settings')
platforms.C_FORCE_ROOT = True  # allow the worker to run as root
# This means that you don't have to use multiple configuration files, and instead configure Celery directly from the Django settings.
# You can pass the object directly here, but using a string is better since then the worker doesn't have to serialize the object.

app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


# With the line above Celery will automatically discover tasks in reusable apps if you define all tasks in a separate tasks.py module.
# The tasks.py should be in a dir which is added to INSTALLED_APPS in settings.py.
# So you do not have to manually add the individual modules to CELERY_IMPORTS in settings.py.

@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))  # dumps its own request information
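Once a worker is running (started further below), a quick way to sanity-check the wiring is from python manage.py shell; this is only an illustrative check:

# Run inside `python manage.py shell`, with a worker already started.
from proj.celery import app, debug_task

debug_task.delay()                      # the worker log should show the request info
print(app.conf.CELERYBEAT_SCHEDULER)    # prints 'djcelery.schedulers.DatabaseScheduler'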

In proj/__init__.py:

#!/bin/python
from __future__ import absolute_import

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app
__all__ = ['celery_app']

In the app, demoapp/tasks.py:

from __future__ import absolute_import

import time
from celery import task

from celery import shared_task


# from celery.task import tasks
# from celery.task import Task

# @task()
@shared_task
def add(x, y):
    print "%d + %d = %d" % (x, y, x + y)
    return x + y


# class AddClass(Task):
#    def run(x,y):
#        print "%d + %d = %d"%(x,y,x+y)
#        return x+y
# tasks.register(AddClass)

@shared_task
def mul(x, y):
    print "%d * %d = %d" % (x, y, x * y)
    return x * y


@shared_task
def sub(x, y):
    print "%d - %d = %d" % (x, y, x - y)
    return x - y


@task
def sendmail(mail):
    print "sending mail to %s..." % mail
    time.sleep(2.0)
    print "mail sent."
    print "------------------------------------"
    return mail
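The same tasks can also be triggered on demand. For example, from python manage.py shell with a worker running:

# Enqueue a task and (optionally) wait for its result via the database result backend.
from demoapp.tasks import add

result = add.delay(4, 6)         # returns an AsyncResult immediately
print result.get(timeout=10)     # blocks until the worker stores the result (10)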

 

Before running anything, create the logs/celery log directory, then migrate the database and create a superuser.
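For example:

mkdir -p logs/celery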

python manage.py migrate 
python manage.py createsuperuser
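Because CELERYBEAT_SCHEDULER points at the DatabaseScheduler, periodic tasks are stored in the database: after logging in with the superuser you can add them in the Django admin (django-celery registers the periodic task models there), or create them programmatically. A rough sketch, where the task name 'add every 10 seconds' is just an example:

# Run inside `python manage.py shell`: schedule demoapp.tasks.add every 10 seconds.
from djcelery.models import IntervalSchedule, PeriodicTask

schedule, _ = IntervalSchedule.objects.get_or_create(every=10, period='seconds')
PeriodicTask.objects.get_or_create(
    name='add every 10 seconds',     # arbitrary but must be unique
    task='demoapp.tasks.add',
    interval=schedule,
    args='[2, 3]',                   # JSON-encoded positional arguments
)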

 

First start the worker (make sure a local Redis server is running, since it is the broker configured above). If no errors are reported, all is well:

python manage.py celery worker --loglevel=info --settings=proj.settings  --autoreload

[screenshot: celery worker startup output]

 

Start the beat scheduler for the periodic tasks:

python manage.py celery beat

 

[screenshot: celery beat startup output]

 

In addition, to get a better view of how tasks are being executed, we also use flower as a web dashboard, accessible on port 5555:

python manage.py celery flower -l info --settings=proj.settings

 

[screenshot: flower startup output]

 

Start Django:

python manage.py runserver

 

[screenshot: Django development server output]

Execution results:

[screenshot: task execution results in the worker log]

For a more visual view of the results, open 127.0.0.1:5555:

[screenshot: flower dashboard at 127.0.0.1:5555]

 


Original article: http://www.cnblogs.com/gavin002/p/7142168.html
