A task queue is used to distribute work across threads or machines. Celery communicates via message passing: a client puts a task message on the queue, a broker delivers it, and a worker executes it. Because it supports multiple workers and multiple brokers, it offers high availability and horizontal scalability. Celery is written in Python.
Its main building blocks are:
- Brokers: the message transports that deliver tasks to workers (e.g. RabbitMQ, Redis)
- Result Stores: backends that save task state and return values
- Concurrency: the worker execution pools (prefork processes, eventlet/gevent, and so on)
- Serialization: the format used to encode task messages (pickle, JSON, YAML, msgpack)
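Each of these is selected through the app's configuration. A minimal sketch, using the uppercase Celery 3.x setting names that match the rest of this post (the values are only illustrative):

from celery import Celery

app = Celery('tasks')
app.conf.update(
    BROKER_URL='amqp://guest:guest@localhost:5672//',  # broker
    CELERY_RESULT_BACKEND='amqp',                      # result store
    CELERYD_CONCURRENCY=4,                             # concurrency: number of worker processes
    CELERY_TASK_SERIALIZER='json',                     # serialization format for task messages
)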
Install Celery with pip:
pip install celery
beanstalkc is the Python client for beanstalkd; it is only needed if you want to try Beanstalk as the broker:
pip install beanstalkc
This post uses RabbitMQ as the broker. Install it and create a dedicated user and virtual host:
$ sudo apt-get install rabbitmq-server
$ sudo rabbitmqctl add_user myuser mypassword
$ sudo rabbitmqctl add_vhost myvhost
$ sudo rabbitmqctl set_permissions -p myvhost myuser ".*" ".*" ".*"
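With that user and vhost in place, the broker URL for Celery would look like the line below (assuming RabbitMQ runs locally on the default port 5672); the example that follows simply sticks with the default guest account:

BROKER_URL = 'amqp://myuser:mypassword@localhost:5672/myvhost'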
First, create a Celery instance. Save the following as tasks.py; the module name matters because the worker is started with -A tasks:
from celery import Celery

BROKER_URL = 'amqp://guest:guest@localhost:5672//'

app = Celery('tasks', broker=BROKER_URL)

@app.task
def add(x, y):
    return x + y
Now start a worker from the directory that contains tasks.py:
$ celery -A tasks worker --loglevel=info
In production you would not keep the worker in a foreground terminal; run it as a daemon instead, for example under supervisord.
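A supervisord program section for this worker might look like the sketch below; the paths are placeholders for your own virtualenv and project directory:

[program:celery-worker]
; run the worker from the project's virtualenv
command=/path/to/virtualenv/bin/celery -A tasks worker --loglevel=info
; directory containing tasks.py
directory=/path/to/project
autostart=true
autorestart=true
; give running tasks time to finish on shutdown
stopwaitsecs=600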
To call the task, open a Python shell in the same directory and use delay():
>>> from tasks import add
>>> add.delay(4, 4)
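delay() is a shortcut for apply_async(), which accepts additional execution options. For example (the countdown value here is arbitrary):

>>> add.apply_async((4, 4), countdown=10)  # execute no earlier than 10 seconds from now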
Calling delay() returns an AsyncResult, but without a result backend you cannot query the task's state or return value. To keep track of results, configure a backend when creating the app, here the AMQP result backend:
app = Celery('tasks', backend='amqp', broker='amqp://')
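Other backends work the same way; for example, assuming a Redis server is running locally, you could store results in Redis instead:

app = Celery('tasks', backend='redis://localhost:6379/0', broker='amqp://guest:guest@localhost:5672//')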
With a backend in place, you can check whether the task has finished and fetch its return value:
>>> result = add.delay(4, 4)
>>> result.ready()
True
>>> result.get(timeout=1)
8
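get() re-raises any exception the task itself raised. If you would rather inspect a failure than have it propagate, AsyncResult also exposes the state and traceback; a small sketch:

>>> result.get(timeout=1, propagate=False)  # return the exception object instead of re-raising it
>>> result.state                            # e.g. 'SUCCESS', 'FAILURE', 'PENDING'
>>> result.traceback                        # traceback string if the task failed, otherwise None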
Original post: http://www.cnblogs.com/erhuabushuo/p/3820331.html