My environment: Celery 3.1.25, Python 3.6.9, Windows 10.
The Celery task code is shown below; QuotesSpider is the spider class from my Scrapy project.
from celery_app import app
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings
from tutorial.spiders.quotes import QuotesSpider

def crawl_run():
    scope = 'all'
    process = CrawlerProcess(settings=get_project_settings())
    process.crawl(QuotesSpider, scope)  # scope is forwarded to the spider's __init__
    process.start()                     # blocks until the crawl finishes
    process.join()

@app.task(queue='default')
def execute_task():
    return crawl_run()
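One pitfall to be aware of: Twisted's reactor cannot be restarted within the same process, so if a long-lived Celery worker runs this task more than once, the second call to process.start() will typically fail with ReactorNotRestartable. A common workaround is to launch each crawl in a fresh child process. The sketch below illustrates that pattern with Python's multiprocessing module; the crawl_run body is a stand-in print (the real one would build the CrawlerProcess as above), so the names and structure here are illustrative assumptions, not the original author's code.

```python
from multiprocessing import Process

def crawl_run():
    # Stand-in for the real crawl: in practice this would create the
    # CrawlerProcess and call process.start() as in the task above.
    print("crawl finished")

def execute_task():
    # Each task invocation gets its own child process, so Twisted's
    # reactor starts clean every time instead of being restarted.
    p = Process(target=crawl_run)
    p.start()
    p.join()          # wait for the crawl to complete
    return p.exitcode # 0 means the child exited cleanly

if __name__ == "__main__":
    execute_task()
```

In the real task you would decorate execute_task with @app.task(queue='default') exactly as before; only the body changes to spawn the child process.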
Original post: https://www.cnblogs.com/WalkOnMars/p/11558560.html