virtualenv scrapyEnv
pip install scrapy requests pywin32 selenium
scrapy startproject ArticleSpider
scrapy genspider -t crawl spidername domain
scrapy genspider spidername domain
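For reference, the second command (default basic template) generates a spider skeleton roughly like the one below; the spider name and domain shown here are placeholders, and the exact template can vary slightly between Scrapy versions.

import scrapy


class SpidernameSpider(scrapy.Spider):
    name = "spidername"
    allowed_domains = ["example.com"]
    start_urls = ["http://example.com/"]

    def parse(self, response):
        # parsing logic for each response goes here
        pass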
Create a main.py file in the root directory of the Scrapy project and put the following code in it, so the spider can be run from a script.
from scrapy.cmdline import execute
import sys
import os

# sys.path.append("C:\Users\CZN\PycharmProjects\ArticleSpider")  # hard-coding the project path also works
# print(os.path.dirname(os.path.abspath(__file__)))
sys.path.append(os.getcwd())  # add the ArticleSpider project directory to sys.path
execute(["scrapy", "crawl", "jobbole"])  # equivalent to running: scrapy crawl jobbole
Original post: https://www.cnblogs.com/zenan/p/9050341.html