
Asynchronous Calls and the Callback Mechanism



 

There are two ways to submit a task:

Synchronous call: after submitting a task, wait in place until it finishes and the result is returned, then move on to the next line of code; the program therefore runs serially.

Asynchronous call: after submitting a task, do not wait for it to finish; continue immediately and let a callback handle the result when it is ready.
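
As a minimal sketch of the difference (using a hypothetical task function, not part of the example below), the two styles look like this with ThreadPoolExecutor:

from concurrent.futures import ThreadPoolExecutor

def task(x):
    return x * x

pool = ThreadPoolExecutor(4)

# synchronous style: .result() blocks right here until task(3) has finished
print(pool.submit(task, 3).result())

# asynchronous style: attach a callback and move on immediately;
# the callback is handed the finished Future object
pool.submit(task, 4).add_done_callback(lambda fut: print(fut.result()))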

 

from concurrent.futures import ThreadPoolExecutor
import time, random

def la(name):
    print('%s is laing' % name)
    time.sleep(random.randint(3, 5))
    res = random.randint(7, 13) * '#'
    return {'name': name, 'res': res}

def weigh(shit):
    shit = shit.result()  # in an async callback the argument is a Future; fetch its return value first
    name = shit['name']
    size = len(shit['res'])
    print('%s produced <%s> kg' % (name, size))

if __name__ == '__main__':
    pool = ThreadPoolExecutor(13)

    # Synchronous calls: .result() blocks until each task is done, so the tasks run one after another
    # shit1 = pool.submit(la, 'alex').result()
    # weigh(shit1)
    # shit2 = pool.submit(la, 'huhao').result()
    # weigh(shit2)
    # shit3 = pool.submit(la, 'zhanbin').result()
    # weigh(shit3)

    # Asynchronous calls: submit the tasks and attach weigh as the done-callback; nothing blocks here
    pool.submit(la, 'alex').add_done_callback(weigh)
    pool.submit(la, 'huhao').add_done_callback(weigh)
    pool.submit(la, 'zhanbin').add_done_callback(weigh)
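
Note that add_done_callback hands the callback the Future object itself, which is why weigh() has to call .result() first; calling .result() inside the callback also re-raises any exception the task raised. A small sketch of that error path (the task/on_done names are hypothetical, not from the original example):

from concurrent.futures import ThreadPoolExecutor

def task(n):
    if n == 0:
        raise ValueError('n must be positive')
    return n * 2

def on_done(fut):
    try:
        print('result:', fut.result())  # re-raises the task's exception, if any
    except ValueError as e:
        print('task failed:', e)

if __name__ == '__main__':
    with ThreadPoolExecutor(2) as pool:
        pool.submit(task, 3).add_done_callback(on_done)
        pool.submit(task, 0).add_done_callback(on_done)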

 

A simple web-crawler example:

import requests, time
from concurrent.futures import ThreadPoolExecutor

def get(url):
    print('get url', url)
    response = requests.get(url)
    time.sleep(3)
    return {'url': url, 'content': response.text}

def parse(res):
    res = res.result()  # the callback receives a Future; unwrap the dict returned by get()
    print('%s parse res is %s' % (res['url'], len(res['content'])))

if __name__ == '__main__':
    urls = [
        'http://www.cnblogs.com/stin',
        'https://www.python.org',
        'https://www.openstack.org',
    ]
    pool = ThreadPoolExecutor(2)
    for url in urls:
        pool.submit(get, url).add_done_callback(parse)
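
If you would rather collect results in the main thread instead of inside callbacks, concurrent.futures.as_completed offers an equivalent pattern. A minimal sketch, assuming the same get() function and urls list as above:

from concurrent.futures import ThreadPoolExecutor, as_completed

with ThreadPoolExecutor(2) as pool:
    futures = [pool.submit(get, url) for url in urls]
    for fut in as_completed(futures):  # yields each Future as soon as it finishes
        res = fut.result()
        print('%s parse res is %s' % (res['url'], len(res['content'])))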

 


Original article: https://www.cnblogs.com/stin/p/8548454.html
