The multiprocessing module spawns processes using an API similar to the threading module. The multiprocessing package offers both local and remote concurrency, effectively side-stepping the GIL (Global Interpreter Lock) by using subprocesses instead of threads. Because of this, the multiprocessing module allows a program to fully leverage multiple processors. It works across platforms, including Unix and Windows. -- https://docs.python.org/2/library/multiprocessing.html
Note: Some of this package's functionality requires a functioning shared semaphore implementation on the host operating system. Without one, the multiprocessing.synchronize module will be disabled, and attempts to import it will result in an ImportError. See issue 3770 for additional information.
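A quick way to check this on your own machine, sketched here on top of that note (not part of the original post), is simply to attempt the import:

try:
    import multiprocessing.synchronize
    print "shared semaphore support is available"
except ImportError:
    # per the docs note above, platforms without a working shared semaphore
    # implementation disable multiprocessing.synchronize entirely
    print "no shared semaphore implementation on this platform"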
That's it for the brief introduction. Now let's start learning by example.
# -*- coding: utf-8 -*-
from multiprocessing import Process

def f(name, Blog):
    print "hello", name
    print "Blog:", Blog

if __name__ == '__main__':
    p = Process(target=f, args=("The_Third_Wave", "http://blog.csdn.net/zhanh1218"))
    p.start()
    p.join()

The output is:
hello The_Third_Wave
Blog: http://blog.csdn.net/zhanh1218

Note: do not run this code in IDLE, or you will see nothing at all. Run it in your own IDE instead; I use Eclipse. I have not yet figured out the exact reason -- to be filled in later.
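As a small follow-up sketch of my own (not from the original example), a Process object also exposes its pid and is_alive(), which can help when you are debugging why output does or does not appear:

from multiprocessing import Process, current_process
import os

def f(name):
    # report which process actually ran the function
    print "hello", name, "from pid", os.getpid(), current_process().name

if __name__ == '__main__':
    p = Process(target=f, args=("The_Third_Wave",))
    p.start()
    print "child pid:", p.pid, "alive:", p.is_alive()
    p.join()
    print "after join, alive:", p.is_alive()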
# -*- coding: utf-8 -*-
from multiprocessing import Process
import os, time, datetime, random
from multiprocessing import Pool

def task_1(name):
    print 'Run task_1 %s (%s)...' % (name, os.getpid())
    time.sleep(random.randint(0, 3))
    print 'id = %s over at %s' % (name, datetime.datetime.now())

def task_2(x):
    time.sleep(1)
    print "x = %s is run at %s" % (x, datetime.datetime.now())
    return x*x

if __name__ == '__main__':
    print 'Parent process pid is %s. Start at %s' % (os.getpid(), datetime.datetime.now())
    p = Pool()
    for i in range(5):
        p.apply_async(task_1, args=(i,))
    result = p.apply_async(task_2, [100])
    print result.get(timeout=10)
    print p.map(task_2, range(10))
    print "HERE Time is %s" % datetime.datetime.now()
    p.map(task_1, range(6, 10))
    p.close()
    p.join()
    print 'All subprocesses done at %s' % datetime.datetime.now()

The output is:
Parent process pid is 4852. Start at 2014-06-12 16:27:13.824000
10000
[0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
HERE Time is 2014-06-12 16:27:17.105000
Run task_1 4 (9104)...
id = 4 over at 2014-06-12 16:27:16.047000
x = 6 is run at 2014-06-12 16:27:17.047000
x = 1 is run at 2014-06-12 16:27:16.105000
x = 8 is run at 2014-06-12 16:27:17.105000
Run task_1 0 (8604)...
id = 0 over at 2014-06-12 16:27:14.044000
Run task_1 2 (8604)...
id = 2 over at 2014-06-12 16:27:15.046000
x = 4 is run at 2014-06-12 16:27:16.105000
x = 7 is run at 2014-06-12 16:27:17.105000
x = 0 is run at 2014-06-12 16:27:16.105000
x = 9 is run at 2014-06-12 16:27:17.105000
x = 100 is run at 2014-06-12 16:27:15.097000
x = 5 is run at 2014-06-12 16:27:16.105000
Run task_1 7 (4152)...
id = 7 over at 2014-06-12 16:27:18.105000
x = 2 is run at 2014-06-12 16:27:16.105000
Run task_1 6 (7524)...
id = 6 over at 2014-06-12 16:27:18.105000
x = 3 is run at 2014-06-12 16:27:16.105000
Run task_1 8 (7500)...
id = 8 over at 2014-06-12 16:27:18.105000
Run task_1 1 (7392)...
id = 1 over at 2014-06-12 16:27:14.045000
Run task_1 3 (7392)...
id = 3 over at 2014-06-12 16:27:17.046000
Run task_1 9 (7392)...
id = 9 over at 2014-06-12 16:27:20.105000
All subprocesses done at 2014-06-12 16:27:20.149000

Notes: join([timeout]) blocks the calling thread until the process whose join() method is called terminates (or the optional timeout expires). For a Pool, close() must be called before join(); close() prevents any more tasks from being submitted to the pool, i.e. no new work can be added after it.
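As an extra sketch of my own (not from the original post): Pool.map() blocks until every result is ready, while map_async() returns immediately and hands back an AsyncResult whose get() blocks instead. The square() helper and the worker count below are illustrative choices:

from multiprocessing import Pool
import time

def square(x):
    time.sleep(0.5)
    return x * x

if __name__ == '__main__':
    pool = Pool(processes=4)            # explicit worker count instead of the default
    async_result = pool.map_async(square, range(8))
    print "map_async returned immediately"
    print async_result.get(timeout=10)  # blocks here until the workers finish
    pool.close()                        # no more tasks may be submitted after close()
    pool.join()                         # wait for the workers to exit; must come after close()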
from multiprocessing import Process, Queue

def f(q):
    q.put("Hello The_Third_Wave")

if __name__ == '__main__':
    q = Queue()
    p = Process(target=f, args=(q,))
    p.start()
    print q.get()
    p.join()
The output is:

Hello The_Third_Wave

Queues are thread and process safe.
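Since Queue.get() blocks until an item arrives, a common pattern is a producer/consumer pair that stops on a sentinel value. A minimal sketch of my own (the worker() name and the None sentinel are illustrative, not from the original post):

from multiprocessing import Process, Queue

def worker(q):
    while True:
        item = q.get()        # blocks until an item is available
        if item is None:      # sentinel: no more work
            break
        print "processing", item

if __name__ == '__main__':
    q = Queue()
    p = Process(target=worker, args=(q,))
    p.start()
    for i in range(3):
        q.put(i)
    q.put(None)               # tell the worker to stop
    p.join()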
from multiprocessing import Process, Pipe

def f(conn):
    conn.send("Hello The_Third_Wave")
    conn.close()

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()
    p = Process(target=f, args=(child_conn,))
    p.start()
    print parent_conn.recv()
    p.join()
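Running this prints Hello The_Third_Wave on the parent side. A Pipe() is duplex by default, so both ends can send and receive. Here is a small two-way sketch of my own (the echo() name is illustrative):

from multiprocessing import Process, Pipe

def echo(conn):
    msg = conn.recv()               # wait for a message from the parent
    conn.send("echo: " + msg)       # reply over the same connection
    conn.close()

if __name__ == '__main__':
    parent_conn, child_conn = Pipe()   # duplex by default
    p = Process(target=echo, args=(child_conn,))
    p.start()
    parent_conn.send("Hello The_Third_Wave")
    print parent_conn.recv()           # prints: echo: Hello The_Third_Wave
    p.join()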
# -*- coding: utf-8 -*-
from multiprocessing import Process, Lock
import os, time, datetime, random

def task_1(lock, name):
    lock.acquire()
    print 'Run task_1 %s (%s)...' % (name, os.getpid())
    time.sleep(random.randint(1, 3))
    print 'id = %s over at %s' % (name, datetime.datetime.now())
    lock.release()

if __name__ == '__main__':
    print 'Parent process pid is %s. Start at %s' % (os.getpid(), datetime.datetime.now())
    lock = Lock()
    for i in range(5):
        Process(target=task_1, args=(lock, i)).start()
    print 'Parent process done at %s' % datetime.datetime.now()

The output is:
Parent process pid is 7908. Start at 2014-06-12 17:09:12.726000
Parent process done at 2014-06-12 17:09:12.861000
Run task_1 1 (7104)...
id = 1 over at 2014-06-12 17:09:14.892000
Run task_1 0 (2440)...
id = 0 over at 2014-06-12 17:09:15.892000
Run task_1 2 (8812)...
id = 2 over at 2014-06-12 17:09:16.892000
Run task_1 3 (2552)...
id = 3 over at 2014-06-12 17:09:19.892000
Run task_1 4 (2172)...
id = 4 over at 2014-06-12 17:09:22.893000

You can see that the parent process has already finished, while the child processes, because of the lock, are still running one at a time.
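A Lock can also be used as a context manager, which releases it even if the task raises an exception. A minimal sketch of my own, assuming the same Python 2 setup:

from multiprocessing import Process, Lock
import os

def task(lock, name):
    with lock:                                   # acquire() / release() handled for us
        print 'task %s running in pid %s' % (name, os.getpid())

if __name__ == '__main__':
    lock = Lock()
    procs = [Process(target=task, args=(lock, i)) for i in range(3)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()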
As mentioned above, when doing concurrent programming it is usually best to avoid using shared state as far as possible. This is particularly true when using multiple processes.
However, if you really do need to use some shared data then multiprocessing provides a couple of ways of doing so.
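One of those ways is shared memory via Value and Array; the sketch below follows the shared-memory example in the official documentation:

from multiprocessing import Process, Value, Array

def f(n, a):
    n.value = 3.1415927          # write into the shared double
    for i in range(len(a)):
        a[i] = -a[i]             # negate each element of the shared int array

if __name__ == '__main__':
    num = Value('d', 0.0)        # 'd' = double
    arr = Array('i', range(10))  # 'i' = signed int
    p = Process(target=f, args=(num, arr))
    p.start()
    p.join()
    print num.value              # 3.1415927
    print arr[:]                 # [0, -1, -2, ..., -9]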
To be continued...
This article is an original post by @The_Third_Wave (blog: http://blog.csdn.net/zhanh1218). It is updated from time to time; please point out any mistakes.
Follow on Sina Weibo: @The_Third_Wave
If this post helped you, please bookmark it rather than repost it, for the sake of a healthier web. If you must repost, please keep the attribution and the link to this article.
Python multiprocessing study notes
Original article: http://blog.csdn.net/zhanh1218/article/details/30248023