What is the fundamental difference between a queue and a pipe in Python's multiprocessing package?
In what scenarios should you choose one over the other? When is it advantageous to use Pipe()? When is it advantageous to use Queue()?
A Pipe() can only have two endpoints.
A Queue() can have multiple producers and consumers.
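As a minimal sketch of the two-endpoint model (the helper names `child` and `pipe_roundtrip` are mine, not from the benchmark code below): each call to Pipe() returns exactly two connection objects, and by default each end can both send and receive.

```python
from multiprocessing import Process, Pipe

def child(conn):
    # The child holds one of the two endpoints; Pipe() is duplex
    # by default, so this end can both recv and send.
    msg = conn.recv()
    conn.send(msg.upper())
    conn.close()

def pipe_roundtrip(text):
    parent_conn, child_conn = Pipe()  # exactly two endpoints
    p = Process(target=child, args=(child_conn,))
    p.start()
    parent_conn.send(text)   # parent -> child
    reply = parent_conn.recv()  # child -> parent
    p.join()
    return reply
```

A third process cannot simply join this conversation; sharing one end between two writers requires external locking, which is exactly what Queue() adds.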
When to use them
If you need more than two points to communicate, use a Queue().
If you need absolute performance, a Pipe() is much faster, because Queue() is built on top of Pipe().
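To illustrate the many-producers case that a single Pipe() cannot handle directly, here is a minimal sketch (the `producer` and `fan_in` names are illustrative, not from the original answer): several processes put onto one shared Queue() while the parent consumes.

```python
from multiprocessing import Process, Queue

def producer(q, items):
    for item in items:
        q.put(item)  # many producers can safely share one Queue()

def fan_in(chunks):
    q = Queue()
    procs = [Process(target=producer, args=(q, chunk)) for chunk in chunks]
    for p in procs:
        p.start()
    # The parent is the single consumer; arrival order across
    # producers is not guaranteed, so sort for a stable result.
    total = sum(len(chunk) for chunk in chunks)
    results = [q.get() for _ in range(total)]
    for p in procs:
        p.join()
    return sorted(results)
```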
Performance benchmarking
Let's assume you want to spawn two processes and send messages between them as quickly as possible. These are the timing results of a drag race between similar tests using Pipe() and Queue()... This is on a Thinkpad T61 running Ubuntu 11.10 and Python 2.7.2.
FYI, results for JoinableQueue() are thrown in as a bonus; JoinableQueue() accounts for tasks when queue.task_done() is called (it doesn't even know about the specific task, it just counts unfinished tasks in the queue), so that queue.join() knows the work is finished.
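The task_done()/join() counting described above can be demonstrated even in a single process; a minimal sketch (the `drain` helper is mine, not from the answer):

```python
from multiprocessing import JoinableQueue

def drain(n):
    jq = JoinableQueue()
    for i in range(n):
        jq.put(i)        # each put() increments the unfinished-task counter
    total = 0
    for _ in range(n):
        total += jq.get()
        jq.task_done()   # each task_done() decrements the counter
    jq.join()            # returns immediately: the counter is back to zero
    return total
```

If task_done() were called fewer times than put(), jq.join() would block forever, which is the bookkeeping overhead behind the slower JoinableQueue() timings below.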
The code for each is at the bottom of this answer…
mpenning@mpenning-T61:~$ python multi_pipe.py
Sending 10000 numbers to Pipe() took 0.0369849205017 seconds
Sending 100000 numbers to Pipe() took 0.328398942947 seconds
Sending 1000000 numbers to Pipe() took 3.17266988754 seconds
mpenning@mpenning-T61:~$ python multi_queue.py
Sending 10000 numbers to Queue() took 0.105256080627 seconds
Sending 100000 numbers to Queue() took 0.980564117432 seconds
Sending 1000000 numbers to Queue() took 10.1611330509 seconds
mpenning@mpenning-T61:~$ python multi_joinablequeue.py
Sending 10000 numbers to JoinableQueue() took 0.172781944275 seconds
Sending 100000 numbers to JoinableQueue() took 1.5714070797 seconds
Sending 1000000 numbers to JoinableQueue() took 15.8527247906 seconds
mpenning@mpenning-T61:~$
In summary, a Pipe() is about three times faster than a Queue(). Don't even think about a JoinableQueue() unless you really must have its benefits.
Bonus material 2
Multiprocessing introduces subtle changes in information flow that make debugging hard unless you know some shortcuts. For instance, you might have a script that works fine when indexing through a dictionary under many conditions, but infrequently fails with certain inputs.
Normally we get clues to the failure when the entire python process crashes; however, you don't get unsolicited crash tracebacks printed to the console when the multiprocessed function crashes. Tracking down unknown multiprocessing crashes is hard without a clue to what crashed the process.
The simplest way I have found to track down multiprocessing crash information is to wrap the entire multiprocessed function in a try / except and use traceback.print_exc():
import traceback

def run(self, args):
    try:
        # Insert stuff to be multiprocessed here
        return args[0]['that']
    except:
        print("FATAL: reader({0}) exited while multiprocessing".format(args))
        traceback.print_exc()
Now, when you find a crash, you see something like:
FATAL: reader([{'crash': 'this'}]) exited while multiprocessing
Traceback (most recent call last):
  File "foo.py", line 19, in __init__
    self.run(args)
  File "foo.py", line 46, in run
KeyError: 'that'
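A complementary check, not from the original answer: a child that dies with an unhandled exception leaves a nonzero Process.exitcode, which the parent can inspect after join() even when no traceback reached the console. A sketch with illustrative helper names (`crashy`, `exitcode_of`):

```python
from multiprocessing import Process

def crashy(args):
    return args[0]['that']  # raises KeyError when the key is missing

def exitcode_of(target, args):
    # Run the target in a child and report how it exited:
    # 0 means a clean return, nonzero means an unhandled exception.
    p = Process(target=target, args=(args,))
    p.start()
    p.join()
    return p.exitcode
```

Checking exitcode in the parent pairs well with the try/except wrapper above: the wrapper tells you *why* the child died, while the exitcode tells you *that* it died.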
Source code:
""" multi_pipe.py """ from multiprocessing import Process, Pipe import time def reader_proc(pipe): ## Read from the pipe; this will be spawned as a separate Process p_output, p_input = pipe p_input.close() # We are only reading while True: msg = p_output.recv() # Read from the output pipe and do nothing if msg=='DONE': break def writer(count, p_input): for ii in xrange(0, count): p_input.send(ii) # Write 'count' numbers into the input pipe p_input.send('DONE') if __name__=='__main__': for count in [10**4, 10**5, 10**6]: # Pipes are unidirectional with two endpoints: p_input ------> p_output p_output, p_input = Pipe() # writer() writes to p_input from _this_ process reader_p = Process(target=reader_proc, args=((p_output, p_input),)) reader_p.daemon = True reader_p.start() # Launch the reader process p_output.close() # We no longer need this part of the Pipe() _start = time.time() writer(count, p_input) # Send a lot of stuff to reader_proc() p_input.close() reader_p.join() print("Sending {0} numbers to Pipe() took {1} seconds".format(count, (time.time() - _start)))
""" multi_queue.py """ from multiprocessing import Process, Queue import time import sys def reader_proc(queue): ## Read from the queue; this will be spawned as a separate Process while True: msg = queue.get() # Read from the queue and do nothing if (msg == 'DONE'): break def writer(count, queue): ## Write to the queue for ii in range(0, count): queue.put(ii) # Write 'count' numbers into the queue queue.put('DONE') if __name__=='__main__': pqueue = Queue() # writer() writes to pqueue from _this_ process for count in [10**4, 10**5, 10**6]: ### reader_proc() reads from pqueue as a separate process reader_p = Process(target=reader_proc, args=((pqueue),)) reader_p.daemon = True reader_p.start() # Launch reader_proc() as a separate python process _start = time.time() writer(count, pqueue) # Send a lot of stuff to reader() reader_p.join() # Wait for the reader to finish print("Sending {0} numbers to Queue() took {1} seconds".format(count, (time.time() - _start)))
""" multi_joinablequeue.py """ from multiprocessing import Process, JoinableQueue import time def reader_proc(queue): ## Read from the queue; this will be spawned as a separate Process while True: msg = queue.get() # Read from the queue and do nothing queue.task_done() def writer(count, queue): for ii in xrange(0, count): queue.put(ii) # Write 'count' numbers into the queue if __name__=='__main__': for count in [10**4, 10**5, 10**6]: jqueue = JoinableQueue() # writer() writes to jqueue from _this_ process # reader_proc() reads from jqueue as a different process... reader_p = Process(target=reader_proc, args=((jqueue),)) reader_p.daemon = True reader_p.start() # Launch the reader process _start = time.time() writer(count, jqueue) # Send a lot of stuff to reader_proc() (in different process) jqueue.join() # Wait for the reader to finish print("Sending {0} numbers to JoinableQueue() took {1} seconds".format(count, (time.time() - _start)))