Multiprocessing in Python: Pool

This video is sponsored by Oxylabs. Oxylabs provides market-leading web scraping solutions for large-scale public data gathering. You can receive data in JSON or CSV format and pay only per successful request. At the moment, Oxylabs offers a free trial.

In this video, we will be continuing our treatment of the multiprocessing module in Python. Specifically, we will be taking a look at the "Pool" class and how we can use it to distribute tasks across the multiple processors on our machine.

"multiprocessing is a package that supports spawning processes using an API similar to the threading module. The multiprocessing package offers both local and remote concurrency, effectively side-stepping the Global Interpreter Lock by using subprocesses instead of threads. Due to this, the multiprocessing module allows the programmer to fully leverage multiple processors on a given machine. It runs on both Unix and Windows."

Software from this video:

For more videos on multiprocessing:

Do you like the development environment I'm using in this video? It's a customized version of vim that's enhanced for Python development. If you want to see how I set up my vim, I have a series on this here:

If you've found this video helpful and want to stay up-to-date with the latest videos posted on this channel, please subscribe:
Comments

I have looked at A LOT of python parallel videos and this is the first one that worked! I also added a print line to the serial function to confirm the output; they were consistent. Furthermore, it worked well and can be adapted easily for more advanced applications. Thank you!

davidmkahler

Finally a clear tutorial on this! Was finally able to get it to work thanks to you.

thetnlnk

You are the best teacher. Greetings from Azerbaijan

cavidanbagiri

This is a phenomenally well made video, well done sir on your presentation and clarity.

briananderson

Thanks for the video, bro. I looked at video after video and couldn't understand it as clearly as in this one!

ninjahkz

Nice work!! Thanks for this series of videos. Greetings from Chile

franciscon

I had a problem with real-time OCR and QR detection. With a single thread, saving the images and processing the OCR and QR took long enough to freeze the screen for about 0.3 seconds. So I used multiprocessing: two processes, one to capture images and one to process the QR/OCR, with one shared list to put and read numpy arrays (the ROIs). It works better. Thank you very much.

buffaloofm
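
A hedged sketch of that capture/process split; the function names and the use of a Queue (rather than a shared list) are illustrative.

import multiprocessing

def capture(queue):
    # Producer: stand-in for grabbing frames (ROIs) and handing them to the worker.
    for frame_id in range(5):
        queue.put(f"frame-{frame_id}")
    queue.put(None)                      # sentinel: no more frames

def process_frames(queue):
    # Consumer: run OCR/QR detection on each frame as it arrives.
    while True:
        frame = queue.get()
        if frame is None:
            break
        print("processing", frame)

if __name__ == "__main__":
    q = multiprocessing.Queue()
    producer = multiprocessing.Process(target=capture, args=(q,))
    consumer = multiprocessing.Process(target=process_frames, args=(q,))
    producer.start(); consumer.start()
    producer.join(); consumer.join()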

Once again! Very interesting video! Thank you very much

Optisoins

Great example! I faced almost the same situation, where I expected to speed up my execution time using multiprocessing but got a longer run time instead! Thanks.

ashkanajrian
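
That slowdown usually happens when each task is too cheap to justify the cost of pickling it and shipping it to a worker process. An illustrative sketch (the function and sizes are made up):

import multiprocessing
import time

def tiny_task(x):
    return x + 1                          # far too cheap to be worth dispatching

if __name__ == "__main__":
    data = list(range(100000))

    start = time.perf_counter()
    serial = [tiny_task(x) for x in data]
    print("serial:", time.perf_counter() - start)

    start = time.perf_counter()
    with multiprocessing.Pool() as p:
        parallel = p.map(tiny_task, data) # often slower here, due to IPC overhead
    print("pool:  ", time.perf_counter() - start)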

Excellent video, greetings from Germany

schogaia

It is worth mentioning that it isn't possible to call p.join() before calling p.close(). At least it isn't possible according to the official Python documentation.

talbarak

What about compared to threading?
Which is faster, and what's the difference?

novianindy
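
Roughly: threads share memory and are cheap to start, but for CPU-bound Python code they are serialized by the GIL, while processes side-step the GIL at the cost of extra startup and pickling overhead. An illustrative comparison sketch (workload sizes are made up):

import time
from multiprocessing import Pool
from multiprocessing.pool import ThreadPool

def cpu_bound(n):
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    work = [200_000] * 8

    start = time.perf_counter()
    with ThreadPool(4) as tp:
        tp.map(cpu_bound, work)           # threads: the GIL keeps this mostly serial
    print("threads:  ", time.perf_counter() - start)

    start = time.perf_counter()
    with Pool(4) as pp:
        pp.map(cpu_bound, work)           # processes: work spreads across cores
    print("processes:", time.perf_counter() - start)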

Excellent. I'm trying to implement this along with a timeout that will kill processes that run longer than a set time. If anyone knows of a video that shows how to do that, please pass it on.

ighsight
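
Pool has no built-in way to kill a task that overruns; one common approach, sketched here with made-up names, uses a bare Process with a join timeout and terminate():

import multiprocessing
import time

def slow_task():
    time.sleep(10)                   # stand-in for work that may run too long

if __name__ == "__main__":
    p = multiprocessing.Process(target=slow_task)
    p.start()
    p.join(timeout=2)                # wait at most 2 seconds
    if p.is_alive():
        p.terminate()                # kill the process that exceeded the limit
        p.join()
        print("task timed out and was terminated")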

Under what conditions do you recommend using Pool as opposed to Process?

chuanjiang
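
A rough rule of thumb: reach for Process when you have a few long-lived, dissimilar jobs to manage by hand, and for Pool when you want to map one function over many inputs and collect the results. A sketch of the same work in both styles (note the Process version discards the return values):

import multiprocessing

def sum_square(n):
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    numbers = [10_000, 20_000, 30_000, 40_000]

    # Process: you create, start, and join each worker yourself.
    procs = [multiprocessing.Process(target=sum_square, args=(n,)) for n in numbers]
    for proc in procs:
        proc.start()
    for proc in procs:
        proc.join()

    # Pool: a fixed set of workers is reused and results come back directly.
    with multiprocessing.Pool(4) as pool:
        results = pool.map(sum_square, numbers)
    print(results)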

How do you explicitly limit the number of processes? For example, I want to run only 3 processes at a time.

HeadphoneYT
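
The Pool constructor takes the number of worker processes, so capping it at 3 looks like this (minimal sketch):

import multiprocessing

def sum_square(n):
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with multiprocessing.Pool(processes=3) as p:   # at most 3 workers run at a time
        print(p.map(sum_square, range(10)))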

The code at 5:00 gives me this output:
[0, 0, 1, 5, 14]
Traceback (most recent call last):
  File "main.py", line 17, in <module>
    p.join()
  File "/usr/lib/python3.8/multiprocessing/pool.py", line 659, in join
    raise ValueError("Pool is still running")
ValueError: Pool is still running

socialrupt
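
That ValueError comes from calling p.join() while the pool is still open for new work; per the multiprocessing documentation, close() (or terminate()) has to come first. A minimal sketch of the fix:

import multiprocessing

def sum_square(n):
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    p = multiprocessing.Pool()
    result = p.map(sum_square, range(5))
    p.close()     # no more tasks will be submitted to the pool
    p.join()      # now it is legal to wait for the workers to exit
    print(result) # [0, 0, 1, 5, 14]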

Nice. One comment: you can use double the number of processes as you have cores, so with 16 cores you can run 32 processes.

phillipotey
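
Whether oversubscribing like that pays off depends on the workload; it mostly helps when tasks spend time blocked on I/O. A small sketch of sizing the pool from the core count (the 2x factor is the commenter's suggestion, not a general rule):

import os
import multiprocessing

if __name__ == "__main__":
    workers = (os.cpu_count() or 1) * 2        # twice the reported core count
    with multiprocessing.Pool(processes=workers) as p:
        print(p.map(len, ["a", "bb", "ccc"]))  # trivial placeholder workload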

Thanks for the great content. I'm really learning a lot from these videos. I have a question: is there a smart way to give the Pool class multiple arguments? For example, in this video the function 'sum_square' takes 1 integer as an argument, and to execute the computation in parallel we make a list of integers [numbers] and use 'p.map(sum_square, numbers)'. So the map function takes the function to parallelize and then a list of arguments. But for a function that takes multiple arguments rather than only one, let's say 2 integers, how do you design the pool multiprocessing?

박동연-tw
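
One answer: Pool.starmap accepts an iterable of argument tuples and unpacks each one (functools.partial is another option when some arguments are fixed). A minimal sketch with a made-up two-argument function:

import multiprocessing

def add_and_square(a, b):
    return (a + b) ** 2

if __name__ == "__main__":
    pairs = [(1, 2), (3, 4), (5, 6)]
    with multiprocessing.Pool() as p:
        print(p.starmap(add_and_square, pairs))   # [9, 49, 121]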

Thank you so much for your demonstration! Very clear and helpful. Can I ask why we need the line "if __name__ == '__main__'"? Thank you!

summerxia
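
The guard matters because, with the "spawn" start method (the default on Windows and on recent macOS), every worker process re-imports the main module; without the guard, each child would try to create its own pool. A minimal sketch:

import multiprocessing

def work(x):
    return x * x

print("this line runs in every spawned child as well as in the parent")

if __name__ == "__main__":
    # Only the original parent process creates the pool.
    with multiprocessing.Pool(2) as p:
        print(p.map(work, [1, 2, 3]))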

How do you solve a broken process pool with multiprocessing?

utayasurian