Multiprocessing in Python: Process Communication

In this video, we will be continuing our treatment of the multiprocessing module in Python. Specifically, we will be taking a look at how to use the Queue class in multiprocessing to communicate among different processes being run on different processors.

"multiprocessing is a package that supports spawning processes using an API similar to the threading module. The multiprocessing package offers both local and remote concurrency, effectively side-stepping the Global Interpreter Lock by using subprocesses instead of threads. Due to this, the multiprocessing module allows the programmer to fully leverage multiple processors on a given machine. It runs on both Unix and Windows."

Software from this video:

For more videos on multiprocessing:

Do you like the development environment I'm using in this video? It's a customized version of vim that's enhanced for Python development. If you want to see how I set up my vim, I have a series on this here:

If you've found this video helpful and want to stay up-to-date with the latest videos posted on this channel, please subscribe:
Comments

Thank you for preparing this tutorial! I've been reading the multiprocessing documentation for a month, but it hadn't quite sunk in until now.

davidgolembiowski

Thank you for this high-quality series of videos.

tryfonmichalopoulos

I feel like Neo (The Matrix). "Now I know inter-process communication with Python Multiprocessing". Awesome explanation and well articulated. I've got to check out the rest of your videos.

johnmayorga

Very informative series. Cheers, mate!

Rohitnansen

Vincent, superb intro series on Python MP. I have worked through it and am now applying it to real algorithms with big datasets. I would love to see the multiprocessing series developed further, e.g. beyond Queue(): how to use shared resources (e.g. a master pandas DataFrame) between Pool workers, where each pool worker writes its output to a single shared DataFrame (it is a very large DataFrame!), and also where each pool worker reads a large shared DataFrame as input. Or could you point to a resource that might outline this? Again, fantastic series...

markd
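
Worker processes do not share memory by default, so one common pattern for the DataFrame case asked about above is to give each Pool worker its own chunk and concatenate the returned pieces in the parent, rather than writing into a single shared DataFrame. A rough sketch, with a made-up process_chunk computation standing in for the real per-worker algorithm:

```python
import multiprocessing
import pandas as pd

def process_chunk(chunk):
    # Hypothetical per-worker computation: each worker receives its own
    # copy of a chunk and returns a small result DataFrame.
    return chunk.assign(total=chunk["a"] + chunk["b"])

if __name__ == "__main__":
    df = pd.DataFrame({"a": range(1000), "b": range(1000)})
    # Split the big DataFrame into chunks, one per task.
    chunks = [df.iloc[i:i + 250] for i in range(0, len(df), 250)]

    with multiprocessing.Pool(processes=4) as pool:
        results = pool.map(process_chunk, chunks)

    # Combine the per-worker outputs back into one DataFrame in the parent.
    combined = pd.concat(results, ignore_index=True)
    print(combined.head())
```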

Thank you for the video. I have a question: how can I know where exactly in the Queue the results from one function or the other are? In this example, the processes are running in parallel and are putting their results in an arbitrary order (i.e. whichever finishes first puts its result). Do I allow the functions to put their results in an arbitrary order and try to guess the order later, or do I control the order in which they put data into the queue? Or, alternatively, can I have more than one queue, one for each process? Thank you!

hugonogueira
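
One way to tell which function produced which result, as asked above, is to tag each item put on the queue with a label (a separate queue per process also works). A sketch with illustrative function names:

```python
import multiprocessing

def square(numbers, queue):
    for n in numbers:
        # Tag each result so the consumer knows which function produced it.
        queue.put(("square", n * n))

def cube(numbers, queue):
    for n in numbers:
        queue.put(("cube", n * n * n))

if __name__ == "__main__":
    numbers = range(5)
    queue = multiprocessing.Queue()
    p1 = multiprocessing.Process(target=square, args=(numbers, queue))
    p2 = multiprocessing.Process(target=cube, args=(numbers, queue))
    p1.start()
    p2.start()
    p1.join()
    p2.join()

    squares, cubes = [], []
    while not queue.empty():
        label, value = queue.get()
        (squares if label == "square" else cubes).append(value)
    print(squares, cubes)
```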

1) Would you recommend putting a lock around the math operation within one of the two functions? Or a lock within both functions even? Or do the functions just take copies of numbers so that it's not necessary?
2) If I want to execute the two functions periodically (process A once per second, process B once every 5 seconds), would you recommend setting up a timer inside a third function in another pool?
3) In this example, process B requires input from process A. How could I ensure that B always takes the latest value from A? Because with a queue it would go FIFO and take the earliest (outdated) one, right?

Thanks for the video!

ErikBlueGuitar
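
On question 1: each process receives its own copy of the arguments, so pure arithmetic on local values does not need a lock; locks only matter for genuinely shared resources. On question 3, one option is to drain the queue and keep only the most recent item. A sketch, with the helper name latest() made up for illustration:

```python
import time
import multiprocessing
from queue import Empty  # multiprocessing.Queue raises the standard queue.Empty

def latest(q, fallback=None):
    """Drain q and return the most recently put item, or fallback if q was empty."""
    newest = fallback
    while True:
        try:
            newest = q.get_nowait()  # keep overwriting until nothing is left
        except Empty:
            return newest

if __name__ == "__main__":
    q = multiprocessing.Queue()
    for value in (1, 2, 3):
        q.put(value)
    time.sleep(0.1)  # give the queue's feeder thread time to flush in this one-process demo
    print(latest(q))  # 3 -- older values are discarded
```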

Thanks for this tutorial. I have a question, please: if I want to process a frame in two functions, where one function performs human detection and returns the bounding-box coordinates of the detected bodies, and the second performs face detection and returns the bounding-box coordinates of the detected faces, can I use a queue to save the coordinates from the two functions and then use them later (when both functions finish) to draw the bounding boxes on the original frame?

MuhannadGhazal
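
Yes, a single tagged queue works for collecting both sets of boxes and drawing once both processes have finished. A sketch in which the two detectors are stand-ins that return hard-coded boxes instead of running real models:

```python
import multiprocessing

def detect_bodies(frame, queue):
    # Placeholder for a real human-detection model; returns fake boxes.
    boxes = [(10, 20, 100, 200)]
    queue.put(("bodies", boxes))

def detect_faces(frame, queue):
    # Placeholder for a real face-detection model; returns fake boxes.
    boxes = [(30, 40, 60, 80)]
    queue.put(("faces", boxes))

if __name__ == "__main__":
    frame = None  # stand-in for an image array
    queue = multiprocessing.Queue()
    procs = [
        multiprocessing.Process(target=detect_bodies, args=(frame, queue)),
        multiprocessing.Process(target=detect_faces, args=(frame, queue)),
    ]
    for p in procs:
        p.start()

    # Collect exactly one tagged result per detector, then join.
    results = dict(queue.get() for _ in procs)
    for p in procs:
        p.join()

    print(results["bodies"], results["faces"])
    # ...draw results["bodies"] and results["faces"] on the original frame here.
```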

Thank you for a great set of lessons. One area I am having difficulty with is setting processor affinity on Windows 10 from Python. Do you have any examples?
I would like to set up multiple processes to run data acquisition. These processes need to be time-deterministic, and to eliminate as much latency as possible I would like each process to run on a specific core with high priority.

philipjohnson
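
The standard library's os.sched_setaffinity() is Linux-only, but the third-party psutil package can set both affinity and priority on Windows. A rough sketch (psutil.HIGH_PRIORITY_CLASS exists only on Windows; on Linux you would pass a niceness value to nice() instead):

```python
import multiprocessing
import os
import psutil  # third-party: pip install psutil

def acquire_data(core):
    # Pin this worker to a single core and raise its priority (Windows).
    proc = psutil.Process(os.getpid())
    proc.cpu_affinity([core])
    proc.nice(psutil.HIGH_PRIORITY_CLASS)  # Windows-only constant
    # ...time-sensitive acquisition loop would go here...

if __name__ == "__main__":
    workers = [multiprocessing.Process(target=acquire_data, args=(core,))
               for core in (0, 1)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()
```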

Do you have an example using a 3D numpy.array?

chenwu
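
Passing a 3D numpy array through a Queue works like any other picklable object: the array is serialized in the child and reconstructed in the parent. A small sketch:

```python
import multiprocessing
import numpy as np

def make_volume(shape, queue):
    # The array is pickled when put on the queue and unpickled in the parent.
    volume = np.random.rand(*shape)
    queue.put(volume)

if __name__ == "__main__":
    queue = multiprocessing.Queue()
    p = multiprocessing.Process(target=make_volume, args=((4, 5, 6), queue))
    p.start()
    result = queue.get()   # get before join to avoid blocking on a full pipe
    p.join()
    print(result.shape)    # (4, 5, 6)
```

For very large arrays, multiprocessing.shared_memory (Python 3.8+) avoids the copy, at the cost of more bookkeeping.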

Is there any difference between "putting" stuff in a queue and appending stuff to a list?

chesshooligan
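
The practical difference: appending to an ordinary list in a child process only changes that child's copy, while a Queue actually moves data between processes. A small demonstration:

```python
import multiprocessing

def worker(shared_list, queue):
    # This mutates the child's own copy of the list, invisible to the parent.
    shared_list.append(42)
    # This actually crosses the process boundary.
    queue.put(42)

if __name__ == "__main__":
    items = []
    queue = multiprocessing.Queue()
    p = multiprocessing.Process(target=worker, args=(items, queue))
    p.start()
    p.join()
    print(items)        # [] -- the child's append is not visible here
    print(queue.get())  # 42 -- the queue delivers the value to the parent
```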

So if I have processes that just throw in stuff, I can't really be sure about the order it comes out in; that's what I take away after watching your video. It is well explained but still not helpful for my case! How can I then use the queue to exchange data between those processes?

HellesMammuthBS
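
Order is only guaranteed per producer: each process's items come out of a shared queue in the order that process put them in, but items from different processes interleave unpredictably. Giving each producer its own queue (or tagging items, as in the earlier sketch) keeps the streams separate. For example:

```python
import multiprocessing

def producer(name, out_queue):
    for i in range(3):
        out_queue.put(f"{name}-{i}")

if __name__ == "__main__":
    # One queue per producer keeps each producer's results separate and in order.
    q_a = multiprocessing.Queue()
    q_b = multiprocessing.Queue()
    p_a = multiprocessing.Process(target=producer, args=("A", q_a))
    p_b = multiprocessing.Process(target=producer, args=("B", q_b))
    p_a.start()
    p_b.start()

    results_a = [q_a.get() for _ in range(3)]
    results_b = [q_b.get() for _ in range(3)]
    p_a.join()
    p_b.join()

    print(results_a)  # ['A-0', 'A-1', 'A-2'] -- FIFO order per producer is preserved
    print(results_b)  # ['B-0', 'B-1', 'B-2']
```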

Hi @LucidProgramming

What if I want my second process to run only twice? The queue has 10 items, for example, but I still want to control it and run it only twice. I have a loop inside my second function, and I tried enclosing it in a while loop with a counter until it reaches 2, but that doesn't work. If I remove the while, the loop goes through all the queue items... How can I control it so it runs only twice? Sorry for the redundancy... :)
Cheers

transfer
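
One way to stop after a fixed number of items is to call get() a fixed number of times instead of looping until the queue is empty. A sketch:

```python
import multiprocessing

def consume_two(queue):
    # Pull exactly two items and then stop, leaving the rest in the queue.
    for _ in range(2):
        item = queue.get()   # blocks until an item is available
        print("consumed", item)

if __name__ == "__main__":
    queue = multiprocessing.Queue()
    for i in range(10):
        queue.put(i)

    consumer = multiprocessing.Process(target=consume_two, args=(queue,))
    consumer.start()
    consumer.join()

    # The remaining items are still in the queue and can be drained later.
    leftovers = []
    while not queue.empty():
        leftovers.append(queue.get())
    print("items left:", len(leftovers))  # 8
```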

Do you think you will continue this video series and cover multiprocessing in more depth?

darkcloud