parallelism vs concurrency

In this short I talk about the differences between parallelism and concurrency, using Go. #programming #golangprogramming
Comments

Concurrency is using multiple threads to accomplish multiple tasks that are usually independent of each other, e.g. using a goroutine to send an email verification after a checkout completes. The user shouldn't have to wait for that process to finish before getting the server response.

Parallelism is splitting a larger task into smaller pieces, then using concurrency to handle the smaller pieces across multiple threads. Take the Excel sheet example: say you split a 10000-row sheet into 2500-row chunks, have each thread handle its own chunk, and have the threads add their results to a shared, mutex-protected variable. The key difference here is that these 4 threads are essentially acting as a single block working on the same task, so all 4 must finish their individual pieces before the process can stop blocking and continue. This means that if 3 threads each take 1 second to complete and 1 thread takes 5 seconds (for some reason), the total time for this particular example would be ~5 seconds.

This video came across as word salad without really explaining anything. His example diagrams compared a blocking process (the vertical one) to a concurrent one (the horizontal one). This was just a bad video, unfortunately.

datguy

Analogy: cooking
Synchronous - you watch the water until it boils. You then get the spaghetti. You then put it in the water, etc.
Concurrent - As you wait for the water to boil, you do other tasks, such as chopping an onion.
Parallel - a professional kitchen with multiple chefs. Food can be cooked faster, but it's harder to know what's going on, and communication is MUCH slower.

travisSimon

I was understanding the difference, now I don't lol.

Salloom

My understanding is that concurrency performs different tasks at once by using multiple threads and switching back and forth between them, which is context switching. It does one part of task 1, then one part of task 2, then task 3, then repeats, blah blah. And it can all be done on one core.

With parallelism, you use multiple cores to execute instructions literally at the same time.

pepperdayjackpac

Simply put, parallelism is having multiple tasks run in parallel. Concurrency is more general: it can be achieved through parallelism or through other asynchronous mechanisms like coroutines, event loops, and so on.

j.r.r.tolkien

Someone mentioned cooking. I wanted to share my thoughts on that analogy.

If you're alone like me: wash the rice, turn on the cooker to boil it. Boiling is like a network call, an external process outside my control. Instead of just waiting for it to finish boiling, I could be doing something else, like chopping veggies or smoking.

Am I doing things in parallel? Nope. I'm switching from boiling to chopping instead of waiting.

If I had a girlfriend, once I start the boiling I could go smoke while, at the same time, my girlfriend chops the veggies. We're each doing a task at the same time.

masterchief

so parallelism is multiple threads. concurrency is still one thread?

sx.

How many technical words can I say without explaining anything?🤔

cristichifan

"You can paralyze the task" yep!

coocoobau

Imagine a single-core system. By definition, parallelism isn't possible.

We have a thread making a network call, but it might be slow due to latency. Instead of waiting, the OS can switch to processing some data that was already fetched.

This gives us the illusion of two things happening at the same time, but only one thing is actually happening. Once the data is fetched, the OS switches back to fetching more data and again uses the idle time to process data.

Start work 1, wait for w1 to complete. Meanwhile, process w1's data until w1 fetches more data.

This is one cycle. At any point, only one of the jobs is using the CPU.

When the data is fetched - that is, the response came back from the network call and the system needs to read it into memory - the original data-processing job cannot run at that moment. It will resume once the CPU goes idle after a new network call.


Now you got your first internship and upgraded to a 2-core system.
This means you can run 2 such cycles, with each cycle doing 2 jobs concurrently.

At any point, 2 jobs are running independently. If you get the cycle structure right, you can parallelize, because each cycle is independent and can run in its own memory space, whereas the two jobs within a cycle share the same space. In Go, they share memory by communicating: once job 1 gets the data and makes a new network call, it signals the other goroutine, saying hey, process this new data while we wait for the network call to finish.

Conclusion:

Concurrency: job 1 - wait - job 2 - data ready - resume job 1 - use idle time to process job 2

Once we have the above pattern:

Parallelism: running multiple instances of the above pattern simultaneously on a multi-core system.

masterchief

Everyone who is confused should just watch Rob Pike's talk, "Concurrency is not Parallelism".

zombiefacesupreme

Did you not just explain the same thing twice?

radi_dev

I love JS, but when we talk about this kind of shit I need to know what the memory looks like and how it's accessed.

sadunozer

Got it, got it.

Didn't get it, didn't get it.

saiphaneeshk.h.

Can I interpret this as a server handling multiple incoming requests in parallel, with each request being handled concurrently?

tranquangthang

Doesn’t parallelism depend on the number of cores??

abhishekbose