Processing 25GB of data in Spark | How many executors and how much memory per executor are required?

#pyspark #azuredataengineer #databricks #spark
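
A quick sketch of the sizing this video walks through, under the usual assumptions (128MB input partitions, a rule-of-thumb ~4x the partition size of memory per core, 4 cores per executor); your data and workload may call for different numbers:

data_size_gb = 25
partition_size_mb = 128                                        # typical HDFS/Parquet split size

num_partitions = (data_size_gb * 1024) // partition_size_mb    # 25600 / 128 = 200 partitions
cores_per_executor = 4                                         # common starting point, not a hard rule
num_executors = num_partitions // cores_per_executor           # 200 / 4 = 50 executors for one wave

memory_per_core_mb = 4 * partition_size_mb                     # ~512 MB per core (rule of thumb)
executor_memory_mb = cores_per_executor * memory_per_core_mb   # ~2 GB per executor, before overhead

print(num_partitions, num_executors, executor_memory_mb)       # 200 50 2048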

Use the link below to enroll for our free materials and other courses.

You can talk to me directly on Topmate using the link below:

Follow me on LinkedIn
-----------------------------------------------------------------------------
Clever Studies Official WhatsApp Group joining link:
--------------------------------------------------
Follow this link to join the 'Clever Studies' official Telegram channel:
--------------------------------------------------

PySpark by Naresh playlist:
--------------------------------------------------
Realtime Interview playlist:
--------------------------------------------------
Apache Spark playlist:
--------------------------------------------------
PySpark playlist:

Hello Viewers,

We, the ‘Clever Studies’ YouTube channel, are a group of experienced software professionals formed to fill a gap in the industry by providing free software tutorials, mock interviews, study materials, interview tips, and knowledge sharing from real-time working professionals, to help freshers, working professionals, and software aspirants get a job.

If you like our videos, please do subscribe and share within your circle.

Thank you!
Comments

Please make a video on PySpark unit testing.

shivamchandan

Is it that each core would take 4 * the partition size in memory?

shibhamalik

What if we have limited resources? What configuration would you recommend to process 25GB (16 cores and 32GB)?

kingoyster
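
One possible answer to the question above, as a rough sketch: with only 16 cores the 200 tasks simply run in several waves instead of one, and 32GB still leaves roughly 2GB per core, well above the ~512MB rule of thumb. The layout below assumes the 16 cores / 32GB are cluster totals and leaves headroom for the OS and driver; the exact executor count and sizes are illustrative, not a recommendation from the video:

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("sizing-sketch")
    .config("spark.executor.instances", "3")        # 3 executors x 5 cores = 15 of the 16 cores
    .config("spark.executor.cores", "5")
    .config("spark.executor.memory", "8g")          # 3 x 8 GB = 24 GB for executors
    .config("spark.executor.memoryOverhead", "1g")  # plus ~1 GB overhead per executor
    .getOrCreate()
)

# With ~15 task slots, the 200 tasks run in roughly 200 / 15 ≈ 14 waves.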

If the number of partitions is 200, and so is the number of cores required, then the core size is 128MB, right?

Then how, in the 3rd block, does the core size turn into 512MB, making the executor memory 4 * 512MB?

adityac
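
On the question above: the 128MB is the size of the partition a core reads, not the memory the core is given. The video's sizing (a rule of thumb, not something Spark enforces) reserves roughly 4x the partition size per core, because the data grows when deserialized and the task needs working space for shuffles and aggregations:

partition_size_mb = 128                                        # what one core reads per task
memory_per_core_mb = 4 * partition_size_mb                     # what one core is given: ~512 MB
cores_per_executor = 4
executor_memory_mb = cores_per_executor * memory_per_core_mb   # 4 * 512 MB = 2048 MB (~2 GB)
print(memory_per_core_mb, executor_memory_mb)                  # 512 2048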

Hi,

Does the same approach apply if we are working in Databricks?

nvhoukb

For example, if you assign 25 executors instead of 50, then each executor will have 8 cores and tasks will run in parallel (25 * 8). Then it will still take only 5 minutes to complete the job, so how is it 10 minutes? Can you please explain this point once again?

dvrycse
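
The commenter's arithmetic is sound if each of the 25 executors really gets 8 cores: the number of concurrent task slots is unchanged (200), so the job still finishes in about one wave. The ~10 minute figure only appears if the cores per executor stay at 4, which is presumably what the example in the video assumed. A rough wave-based estimate:

num_tasks = 200            # one task per 128MB partition
time_per_wave_min = 5      # assumed duration of one wave of tasks

for executors, cores in [(50, 4), (25, 8), (25, 4)]:
    slots = executors * cores
    waves = -(-num_tasks // slots)                 # ceiling division
    print(executors, cores, slots, waves * time_per_wave_min)

# 50 x 4 = 200 slots -> 1 wave  -> ~5 min
# 25 x 8 = 200 slots -> 1 wave  -> ~5 min
# 25 x 4 = 100 slots -> 2 waves -> ~10 min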

Sir, I want to join the Job Ready Program. How do I join? The link is not enabled. Please help.

kamatchiprabu

There are 200 cores in total. Each core will use one partition at a time, so it will use 128MB.
Each executor has 4 cores, so each executor requires 4 * 128MB, which is 512MB. Where does the extra 4x multiplier come from? 😊

shibhamalik
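
There are two different 4s in the calculation above, which is easy to miss: the comment's 4 * 128MB uses only the 4 cores per executor, while the video's sizing (again, a rule of thumb) first scales each core's memory to ~4x the partition size and then multiplies by the cores:

partition_size_mb = 128
memory_per_core_mb = 4 * partition_size_mb     # first 4: per-core headroom factor -> 512 MB
executor_memory_mb = 4 * memory_per_core_mb    # second 4: cores per executor -> 2048 MB (~2 GB)
print(memory_per_core_mb, executor_memory_mb)  # 512 2048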

In my company, the number of CPU cores per executor is 5 at minimum and 8 at maximum.

Fresh-shgc

What is the use of giving each core 512MB if the block size is 128MB?
Each block is processed on a single core, so if each block is 128MB, why should we give 512MB to each core?

There will be wastage of memory, am I right?

Please explain this.

Thanks

Amarjeet-fblk
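
On the memory-wastage question above: the 128MB block is the serialized, often compressed size on disk; once deserialized into JVM objects it typically takes several times more space, and the task also needs execution memory for shuffles, joins and aggregations. On top of that, only part of executor memory holds data: Spark keeps ~300MB reserved and splits the remainder via spark.memory.fraction (default 0.6) into execution and storage memory, so 512MB per core is headroom rather than waste. A small sketch of what is actually available:

executor_memory_mb = 2048                              # 4 cores x 512 MB from the video's sizing
reserved_mb = 300                                      # Spark's reserved memory
usable_mb = (executor_memory_mb - reserved_mb) * 0.6   # unified execution + storage pool
per_core_usable_mb = usable_mb / 4                     # what each of the 4 cores can actually use
print(round(usable_mb), round(per_core_usable_mb))     # 1049 262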