5. kpmg pyspark interview question & answer | databricks scenario based interview question & answer
#Databricks #PysparkInterviewQuestions #deltalake
Azure Databricks #spark #pyspark #azuredatabricks #azure
In this video, I discussed KPMG PySpark scenario-based interview questions and answers.
PySpark advanced interview questions and answers
Databricks interview questions and answers
KPMG PySpark interview questions and answers
Create dataframes:
======================================================
# Employee salary info
data1 = [(100, "Raj", None, 1, "01-04-23", 50000),
         (200, "Joanne", 100, 1, "01-04-23", 4000),
         (200, "Joanne", 100, 1, "13-04-23", 4500),
         (200, "Joanne", 100, 1, "14-04-23", 4020)]
schema1 = ["EmpId", "EmpName", "Mgrid", "deptid", "salarydt", "salary"]
df_salary = spark.createDataFrame(data1, schema1)
display(df_salary)

# Department dataframe
data2 = [(1, "IT"),
         (2, "HR")]
schema2 = ["deptid", "deptname"]
df_dept = spark.createDataFrame(data2, schema2)
display(df_dept)
-----------------------------------------------------------------------------------------------------------------------
Only fragments of the solution code survive in this description: an intermediate dataframe df, a commented-out df1, a self-join (aliases a and b) whose select includes col('b.EmpName').alias('ManagerName'), col('a.EmpName') and col('a.Newsaldt') to build df2, and a final dataframe df3. A possible reconstruction is sketched below.
-----------------------------------------------------------------------------------------------------------------------
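The full transformations are not captured here, so this is a minimal sketch, assuming the scenario asks for each employee's latest salary record together with the manager name and department name. The names df, df1, df2, df3, Newsaldt and the a/b aliases follow the fragments above; the date format, the "latest salary" grouping, and the join strategy are my assumptions, not necessarily the exact code shown in the video.

from pyspark.sql.functions import col, to_date, max as spark_max

# Assumed step 1: convert the salary date string (dd-MM-yy) into a proper date column.
df = df_salary.withColumn("Newsaldt", to_date(col("salarydt"), "dd-MM-yy"))

# Assumed step 2: keep only the latest salary row per employee.
latest = df.groupBy("EmpId").agg(spark_max("Newsaldt").alias("Newsaldt"))
df1 = df.join(latest, on=["EmpId", "Newsaldt"], how="inner")

# Assumed step 3: self-join on Mgrid to resolve the manager's name
# (aliases a and b, matching the surviving select fragment).
a = df1.alias("a")
b = df_salary.select("EmpId", "EmpName").distinct().alias("b")
df2 = a.join(b, col("a.Mgrid") == col("b.EmpId"), "left").select(
    col("b.EmpName").alias("ManagerName"),
    col("a.EmpName"),
    col("a.Newsaldt"),
    col("a.salary"),
    col("a.deptid"),
)
display(df2)

# Assumed step 4: join with the department dataframe to add the department name.
df3 = df2.join(df_dept, on="deptid", how="left")
display(df3)

With the sample data above, this sketch would return Raj's single row (no manager) and Joanne's 14-04-23 row (salary 4020, ManagerName Raj), both in the IT department.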
============================================================
Learn PySpark, an interface for Apache Spark in Python. PySpark is often used for large-scale data processing and machine learning.
Azure data factory tutorial playlist:
ADF interview question & answer:
1. pyspark introduction | pyspark tutorial for beginners | pyspark tutorial for data engineers:
2. what is dataframe in pyspark | dataframe in azure databricks | pyspark tutorial for data engineer:
3. How to read write csv file in PySpark | Databricks Tutorial | pyspark tutorial for data engineer:
4. Different types of write modes in Dataframe using PySpark | pyspark tutorial for data engineers:
5. read data from parquet file in pyspark | write data to parquet file in pyspark:
6. datatypes in PySpark | pyspark data types | pyspark tutorial for beginners:
7. how to define the schema in pyspark | structtype & structfield in pyspark | Pyspark tutorial:
8. how to read CSV file using PySpark | How to read csv file with schema option in pyspark:
9. read json file in pyspark | read nested json file in pyspark | read multiline json file:
10. add, modify, rename and drop columns in dataframe | withColumn and withColumnRenamed in pyspark:
11. filter in pyspark | how to filter dataframe using like operator | like in pyspark:
12. startswith in pyspark | endswith in pyspark | contains in pyspark | pyspark tutorial:
13. isin in pyspark and not isin in pyspark | in and not in in pyspark | pyspark tutorial:
14. select in PySpark | alias in pyspark | azure Databricks #spark #pyspark #azuredatabricks #azure
15. when in pyspark | otherwise in pyspark | alias in pyspark | case statement in pyspark:
16. Null handling in pySpark DataFrame | isNull function in pyspark | isNotNull function in pyspark:
17. fill() & fillna() functions in PySpark | how to replace null values in pyspark | Azure Databrick:
18. GroupBy function in PySpark | agg function in pyspark | aggregate function in pyspark:
19. count function in pyspark | countDistinct function in pyspark | pyspark tutorial for beginners:
20. orderBy in pyspark | sort in pyspark | difference between orderby and sort in pyspark:
21. distinct and dropduplicates in pyspark | how to remove duplicate in pyspark | pyspark tutorial: