Working with Nested JSON Using Spark | Parsing Nested JSON File in Spark | Hadoop Training Videos #2

Hello and welcome to the Big Data and Hadoop tutorial series powered by ACADGILD. In the previous session of the Hadoop tutorial series, we learned about Hive and Spark integration. In this video, we will learn how to work with nested JSON using Spark and walk through the process of parsing nested JSON in Spark.
As structured data is much easier to query, this tutorial shows an approach to converting nested JSON files, which are semi-structured data, into a tabular format; a sketch of the idea is given below. Kindly check out the execution part in the video.
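A minimal sketch of that approach, assuming a hypothetical input file people.json with an address struct and a phones array (the actual dataset used in the video may differ):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, explode}

object NestedJsonDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("NestedJsonDemo")
      .master("local[*]")
      .getOrCreate()

    // Spark infers the nested schema on read; multiLine is needed
    // when a single JSON record spans several lines.
    val raw = spark.read.option("multiLine", "true").json("people.json")
    raw.printSchema()

    // Flatten to a tabular shape: dot notation reaches into structs,
    // explode unrolls each array element onto its own row.
    val flat = raw.select(
      col("name"),
      col("address.city").alias("city"),
      col("address.zip").alias("zip"),
      explode(col("phones")).alias("phone")
    )
    flat.show()
  }
}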
Go through the complete video to learn how to work with nested JSON using Spark and how to parse nested JSON files, and become a data scientist by enrolling in the course.
Please like and share the video, give your feedback, and subscribe to the channel for more tutorial videos.
#bigdatatutorials, #hadooptutorials, #nestedjson, #hadoop, #bigdata
For more updates on courses and tips follow us on:
Comments

Could you please demonstrate how to retain the decimal scale when we write the data frame in JSON format? E.g., one of my columns in the df has a value of 12.00; when I write this df into a JSON file with df.write.json("user/my path/"), the JSON file written in this path will have "column":12.0 instead of 12.00.

nainularabsm
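A note on this question, as a hedged sketch rather than anything shown in the video: JSON has no fixed-scale numeric type, so 12.00 and 12.0 are the same number once serialized. Keeping the trailing zeros means writing the field as a string, for example by casting through decimal(10,2); the column name amount and the output path are illustrative:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

val df = Seq(12.00, 3.5).toDF("amount")

// Casting to decimal(10,2) fixes the scale, and casting on to string
// preserves it in the output: "amount":"12.00" (a JSON string, not a number).
val fixed = df.withColumn("amount", col("amount").cast("decimal(10,2)").cast("string"))
fixed.write.json("output/amounts")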

One scenario: suppose I need to create an application that loads multiple files (CSV) into a data frame, and if a file's structure is mismatched with our defined structure, we need to redirect those files into some error folder and load only the files with the correct structure. How would we achieve this in Spark?
Could you please help?

Raghav
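One possible approach, sketched under stated assumptions (the incoming/ and errors/ paths and the schema are illustrative, and this catches type and column-count mismatches rather than header-name differences): read each file with the expected schema in FAILFAST mode and move the ones that fail to parse:

import org.apache.hadoop.fs.{FileSystem, Path}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types.{StructType, StructField, IntegerType, StringType, DoubleType}
import scala.util.{Failure, Success, Try}

val spark = SparkSession.builder().master("local[*]").getOrCreate()
val fs = FileSystem.get(spark.sparkContext.hadoopConfiguration)

// The structure every incoming CSV must match.
val expected = StructType(Seq(
  StructField("id", IntegerType),
  StructField("name", StringType),
  StructField("amount", DoubleType)
))

fs.listStatus(new Path("incoming/")).map(_.getPath).foreach { file =>
  // FAILFAST makes Spark throw on any row that does not fit the schema;
  // count() forces a full scan so parse errors actually surface here.
  Try {
    spark.read
      .option("header", "true")
      .option("mode", "FAILFAST")
      .schema(expected)
      .csv(file.toString)
      .count()
  } match {
    case Success(_) => // well-formed file: load it (e.g., append to a table)
    case Failure(_) => fs.rename(file, new Path("errors/" + file.getName))
  }
}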

Can this be done using Spark SQL (without using DataFrame functions)?

big-bang-movies
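It can, once the JSON is registered as a temporary view: nested structs are reachable with dot notation and arrays with LATERAL VIEW explode, all in plain SQL. A sketch with illustrative file and field names:

val spark = org.apache.spark.sql.SparkSession.builder().master("local[*]").getOrCreate()

spark.read.option("multiLine", "true").json("people.json").createOrReplaceTempView("people")

// Pure Spark SQL: dot notation for structs, LATERAL VIEW explode for arrays.
spark.sql("""
  SELECT name,
         address.city AS city,
         phone
  FROM people
  LATERAL VIEW explode(phones) p AS phone
""").show()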

I am facing a problem: I want to mask (encrypt) nested JSON values using Spark and Scala. Does anyone have ideas or suggestions?

pradeepp
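One way to do this (a sketch, not a confirmed recipe) is to rebuild the struct with the sensitive leaf replaced, here by hashing it with the built-in sha2; Column.withField needs Spark 3.1+, and the field names are illustrative. For reversible encryption you would swap sha2 for a UDF wrapping a real cipher:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, sha2, struct}

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

val df = Seq(("alice", "10001")).toDF("name", "zip")
  .select(col("name"), struct(col("zip")).alias("address"))

// Replace address.zip with its SHA-256 hash; other struct fields stay intact.
val masked = df.withColumn("address",
  col("address").withField("zip", sha2(col("address.zip"), 256)))
masked.show(false)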