Spark Interview Question | Scenario Based | Multi Delimiter | Using Spark with Scala | LearntoSpark

In this video, we will learn how to handle a multi-delimited file using Spark with Scala. We have already seen this scenario handled in PySpark. Hope this video is useful for Spark interview preparation.
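For Spark versions before 3.0, whose CSV reader accepts only a single-character delimiter, the usual approach is to read the file as plain text and split each line on the multi-character delimiter yourself. Below is a minimal sketch of that split logic; the `~|` delimiter and the sample rows are hypothetical stand-ins for the file contents, and in Spark you would apply the same function inside a `map` over `spark.read.textFile(...)`:

```scala
import java.util.regex.Pattern

object MultiDelimiterSplit {
  // Split one line on a multi-character delimiter.
  // Pattern.quote escapes regex metacharacters such as '|',
  // so "~|" is matched literally rather than as a regex.
  def splitLine(line: String, delimiter: String): Array[String] =
    line.split(Pattern.quote(delimiter))

  def main(args: Array[String]): Unit = {
    // Sample rows standing in for the file contents (hypothetical data).
    val lines = Seq("1~|John~|30", "2~|Jane~|25")
    val rows = lines.map(splitLine(_, "~|"))
    rows.foreach(r => println(r.mkString(",")))
  }
}
```

Quoting the delimiter matters here: `|` is a regex alternation character, so an unquoted `line.split("~|")` would split on `~` *or* the empty string, not on the literal two-character sequence.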

Blog link to learn more on Spark:

Linkedin profile:

FB page:

Github:
Comments

It's really very helpful. Keep doing these scenario-based Spark Scala interview questions. Thanks!

deepakkumarmuduli

Spark 3.0 supports multi-character delimiters; versions prior to 3.0 support only a single character as the delimiter.
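As this comment notes, Spark 3.0 added support for multi-character `sep` values in the CSV reader, so the text-and-split pre-processing step is no longer needed there. A hedged sketch of the 3.0+ approach (the file path and header option are hypothetical, and a live `SparkSession` named `spark` is assumed):

```
// Spark 3.0+ only: the CSV reader accepts a multi-character separator.
val df = spark.read
  .option("sep", "~|")       // multi-character delimiter (Spark 3.0+)
  .option("header", "true")  // assuming the file has a header row
  .csv("/path/to/multi_delim_file.txt")

df.show()
```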

vigneshwarank

Hi Azarudeen, I have done this example, but when I split based on \t, the DataFrame output displays \t as a literal string. In your DataFrame output, \t was not displayed as a string. Could you please help me understand the difference?

heenagirdher

You could have directly used ~| as the delimiter in the options itself, no? Why did you first change it to a tab and then use tab as the delimiter in the options?

sachinlidhu

Can you please provide the link to how you handled this in PySpark?

neetusinghthakur