Creating External Table with Spark

In this video lecture we will learn how to create an external table in Hive using Apache Spark 2. We will also learn how to identify a Hive table's storage location. We will use impala-shell to check the Hive table; in the process we also use Impala's INVALIDATE METADATA and COMPUTE STATS commands.
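
The workflow described above can be sketched as follows. This is a minimal outline, not the video's exact code: the paths, database, and table names (`/user/cloudera/mydata`, `mydb.mytable`, `input.csv`) are hypothetical, and it assumes a Spark 2.x session with Hive support and access to the Hive metastore. In Spark, writing a table with an explicit `path` option makes it external (no `EXTERNAL` keyword is needed):

```scala
import org.apache.spark.sql.SparkSession

// Hive support is required so Spark can register tables in the Hive metastore.
val spark = SparkSession.builder()
  .appName("create-external-table")
  .enableHiveSupport()
  .getOrCreate()

// Read some source data (hypothetical CSV path).
val df = spark.read.option("header", "true").csv("/user/cloudera/input.csv")

// Supplying an explicit path makes the resulting table external:
// dropping the table later will not delete the files at this location.
df.write
  .option("path", "/user/cloudera/mydata")
  .saveAsTable("mydb.mytable")

// The table's location appears in the detailed table description.
spark.sql("DESCRIBE FORMATTED mydb.mytable").show(truncate = false)
```

After the write, the table can be checked from impala-shell; Impala caches metastore metadata, so it must be told about the new table before querying it:

```
-- in impala-shell
INVALIDATE METADATA mydb.mytable;
COMPUTE STATS mydb.mytable;
SELECT COUNT(*) FROM mydb.mytable;
```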

Books I Follow:

Apache Spark Books:

Scala Programming:

Hadoop Books:

Hive:

HBase:

Python Books:
Comments

You are doing a great job, sir... excellent playlist...

parthdayala

Thanks a lot for the excellent playlist :) It is really helpful :) Can you please make some videos on spark-submit and tuning?

SaimanoharBoidapu

Hi sir, since we are creating an external table here: what if we want to create the external table on an existing data folder location like /user/cloudera/mydata/, where the directory already contains multiple CSVs? In Hive we create an external table on an existing HDFS directory. In this example we have just created another location to save the DataFrame, and there is no EXTERNAL keyword used as in Hive/Impala.

funwithshreeraj

For an external table, we should store data in HBase... here we didn't create HBase.

AmitGupta-nyqy