Get Data Into Databricks - Simple ETL Pipeline

In this short instructional video, you will learn how to get data from cloud storage and build a simple ETL pipeline.

Get started with a Free Trial!

Comments

Solid demo for an intro to data engineering!

nicky_rads

Very clear and quick tutorial. Well done, thanks!

julius

Can you provide the data file or source shown in this video for practice?

rabish

Hi, where can I get the code you are showing here?

ongbak

Thanks for the demo. Do you all have a link to the slide deck and the data set please?

sumantra_sarkar

You have not appended any metadata to the bronze layer, such as when each row was ingested or which file it came from.
The bronze layer should hold all historical data, no?
And what should be done at the silver layer so that only unprocessed data is promoted to the silver table?

TheDataArchitect
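The metadata question above describes a common medallion-architecture pattern. Below is a minimal plain-Python sketch of the idea; the function and column names (`ingest_to_bronze`, `promote_to_silver`, `_ingest_time`, `_source_file`) are illustrative and not from the video. On Databricks you would typically do this in PySpark with `current_timestamp()` and the file-path metadata column, or use Auto Loader, which tracks already-processed files for you.

```python
from datetime import datetime, timezone

# Plain-Python sketch of the pattern the comment asks about: bronze keeps raw
# rows plus ingestion metadata and full history; silver only picks up rows
# ingested after the last processed watermark.

def ingest_to_bronze(bronze, rows, source_file, ingest_time=None):
    """Append raw rows plus ingestion metadata; bronze keeps the full history."""
    ts = ingest_time or datetime.now(timezone.utc)
    for row in rows:
        bronze.append({**row, "_ingest_time": ts, "_source_file": source_file})
    return bronze

def promote_to_silver(bronze, silver, watermark):
    """Promote only bronze rows ingested after the last watermark."""
    new_rows = [r for r in bronze if r["_ingest_time"] > watermark]
    for r in new_rows:
        # Strip the bookkeeping columns on the way into silver.
        silver.append({k: v for k, v in r.items() if not k.startswith("_")})
    new_watermark = max((r["_ingest_time"] for r in bronze), default=watermark)
    return silver, new_watermark
```

Each silver run re-reads bronze but touches only rows newer than the stored watermark, so bronze history is preserved while silver stays incremental.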

Hello, I couldn't follow the video because of the Jupyter notebook. What do you recommend I use in order to replicate what you did in this video? Thank you.

esteban-alvino

Strange: 1/ Bronze is described as loading data from blob storage, but the path is from S3? Am I missing something here?

LearnWithDummy

In the video, the orders/spend data is exported as CSV files. Should source OLTP systems export data this way? Is it more practical than other methods (JDBC, etc.)?

omer_f_ist

What about on-prem data and IoT data? Does DBX have ingestion capabilities for those?

UntouchedPerspectives

Nice. Is the notebook available to download and try?

tpvbghh

Is this the recommended way of doing ETL with Databricks? I thought Delta Live Tables were the recommended approach now.

effrey

So what is the challenge here? A 12-year-old could set this up; it is basically just organizing some tasks in sequential order.

peterko