Microsoft Fabric: Ingesting 5GB into a Bronze Lakehouse using Data Factory - Part 3

Microsoft Fabric End to End Demo - Part 3 - Ingesting 5GB into a Bronze Lakehouse using Data Factory. In this video we'll see how we can quickly ingest ~5GB of data from an unauthenticated HTTP data source into #OneLake using #DataFactory in #MicrosoftFabric. We'll see the distinction between Tables and Files in a Fabric #Lakehouse and look at how we can preview data in the Lakehouse explorer.

00:00 Intro
00:15 Dataset recap
01:25 Workspace and pipeline artifacts
01:57 Pipeline UI layout
02:21 Copy data activity options
03:07 Configure copy data activity source
05:00 Configure copy data activity destination
06:21 Add dynamic content for destination Lakehouse filepath
08:21 Copy Data activity additional settings
09:00 Manually trigger pipeline
09:21 Alternative parameterized pipeline
11:38 Reviewing pipeline run details
12:10 Default workspace artifacts
13:04 Viewing Lakehouse Files
13:46 Roundup and outro
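
For anyone who wants to experiment outside Data Factory, below is a minimal notebook-style sketch of the same ingestion. It is an illustration, not the method shown in the video: the source URL, file name, and parameter name are placeholders, and it assumes you are running in a Fabric notebook with a default Lakehouse attached, so its Files area is mounted at /lakehouse/default/Files.

import shutil
import urllib.request
from pathlib import Path

# Placeholder for the unauthenticated HTTP source used in the video (hypothetical URL).
SOURCE_URL = "https://example.com/price-paid-data/pp-complete.csv"

# Parameterised destination path, mirroring the dynamic content idea at 06:21.
# In the pipeline this would be built with an expression such as
# @concat('raw/', pipeline().parameters.FileName) - the parameter name is hypothetical.
file_name = "pp-complete.csv"
destination = Path("/lakehouse/default/Files/raw") / file_name
destination.parent.mkdir(parents=True, exist_ok=True)

# Stream the response straight to the Lakehouse Files area so the ~5GB payload
# never has to be held fully in memory.
with urllib.request.urlopen(SOURCE_URL) as response, open(destination, "wb") as out_file:
    shutil.copyfileobj(response, out_file)

print(f"Wrote {destination} ({destination.stat().st_size / 1e9:.2f} GB)")

The Copy data activity in the video does the same transfer without any code and runs it on Fabric's compute, which is why the series uses Data Factory for this Bronze ingestion step.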

Useful links:

Series contents:

If you want to learn more about Fabric, take a look at some of our other content:

#Microsoft #PowerBI #MicrosoftFabric #lakehouse #datalake #onelake #data #datafactory #datapipeline #ai #analytics #medallion #bronze #silver #gold #projectplanning #hmrc #dataingestion
Comments

Thank you for watching! If you enjoyed this episode, please hit like 👍, subscribe, and turn notifications on 🔔. It helps us more than you know. 🙏

endjin

Love this series! How many more episodes can we expect? :-)

oskarlindberg

Hi,
My question is about ingesting the data at 08:10: although the file name ends with .csv, the file format specified there is Binary. Would we get .csv files in the Data Lake, or binary files?

Abdullahakbarshafi

Great series! What naming convention are you using in the full version of the solution? I noticed the LH is prefixed with HPA.

applicitaaccount

Why do you store the Bronze layer data in the Files folder rather than populating the Tables instead? Or do you do both?

Where I'm currently working, we ingest to a table, not to a file.

GabrielSantos-quso

Do you mean loading the Bronze layer data into Bronze_Lakehouse, then transforming it and loading it into a new Silver_Lakehouse? Do we need multiple lakehouses, or can we manage with a single lakehouse?

moneshsutar

Hi. Great video - however, I'm stumbling on the first block. I noticed you're using basic authentication, and anonymous doesn't work. Can you provide further instructions here, please?

sbining

So clean and easy to follow. Can I get a copy of your slides, please?

robmays

Grrr, trying to find the URL you are using for the price paid data?

robmays