Landing data with Dataflows Gen2 in Microsoft Fabric

Pipelines are cool in Microsoft Fabric, but how can we use Dataflows to get data into our Data Warehouse? Patrick shows another way to move your data with just a few clicks!


*******************

Want to take your Power BI skills to the next level? We have training courses available to help you with your journey.


#MicrosoftFabric #Dataflows #GuyInACube
Comments

Merge needs a nice search feature so you can quickly filter the columns to find the field you need.

TimothyGraupmann

Q: I wish y’all would start declaring in the description what licensing/capacity is required to replicate whatever it is you’re doing in your videos. Often I’ll see one of your tutorials, get excited about implementing a version in our own workspace, and then realize it requires Premium capacity, PPU, or some other licensing we don’t have (at least not at the moment).

alexanderbarclay

Great video. 04:20 Pipelines also has an option to append or replace data at the destination.

oguvtmu

Q: Great video, and now I see there’s a general data warehouse option, but then there’s a KQL database, and a Lakehouse with a SQL endpoint. What makes sense to architect beyond the bronze layer, where a Lakehouse intuitively makes sense? What are the advantages/disadvantages of each of them if a team handles a medallion structure end to end?

betrayedslinky

Before running a downstream ETL, I want to make sure the upstream ETL tables have complete information for the previous day/hour. How do I create dependencies in a pipeline so that I’m not running ETL on the same old data? Thanks!

llgiquz
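One common pattern for the dependency question above (not Fabric-specific) is a watermark/control table that each upstream ETL updates on completion; the downstream pipeline checks it first, e.g. with a Lookup activity feeding an If condition. A minimal Python sketch of that check, with hypothetical table names and watermark values:

```python
from datetime import datetime

def upstream_ready(watermarks, required_through):
    """Return True only if every upstream table's watermark
    covers the window the downstream ETL is about to process."""
    return all(wm >= required_through for wm in watermarks.values())

# Hypothetical values a pipeline Lookup activity might return
watermarks = {
    "sales_raw": datetime(2023, 6, 1, 23, 0),
    "customers_raw": datetime(2023, 6, 1, 22, 0),  # lagging behind
}
required = datetime(2023, 6, 1, 23, 0)

if upstream_ready(watermarks, required):
    print("run downstream ETL")
else:
    print("skip: upstream not complete")  # this branch fires here
```

The same check can be scheduled to retry, so the downstream run waits rather than silently processing stale data.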

I had mentioned that a Dataflow does not seem to be able to link to another query with load enabled (I dropped that info via the contact-us form on your website). Visually, it looks like a Gen2 Dataflow can do this.

Edit: a Power BI dataset would be an awesome destination, though I see it doesn’t do incremental refresh, so it seems Gen1 still has a use case.

MrSparkefrostie

Can you trigger a pipeline with an endpoint URL call, like you can with Power Automate? There, an HTTP request trigger gives you a URL to call to start the flow.

noahhadro
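On the HTTP-trigger question above: Fabric doesn’t expose a ready-made trigger URL the way Power Automate does, but pipelines can be started on demand through the Fabric REST API’s job-scheduler endpoint (this requires an Azure AD token; the GUIDs and token below are placeholders). A sketch that builds the call without sending it:

```python
import json
import urllib.request

def pipeline_run_request(workspace_id: str, pipeline_id: str, token: str):
    """Build (but don't send) the POST that starts a pipeline job on demand."""
    url = (
        f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}"
        f"/items/{pipeline_id}/jobs/instances?jobType=Pipeline"
    )
    return urllib.request.Request(
        url,
        data=json.dumps({}).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = pipeline_run_request("my-workspace-guid", "my-pipeline-guid", "aad-token")
print(req.full_url)
# To actually fire the run: urllib.request.urlopen(req)
```

Anything that can make an authenticated HTTP call (Power Automate included) can hit this endpoint, which gets you close to the trigger-by-URL behavior.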

Hi Patrick! Is Analysis Services supported with these? I think it was deprecated recently in the last Dataflows….

ghamper

This is the same as a datamart… what is the difference between Gen2 and a datamart?

prakash

What about upsert, and merging the information? Is it a good idea to delete and reload the information every time?

EmmanuelAguilar

Hi, how can I pass a parameter/variable to a dataflow from the pipeline?

Sevententh

I see that Dataflows Gen2 has the ability to mark columns as a "key", and I was hopeful that we would be able to do an upsert type of operation when writing to a target. For example, this could be a huge help when building a dimension: adding an "index" column as a surrogate key and then doing an upsert into the target. Instead, it currently just appends all records to the existing records, or wipes and replaces.

DavidZebrowitz
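On the upsert wish above: with only append or replace available at the destination, the usual workaround is to land the delta in a staging table and run a T-SQL MERGE in the warehouse afterwards. The semantics an upsert keyed on a surrogate key would need, as a small Python sketch with hypothetical column and row values:

```python
def upsert(target: dict, incoming: list, key: str) -> dict:
    """Merge incoming rows into target keyed on `key`:
    matching keys are updated, new keys are inserted,
    and untouched rows are left as-is."""
    for row in incoming:
        target[row[key]] = row
    return target

# Hypothetical dimension rows keyed on a surrogate key "cust_key"
dim = {1: {"cust_key": 1, "name": "Ada"},
       2: {"cust_key": 2, "name": "Bob"}}
new = [{"cust_key": 2, "name": "Robert"},  # update existing
       {"cust_key": 3, "name": "Cleo"}]    # insert new
dim = upsert(dim, new, "cust_key")
print(sorted(dim))  # -> [1, 2, 3]
```

This is exactly what MERGE expresses declaratively (WHEN MATCHED THEN UPDATE / WHEN NOT MATCHED THEN INSERT), so a post-load script activity in the pipeline can close the gap.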

Which is better, folks: Dataflows Gen2 or Informatica, when it comes to transformations?

Babayagaom

Q: Does Dataflow Gen2 use a live connection, or does it import data into the Power Query engine, perform transformations, and load back into the Lakehouse?

vijayvizzu

Q: Will it have the same risk of referenced queries running multiple times?

sergzador

Hi Patrick @guyinacube, can you please make a video about streaming dataflows and streaming datasets in Microsoft Fabric?

youssefmejri

Thanks for the beautiful content.
I saw that whenever I load data to the data warehouse, the data already in the DW gets deleted.
How is incremental loading done in Fabric?

quiosaevaristo
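On the incremental-loading question above: with a replace destination, one common workaround is the watermark pattern, where you store the high-water mark of the last successful load, filter the source down to rows newer than it (in Power Query, before the destination step), and append only that delta. The pattern, sketched in Python with hypothetical in-memory stand-ins for the source table and the stored watermark:

```python
# Hypothetical source rows with a last-modified column
source = [
    {"id": 1, "modified": "2023-06-01"},
    {"id": 2, "modified": "2023-06-02"},
    {"id": 3, "modified": "2023-06-03"},
]
watermark = "2023-06-01"  # value persisted from the previous run

# Load only rows newer than the watermark (the "delta")
delta = [r for r in source if r["modified"] > watermark]
print([r["id"] for r in delta])  # -> [2, 3]

# Advance and persist the watermark for the next run
watermark = max(r["modified"] for r in delta)
```

In a real pipeline the watermark would live in a small control table in the warehouse rather than in memory, and the destination would be set to append instead of replace.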

What happens to all the pipelines and dataflows I already have in Azure Data Factory? Can they be migrated?

joseangelmartinez

Can we set one data destination for all the tables that we load in a Dataflow Gen2? Is it possible?
Please help.

anjalisingh

Can Dataflows Gen2 handle high-volume data like Spark can? Or for large data, should I continue to use Spark?

derekwilliams