Merging your data in a modern lakehouse data warehouse

Learn how you can move your data through different tiers using MERGE, either with PySpark or SQL, in Azure Synapse Analytics. Stijn Wynants walks you through the different steps.
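A typical Bronze-to-Silver MERGE in a Synapse Spark notebook can be sketched as follows. This is a minimal example, not the exact statement from the video; the table and column names (`silver.customers`, `bronze.customers_updates`, `CustomerId`) are hypothetical placeholders for your own lakehouse layout.

```sql
-- Upsert new Bronze rows into the Silver Delta table.
-- UPDATE SET * / INSERT * copy all columns by name (Delta Lake SQL).
MERGE INTO silver.customers AS tgt
USING bronze.customers_updates AS src
ON tgt.CustomerId = src.CustomerId
WHEN MATCHED THEN
  UPDATE SET *
WHEN NOT MATCHED THEN
  INSERT *
```

The same statement can be run via `spark.sql(...)` in a PySpark cell, or expressed with the `DeltaTable.merge` API if you prefer the programmatic route.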

Stijn Wynants

Table deletes, updates, and merges

Delta Lake Documentation
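The linked Delta Lake documentation also covers deletes. One common pattern for propagating deletes from Bronze to Silver is a MERGE with a delete clause; the sketch below assumes a hypothetical `IsDeleted` flag column in the Bronze feed and the same placeholder table names as above.

```sql
-- Delete Silver rows that the Bronze feed marks as deleted.
MERGE INTO silver.customers AS tgt
USING bronze.customers_updates AS src
ON tgt.CustomerId = src.CustomerId
WHEN MATCHED AND src.IsDeleted = true THEN
  DELETE
```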


*******************

Want to take your Power BI skills to the next level? We have training courses available to help you with your journey.



#AzureSynapse #merge #GuyInACube
Comments

Great Video. I'd love to see a future one addressing how to handle type 2 SCDs.

craigbryden

Great video! That's exactly what I was searching for earlier.
Thanks a lot!

arthurcsp

Thank you for the video, this is amazing! I've been following this playlist on "Building a modern data lakehouse in Azure Synapse". Up to this video it demonstrates Inserts and Updates, but what about Deletes from the Bronze layer to the Silver layer? Could you please make a video about deletes in Synapse SQL? Thanks.

BooranJohnsonRishka

Great vid, Thanks guys!

Let me ask you, is there any difference in the cost of execution when you run a notebook using PySpark or SQL?

gabrielmorais

Hi Stijn - how do you keep running the merge statement so that you get live data in your SILVER layer?

LandscapeInMotion

Question: when my data is merged with new data, do I need to rebuild the database in the workspace, or does the database in the workspace read the Delta files and pick up the changes? Please help me.

EmmanuelAguilar

It's not clear how this ties in with the first-time data load into Bronze.

bunnihilator

Why is Delta not supported in the Azure Synapse dedicated SQL pool?

piniki