Capture Changed Data in Azure Data Factory - Handling deletions in Incremental Loads - 3 simple ways

In this video, we look at three different but simple ways to perform incremental loading in Azure Data Factory. We also cover how to handle deletions during incremental loads, since this is a very common scenario. Copy Activities and Mapping Data Flows in Azure Data Factory are used to implement these examples. You will find the SQL code on my GitHub account.
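
A common way to drive incremental extraction of the kind shown here is a watermark (high-water-mark) column. The minimal T-SQL sketch below illustrates that pattern; the table and column names (dbo.Watermark, dbo.SourceTable, LastModifiedDate) are assumptions for illustration, not the exact script from the GitHub repo.

```sql
-- A watermark table stores the high-water mark reached by the previous load.
DECLARE @LastWatermark DATETIME2 =
    (SELECT WatermarkValue
     FROM   dbo.Watermark
     WHERE  TableName = 'SourceTable');

-- Source query for the Copy Activity: only rows changed since the last run.
SELECT *
FROM   dbo.SourceTable
WHERE  LastModifiedDate > @LastWatermark;

-- After a successful copy, advance the watermark for the next run.
UPDATE dbo.Watermark
SET    WatermarkValue = (SELECT MAX(LastModifiedDate) FROM dbo.SourceTable)
WHERE  TableName = 'SourceTable';
```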


00:00 - Intro
01:00 - Setting up the tables & inserting data
01:54 - Upsert using Dataflows
08:15 - Handling Deletions using Dataflows
13:30 - Testing Method 1
16:04 - Method 2 with Copy Activity and DataFlows
25:40 - Method 3 with Copy Activities and Stored Procedures
26:09 - Conclusion
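
Method 3 pairs Copy Activities with a stored procedure on the sink database. One common way to implement the upsert-plus-delete reconciliation in a stored procedure is a MERGE against a staged copy of the source; the sketch below assumes tables named dbo.StagingTable and dbo.TargetTable keyed on Id, and is not the exact script from the repo.

```sql
CREATE OR ALTER PROCEDURE dbo.usp_ReconcileTarget
AS
BEGIN
    MERGE dbo.TargetTable AS tgt
    USING dbo.StagingTable AS src
        ON tgt.Id = src.Id
    WHEN MATCHED THEN                        -- row exists in both: update in place
        UPDATE SET tgt.Name = src.Name,
                   tgt.LastModifiedDate = src.LastModifiedDate
    WHEN NOT MATCHED BY TARGET THEN          -- new row in source: insert
        INSERT (Id, Name, LastModifiedDate)
        VALUES (src.Id, src.Name, src.LastModifiedDate)
    WHEN NOT MATCHED BY SOURCE THEN          -- row removed from source: delete
        DELETE;
END;
```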
Comments

Wow! This is the best incremental load demo video I have found so far. Thank you!

d.b.

How do we manage deletes? If the source table is huge (around 200 GB), how do we identify whether some of its rows have been deleted before inserting into the target table? I don't want to delete the row in the target table, just set a flag against it. The target table is in a DW which should keep the history of all data. CDC is not an option. Thanks

gauravdevgan
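
One common pattern for the scenario described above (no CDC, history preserved, deletions only flagged) is to copy just the source key column into a staging table and then soft-delete whatever the target contains that the staged key set no longer does. A minimal T-SQL sketch follows; the table and column names (dbo.TargetTable, dbo.StagingKeys, Id, IsDeleted, DeletedDate) are illustrative assumptions.

```sql
-- Flag target rows whose key no longer appears in the staged source keys.
UPDATE tgt
SET    tgt.IsDeleted   = 1,
       tgt.DeletedDate = SYSUTCDATETIME()
FROM   dbo.TargetTable AS tgt
WHERE  tgt.IsDeleted = 0
  AND  NOT EXISTS (SELECT 1
                   FROM   dbo.StagingKeys AS src
                   WHERE  src.Id = tgt.Id);
```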

Can we combine the upsert and the deletion in a single data flow?

AlejandraDuarte