How to perform ETL Incremental Data Load using DLT | Data Load Tool | ETL | Python

In this video we will continue with the data load tool (dlt) library and explore how to perform an incremental data load. An incremental data load in ETL (Extract, Transform, Load) is the act of loading only new or changed data. With this approach we process minimal data, use fewer resources, and therefore spend less time. dlt implements this as the merge write disposition.
We keep the latest snapshot of each record in the data warehouse: we update an existing record in the dimension table or insert a new one. This is referred to as an upsert.
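The merge (upsert) behavior described above can be sketched in plain Python, independent of dlt. The table, column, and key names here are made up for illustration; dlt performs the equivalent step in the destination warehouse when a resource uses `write_disposition="merge"`.

```python
def upsert(table, incoming, key):
    """Merge incoming rows into a table: rows whose key already exists
    replace the stored snapshot (update); unseen keys are appended (insert)."""
    merged = {row[key]: row for row in table}   # index current snapshot by key
    for row in incoming:
        merged[row[key]] = row                  # latest snapshot wins
    return list(merged.values())


# Hypothetical dimension table and an incremental batch of changed/new rows.
warehouse = [
    {"id": 1, "name": "Alice", "city": "Oslo"},
    {"id": 2, "name": "Bob",   "city": "Pune"},
]
incoming = [
    {"id": 2, "name": "Bob",  "city": "Mumbai"},  # changed row -> update
    {"id": 3, "name": "Cara", "city": "Lima"},    # new row     -> insert
]
result = upsert(warehouse, incoming, key="id")
```

After the merge, the table holds three rows: Alice unchanged, Bob's city updated to Mumbai, and Cara inserted.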


#ETL #incremental #dlt

Topics in this video (click to jump around):
==================================
0:00 - Introduction to data load tool (dlt) incremental load
0:42 - Source Change Detection: Merge Write Disposition
1:38 - How Merge Write Disposition works
2:08 - Source SQL Server DB setup
2:33 - DLT Incremental Load Function
3:27 - Test Incremental Load Function
5:03 - Update/Insert records in Source SQL DB
5:29 - Run the dlt pipeline
5:36 - Review pipelines results
6:00 - Coming Soon
Comments

Namaste Haq!!! Thank you so much for making this video, and also for sharing your repo. I'm a bit confused about how you build the connection string. Would you mind sharing it? I had checked your Connect to SQL Server with Python notebook also, but didn't realize what's not correct on my
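I can't speak for the exact setup used in the video, but dlt's SQL sources accept SQLAlchemy-style connection strings. A common shape for SQL Server (all credential values below are placeholders; the driver name depends on what is installed on your machine) is:

```python
from urllib.parse import quote_plus

# Hypothetical credentials -- replace with your own.
user = "etl_user"
password = "p@ss:word!"   # special characters must be URL-encoded
server = "localhost"
database = "AdventureWorks"
driver = "ODBC Driver 17 for SQL Server"

# SQLAlchemy-style URL: mssql+pyodbc://user:password@server/database?driver=...
conn_str = (
    f"mssql+pyodbc://{user}:{quote_plus(password)}@{server}/{database}"
    f"?driver={quote_plus(driver)}"
)
print(conn_str)
```

Note the `quote_plus` calls: an unescaped `@` or `:` in the password would break URL parsing, and the driver name contains spaces.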

gustavoleo