Bulk Copy from SQL DB to Data Lake Parquet using Azure Data Factory [ADF]

This video is a step-by-step demo of how to bulk copy a full schema, using a Lookup activity and a ForEach activity to load SQL tables into Azure Storage (Data Lake) as Parquet files. #datafactory #sqlserver #parquet #datalake #azure #azurestorage
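The pattern described above can be sketched as pipeline JSON roughly like the following. This is a minimal sketch, not the exact pipeline from the demo: the dataset names (`SqlSourceDataset`, `SqlTableDataset`, `ParquetSinkDataset`) and their `schema`/`table` parameters are placeholder assumptions. The Lookup reads the table list from `INFORMATION_SCHEMA.TABLES`, and the ForEach runs a parameterized Copy activity per table:

```json
{
  "name": "BulkCopySqlToParquet",
  "properties": {
    "activities": [
      {
        "name": "LookupTableList",
        "type": "Lookup",
        "typeProperties": {
          "source": {
            "type": "AzureSqlSource",
            "sqlReaderQuery": "SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES WHERE TABLE_TYPE = 'BASE TABLE'"
          },
          "dataset": { "referenceName": "SqlSourceDataset", "type": "DatasetReference" },
          "firstRowOnly": false
        }
      },
      {
        "name": "ForEachTable",
        "type": "ForEach",
        "dependsOn": [
          { "activity": "LookupTableList", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
          "items": {
            "value": "@activity('LookupTableList').output.value",
            "type": "Expression"
          },
          "activities": [
            {
              "name": "CopyTableToParquet",
              "type": "Copy",
              "typeProperties": {
                "source": { "type": "AzureSqlSource" },
                "sink": { "type": "ParquetSink" }
              },
              "inputs": [
                {
                  "referenceName": "SqlTableDataset",
                  "type": "DatasetReference",
                  "parameters": {
                    "schema": "@item().TABLE_SCHEMA",
                    "table": "@item().TABLE_NAME"
                  }
                }
              ],
              "outputs": [
                {
                  "referenceName": "ParquetSinkDataset",
                  "type": "DatasetReference",
                  "parameters": {
                    "folder": "@item().TABLE_NAME"
                  }
                }
              ]
            }
          ]
        }
      }
    ]
  }
}
```

The key idea is that the two datasets inside the ForEach are parameterized, so a single Copy activity definition handles every table returned by the Lookup.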
Comments

Thanks, you helped me a lot in my first job as a data engineer! Greetings from Mendoza, Arg

EmilianoEmanuelSosa

Greetings from Brazil!! Very good video, my friend. This is the type of content I've been looking for lately. Keep going!!

tzexs

Excellent tutorial! Made my job much easier.

CurlyHairedKiddHD

How often do you copy your tables, and what is the price per GB processed (approx.)?

diegovaras

Is there a way to optimize the replication so you don't have to copy the whole table each time? Would it be worth it money-wise to do so?

diegovaras

This is very helpful, thank you very much.
Have you extended this exercise to create partition-based Parquet files inside the table folder?

HimenSuthar

Thank you very much! It was very helpful!

marcosadlercreutz

Hi,

Which resources did you create under your resource group?

Thanks

KeotshepileMosito

Can you give me a good reason why you would need to copy data from a SQL DB to Azure Data Lake?

ajayigeorge

A bit confusing; I think it's better to prepare in advance so the flow isn't broken while focusing on the presentation.

atlanticoceanvoyagebird