18. Copy multiple tables in bulk by using Azure Data Factory

In this video, I discuss copying multiple tables in bulk from an Azure SQL Database to Azure Blob Storage using Azure Data Factory.
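The bulk-copy pattern shown in the video is: a Lookup activity fetches the list of tables, a ForEach activity iterates over them, and a parameterized Copy activity moves each table to blob storage. As a rough illustration (plain Python with a hypothetical helper, not the ADF API itself), the dynamic-parameter logic inside the ForEach looks something like this:

```python
# Sketch of the Lookup -> ForEach -> Copy pattern, expressed as plain
# Python (hypothetical helper, not ADF's expression language): the
# Lookup returns schema/table rows, and each row is turned into the
# dynamic source query and sink path for a parameterized Copy activity.

def build_copy_settings(lookup_rows):
    """For each table from the Lookup, build the dynamic source query
    and the sink blob path used by the parameterized Copy activity."""
    settings = []
    for row in lookup_rows:
        schema, table = row["TABLE_SCHEMA"], row["TABLE_NAME"]
        settings.append({
            "source_query": f"SELECT * FROM [{schema}].[{table}]",
            "sink_path": f"output/{schema}_{table}.csv",
        })
    return settings

# Rows shaped like the output of:
#   SELECT TABLE_SCHEMA, TABLE_NAME FROM INFORMATION_SCHEMA.TABLES
rows = [
    {"TABLE_SCHEMA": "dbo", "TABLE_NAME": "Customers"},
    {"TABLE_SCHEMA": "Sales", "TABLE_NAME": "Orders"},
]
print(build_copy_settings(rows))
```

In ADF itself, the same mapping is done with dataset parameters and expressions such as `@item().TABLE_SCHEMA` inside the ForEach.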

Link for Azure Databricks playlist:

Link for Azure Functions playlist:

Link for Azure Basics playlist:

Link for Azure Data Factory playlist:

Link for Azure Data Factory Real-time Scenarios playlist:

Link for Azure Logic Apps playlist:

#Azure #ADF #AzureDataFactory
Comments

I've watched so many of his classes that I can now understand his English perfectly. Thank you, bro!

diogodallorto

Great topic & very helpful for understanding several key activities. Thanks a lot :)

sudipmazumdar

Thanks so much for this, it saved my week.

patrickmuaenah

Superb explanation and presentation. I am watching all your videos. 👍👍👍

RahulKumar-jgly

Hi, I have a doubt: if some tables get deleted in the source DB, will that affect the destination, or will the pipeline create only the tables that currently exist in the source? I delete the DB daily and then run the copy activity, since the source changes on a daily basis. Please confirm.

rathikavenkatesh

Hi bro, I want to do this the reverse way: I need to copy Excel sheets from blob storage into different SQL tables. Can you please suggest an approach?

chandruk

Hi Maheer, I found your channel while looking for ADF help. Great content here: sequential series, one by one, for each area of ADF. I have been continuously watching the ADF videos and learning a lot. I have one question: we have Dataverse folders in the Data Lake, and I want to load the files from those folders into an Azure SQL database. Each folder contains CSVs partitioned by month.

RKTECH

Hi, I tried this, but the problem is that only one table was copied. When I run the pipeline, it prompts for the table name, even though I have parameterized the table. Can you please clarify?

codeworld

Suppose I have some 200 tables in an on-premises SQL Server and want to migrate about 100 of them to Azure SQL Server. How can I dynamically create the tables in Azure SQL Server when each table has a different schema? Each table will have different columns, so how do I dynamically create these tables and copy the data?

TechnoSparkBigData
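One commenter above asks how to create target tables dynamically when each source table has a different schema. Besides the Copy activity's option to auto-create the sink table, a common approach is to generate `CREATE TABLE` DDL from column metadata (e.g. `INFORMATION_SCHEMA.COLUMNS`). A minimal sketch, with hypothetical helper and column names:

```python
# Hypothetical sketch: build a CREATE TABLE statement from column
# metadata, e.g. rows queried from INFORMATION_SCHEMA.COLUMNS on the
# source server. Generating DDL yourself gives full control over types,
# as opposed to relying on the Copy activity's auto-create behavior.

def build_create_table(schema, table, columns):
    """columns: list of (name, sql_type) tuples in ordinal order."""
    cols = ",\n    ".join(f"[{name}] {sql_type}" for name, sql_type in columns)
    return f"CREATE TABLE [{schema}].[{table}] (\n    {cols}\n)"

ddl = build_create_table(
    "dbo", "Customers",
    [("Id", "INT NOT NULL"), ("Name", "NVARCHAR(100)")],
)
print(ddl)
```

The generated statement could then be run against the Azure SQL sink (for example via a Script or Stored Procedure activity) before the Copy activity for that table executes.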

Awesome...you deserve 80k+ subscribers...👍👍

samrattidke

How can we use a similar approach if we want to copy data from Snowflake to Azure SQL DB?

saketsrivastava

Once our pipeline starts, if any table throws an error while copying, how do we make sure it doesn't stop the others from being copied?

vishalshrivastava

I have a doubt: suppose a table is deleted in the source database, how do we handle that?

rathikavenkatesh

How can I copy folders and files without changing the hierarchy structure, when I don't know the depth of the hierarchy?

thesujata_m

How can we copy data from multiple different sources to ADLS using a single pipeline?

Offical_PicturePerfect

How can I also get column headers in the CSV files? I am able to get all the data, but the first row is not the column header.

ctipykz

When I click "Preview data", it asks me to enter values for the schema name and table name.

ajaiar

Hello sir, this is really nice. I have built my pipeline perfectly, thanks for this. I would like to ask one more thing: I want to add a trigger after this pipeline so that when data is updated in any source table, it gets updated in our tables as well, but only the new data, without changing past rows. Could you help me with this?

souranwaris

At 11:00 I am getting only the schema parameter, although I have created the table parameter as well.

prajvalsingh

Scenario 1: Copy all table data from the SQL DB to Azure Blob Storage. The output file name should be the schema name, an underscore, then the table name, with a .csv extension.
1. Copy all tables in the Azure SQL DB.
2. Add a condition: if a table has only one row, or zero rows, its data should not be copied.
3. Write the data to a Blob Storage path as a CSV file.

In this scenario, how and where do we put the 2nd condition?

sudheerjajjarapu
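The filter in step 2 of the scenario above (skip tables with zero or one row) can be implemented in ADF with a per-table row count from a Lookup, wrapped in an If Condition inside the ForEach. A plain-Python illustration of that filter step (hypothetical helper, not ADF expression syntax):

```python
# Plain-Python illustration of the scenario's second condition
# (hypothetical helper, not ADF expression syntax): keep only tables
# whose row count is greater than 1, and name each output file
# schema_table.csv. In ADF this would be a Lookup returning COUNT(*)
# per table, checked by an If Condition before the Copy activity.

def tables_to_copy(table_counts):
    """table_counts: list of dicts with schema, table, and row_count."""
    return [
        f"{t['schema']}_{t['table']}.csv"
        for t in table_counts
        if t["row_count"] > 1   # condition 2: skip 0- or 1-row tables
    ]

counts = [
    {"schema": "dbo", "table": "Orders", "row_count": 120},
    {"schema": "dbo", "table": "Empty",  "row_count": 0},
    {"schema": "dbo", "table": "Single", "row_count": 1},
]
print(tables_to_copy(counts))  # only dbo_Orders.csv qualifies
```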