Scenario-10: Copy multiple files from Blob Storage to Azure SQL Database

Comments

I need to copy multiple files from ADLS to multiple tables in Azure SQL Server... :)
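
The usual Data Factory answer to this is a ForEach activity that iterates over a file-to-table mapping and drives a Copy activity with parameterized source and sink datasets. As a rough, non-ADF sketch of the same idea, the Python below loops over a hypothetical mapping, pulls each CSV from storage, and appends it to its own Azure SQL table; every connection string, container, file, and table name here is made up for illustration.

```python
# Illustrative sketch only: load each file in a mapping into its own table.
import io

import pandas as pd
from azure.storage.blob import BlobServiceClient
from sqlalchemy import create_engine

# Hypothetical file -> table mapping; in ADF this would feed a ForEach loop
# over a parameterized Copy activity instead of Python code.
FILE_TO_TABLE = {
    "customers.csv": "dbo.Customers",
    "orders.csv": "dbo.Orders",
    "products.csv": "dbo.Products",
}

blob_service = BlobServiceClient.from_connection_string("<storage-connection-string>")
container = blob_service.get_container_client("input-files")
engine = create_engine(
    "mssql+pyodbc://<user>:<password>@<server>.database.windows.net/<db>"
    "?driver=ODBC+Driver+18+for+SQL+Server"
)

for file_name, table_name in FILE_TO_TABLE.items():
    # Download the CSV from blob storage and load it into a DataFrame.
    data = container.download_blob(file_name).readall()
    df = pd.read_csv(io.BytesIO(data))
    # Append the rows into the matching Azure SQL table.
    schema, _, table = table_name.partition(".")
    df.to_sql(table, engine, schema=schema, if_exists="append", index=False)
    print(f"Loaded {len(df)} rows from {file_name} into {table_name}")
```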

ranjansrivastava

How can we implement incremental load? When a new file arrives, its records should be written to the SQL table as a continuation of the data already imported. That is my use case.
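
A common way to handle this is a watermark pattern: remember the highest key or modified date already loaded, then on each run append only the rows that are newer, so the new data continues on from the old. The sketch below shows that idea in Python under assumed names (dbo.Sales, LastUpdated, newest_file.csv and the connection string are all hypothetical); if existing rows can change rather than only being appended, an upsert/MERGE into the target would be needed instead of a plain append.

```python
# Illustrative watermark-based incremental load; names are hypothetical.
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine(
    "mssql+pyodbc://<user>:<password>@<server>.database.windows.net/<db>"
    "?driver=ODBC+Driver+18+for+SQL+Server"
)

# 1. Read the current watermark: the latest value already present in the table.
with engine.connect() as conn:
    watermark = conn.execute(
        text("SELECT COALESCE(MAX(LastUpdated), '1900-01-01') FROM dbo.Sales")
    ).scalar()

# 2. Read the newest file and keep only rows newer than the watermark.
df = pd.read_csv("newest_file.csv", parse_dates=["LastUpdated"])
new_rows = df[df["LastUpdated"] > pd.Timestamp(watermark)]

# 3. Append only the new rows, leaving earlier imports untouched.
new_rows.to_sql("Sales", engine, schema="dbo", if_exists="append", index=False)
print(f"Appended {len(new_rows)} new rows")
```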

KaraokeVN

Great video. What if I want to delete all records from the table first and then write the new data? Also, I have 13 files with a large amount of data; some are around 100 MB. It works perfectly fine up to 6 files, but after that the pipeline gets stuck at around 500,000 records. Any suggestions?
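
For the delete-first part, the Copy activity's Azure SQL sink has a pre-copy script box where a TRUNCATE TABLE statement can run before each load; for the large files, writing in smaller batches usually helps when a load stalls partway through. A rough Python equivalent of both ideas, with hypothetical file and table names:

```python
# Illustrative sketch: truncate the target, then load large CSVs in chunks.
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine(
    "mssql+pyodbc://<user>:<password>@<server>.database.windows.net/<db>"
    "?driver=ODBC+Driver+18+for+SQL+Server",
    fast_executemany=True,  # speeds up bulk inserts through pyodbc
)

# 1. Empty the target table first (in ADF this would be the sink's pre-copy script).
with engine.begin() as conn:
    conn.execute(text("TRUNCATE TABLE dbo.StagingSales"))

# 2. Stream each large CSV in chunks so memory stays bounded and inserts stay small.
files = ["file01.csv", "file02.csv", "file03.csv"]  # ... up to 13 files
for path in files:
    for chunk in pd.read_csv(path, chunksize=50_000):
        chunk.to_sql("StagingSales", engine, schema="dbo",
                     if_exists="append", index=False, chunksize=10_000)
    print(f"Finished {path}")
```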

nadeemrajabali

I have 11 collections in my Azure Cosmos DB for MongoDB account, and 11 JSON files in my Azure Blob Storage container. I'm using a Data Factory Copy activity to copy the JSON files from blob storage to the MongoDB API, but I'm only able to copy one file into one collection. How can I copy multiple JSON files to multiple collections using Data Factory?
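
In Data Factory this typically becomes Get Metadata (to list the files) plus a ForEach that runs a Copy activity whose source file name and sink collection are both parameterized. Outside ADF, the same one-file-per-collection idea looks roughly like the Python below; the connection strings, container, database name, and the assumption that each file name matches its target collection are all hypothetical.

```python
# Illustrative sketch: copy each JSON blob into a Mongo collection of the same name.
import json

from azure.storage.blob import BlobServiceClient
from pymongo import MongoClient

blob_service = BlobServiceClient.from_connection_string("<storage-connection-string>")
container = blob_service.get_container_client("json-files")
mongo = MongoClient("<cosmos-mongo-connection-string>")
db = mongo["mydatabase"]

for blob in container.list_blobs():
    # Assume each file (e.g. customers.json) targets a collection named after it.
    collection_name = blob.name.rsplit(".", 1)[0]
    payload = json.loads(container.download_blob(blob.name).readall())
    documents = payload if isinstance(payload, list) else [payload]
    if documents:
        db[collection_name].insert_many(documents)
        print(f"Inserted {len(documents)} documents into {collection_name}")
```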

gopavarammohankumar

When I add a parameter at the dataset level, the Get Metadata activity automatically asks for a value for it.
Please help me figure out how to fix this.

sravanthiyethapu

Instead of SQL Server, can we import from MySQL (Workbench)? Is that possible or not?

danishthev-log