2. Get File Names from Source Folder Dynamically in Azure Data Factory

In this video, I discuss how to get file names dynamically from a source folder in Azure Data Factory.

Link for Azure Functions playlist:

Link for Azure Basics playlist:

Link for Azure Data Factory playlist:

#Azure #ADF #AzureDataFactory
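
For anyone who wants to try the same idea in code rather than in the Data Factory UI, here is a minimal sketch using the azure-storage-blob Python SDK. The connection string, the "salesdata" container and the "input/" folder prefix are placeholders, not values from the video; listing blob names this way simply mirrors what the Get Metadata activity's childItems output and a ForEach loop do inside the pipeline.

    from azure.storage.blob import ContainerClient

    # Placeholder connection details -- replace with your own storage account values.
    container = ContainerClient.from_connection_string(
        conn_str="<storage-connection-string>",
        container_name="salesdata",
    )

    # Collect just the blob names under the folder prefix, roughly what the
    # Get Metadata activity's childItems field returns inside ADF.
    file_names = [blob.name for blob in container.list_blobs(name_starts_with="input/")]

    # Each name can then drive a per-file action, much like ForEach + Copy in the pipeline.
    for name in file_names:
        print(name)

Inside the pipeline itself, the equivalent of this loop is iterating over @activity('Get Metadata1').output.childItems in the ForEach activity and reading @item().name for each file (assuming the Get Metadata activity keeps its default name, Get Metadata1).
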
Comments

Thank you sir, you're a legend. Respect from Brazil!

rbor-xbeg

Expertly done!! You explained this perfectly for me. Thank you for sharing your expertise.

GunNut

Nice videos, clear steps. Please keep uploading. Thank you!

damayantibhuyan

Excellent content!! Thanks, mate, for taking your time. Could you please do a series on Data Factory DevOps integration, building a CI/CD pipeline using library variables?

debasisroy

I have one doubt: you just copied all the files from the salesdata folder to the salesdataoutput folder. Isn't a Copy activity alone sufficient? My assumption is that in the Copy activity source we can give the folder path and it will copy all the files under salesdata, and with the sink path it will place all the files in the output folder. What was the purpose of the Get Metadata and ForEach activities?

vijaypodili

Very good content, practical scenarios are helpful

rakeshupadhyay

Thank you for the great explanation. Could we please expect learning videos on Azure Synapse Analytics?

deepaksahirwar

Can you please create a video on how to upload data from multiple Excel files into SQL Server using Data Flows, and please also use data conversion? It doesn't seem to be as easy as it is in SSIS.

subhashkomy

Very good explanation 👌. It's very helpful.

aruntejvadthya

Amazing! Please add something on handling incremental data loads, and on how to check whether files are present in blob storage or not using the Validation or Get Metadata activity.

anujgupta-lcmd

If possible, try a video on creating global parameters and passing the values dynamically with different databases.

YanadireddyBonamukkala

What content, boss! Really very impressive. May I know which videos I should refer to to get started with Azure Cloud, as I am relatively new to this? I know MSBI and want to upgrade myself to Azure Cloud. Kindly suggest. Your content is awesome. Hats off to you! 🤟👏

siddheshamrutkar

Very good explanation. Thank you for this video!

tipums

Very useful videos. Please make Databricks videos also.

sirisiri

Hi Maheer, thanks for the detailed explanation. For this topic, shouldn't the scenario be "Read Files from Source Folder Dynamically in Azure Data Factory" instead of "Get File Names"? We are not reading/getting "file names", right? The files are just being copied from source to sink.

venukumar

Hi Maheer, do you have a video where we copy a CSV file from a dynamic folder in ADLS to a new folder in ADLS and store it as Parquet?

esrasultan

Nice work!
Could you please make a video on how to check for 0 KB CSV files / zero-row records from the source, and trigger an email in Azure Data Factory if the source has a 0 KB file or zero records?
Thanks in advance.

sandeep

Good one, very helpful and practical scenario. You made it exactly as needed!

empowerpeoplebytech

Can we pass the file path dynamically? I have a SQL table from which I can take the file path. This file path needs to be passed to Get Metadata to list the files.

Looking for your help. Thank you so much!

pradeepert

What if one wants to do a similar thing but with .txt or .sql files (stored in an ADLS Gen2 container)?

lib