10. Log Pipeline Executions to File Using Mapping Data Flows in Azure Data Factory

In this video, I discussed logging pipeline executions to a file using Mapping Data Flows in Azure Data Factory.
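
The description does not spell out the mechanics, but the usual shape of this pattern is a parameterized mapping data flow whose sink appends one audit row per run, with the pipeline passing its system variables in as data flow parameters. A minimal sketch follows; the parameter names are assumptions, while the @pipeline() system variables are standard ADF pipeline expressions:

    // Hypothetical data flow parameters set on the Data Flow activity
    PipelineName : @pipeline().Pipeline      // name of the calling pipeline
    RunId        : @pipeline().RunId         // unique run identifier
    TriggerTime  : @pipeline().TriggerTime   // when the run was triggered
    Status       : 'Success'                 // hardcoded in this scenario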

Link for Azure Databricks playlist:

Link for Azure Functions playlist:

Link for Azure Basics playlist:

Link for Azure Data Factory playlist:

Link for Azure Data Factory Real-time Scenarios playlist:

Link for Azure Logic Apps playlist:

#Azure #ADF #AzureDataFactory
Comments

Thanks for creating these wonderful real-time ADF use cases; the clear presentation is highly appreciated, and I'm waiting for more relevant videos.

sagar_patro

Nice video, explained in simple steps....
Great job, bruh.... 👍👍👍👍

snmailist

That's an awesome explanation, thanks. My question is: in this scenario we always hardcode the status as 'Success'. But what if an activity in the pipeline fails?

ArabaEfsanesi
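
A common way to cover the failure case raised above (an assumption about the fix, not something shown in this description) is a second logging activity attached to the main activity's "Upon Failure" dependency, passing a failed status and the error details; the parameter and activity names here are hypothetical:

    // On the failure branch of the main activity
    Status       : 'Failed'
    ErrorMessage : @activity('Copy data1').error.message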

Can you please add the source table name, data read, data written, insert/update/delete counts, and I/O kind of information to the audit log? It would be great.

himanshutrivedi
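
On the audit-metrics request above: the Copy activity already exposes these counters in its output, so they could be passed to the logging data flow like any other parameter. rowsRead, rowsCopied, dataRead, and dataWritten are real Copy activity output properties; the activity and parameter names are assumptions:

    RowsRead    : @activity('Copy data1').output.rowsRead
    RowsCopied  : @activity('Copy data1').output.rowsCopied
    DataRead    : @activity('Copy data1').output.dataRead      // bytes read
    DataWritten : @activity('Copy data1').output.dataWritten   // bytes written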

Can you please help me with the below question:
What is the difference between a data flow expression and a pipeline expression in Azure Data Factory?

namangupta
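
On the expression question above, the short version: pipeline expressions are evaluated by the pipeline engine, start with @, and can reference system variables and activity outputs; data flow expressions run inside the Spark-based data flow, reference columns and $parameters directly, and use the data flow function library. A side-by-side sketch with hypothetical names:

    // Pipeline expression (activity property):
    @concat(pipeline().Pipeline, '_', pipeline().RunId)

    // Data flow expression (derived column); $Status is a data flow parameter:
    concat($Status, '_', toString(currentUTC()))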

Why is the status parameter you are passing always 'Success'? @wafa

battulasuresh

The scenario you showed works for that particular pipeline, but it does not log the values if we run other pipelines from ADF. Could you please let me know why that is?

salikkazi

Can we change the location we selected after creating a data factory?

bhawnabedi

@WafaStudies How can we add the status of the pipeline run dynamically to the audit file?

mazhar

Hi bro, my scenario: all retailers' data is uploaded to an App Service, and from there the data should be uploaded to Blob storage. Is that possible?

saikumar-itgl