6. Log Pipeline Executions to SQL Table using Azure Data Factory

In this video, I discuss how to log pipeline execution details to a SQL table using Azure Data Factory.

Link for Azure Functions playlist:

Link for Azure Basics playlist:

Link for Azure Data Factory playlist:

#Azure #ADF #AzureDataFactory
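The logging approach from the video can be sketched as a Stored Procedure activity whose parameters map ADF system variables onto the procedure shared in the comments below. The activity name and linked service reference here are illustrative assumptions, not taken from the video:

```json
{
    "name": "LogPipelineExecution",
    "type": "SqlServerStoredProcedure",
    "linkedServiceName": {
        "referenceName": "AzureSqlDatabaseLS",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "storedProcedureName": "dbo.usp_adfpipelineExecu",
        "storedProcedureParameters": {
            "adfname":      { "value": "@{pipeline().DataFactory}", "type": "String" },
            "pipelinename": { "value": "@{pipeline().Pipeline}", "type": "String" },
            "triggername":  { "value": "@{pipeline().TriggerName}", "type": "String" },
            "runid":        { "value": "@{pipeline().RunId}", "type": "String" },
            "triggertime":  { "value": "@{pipeline().TriggerTime}", "type": "String" }
        }
    }
}
```

Chained on the success path of a pipeline's last activity, this logs one row per run.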
Comments

Perfect. Logging execution details has become an important part of development. This is certainly helpful. Thanks.

empowerpeoplebytech

Great walkthrough 👍👍👍
After watching a few of your videos, my ear has become more attuned to your Hindi accent.

snmailist

-- Logging table for pipeline execution details
create table tbl_adfPipelineExecution
(
adfname varchar(100),
pipelinename varchar(100),
triggername varchar(100),
runid varchar(100),
triggertime varchar(100)
)

select * from tbl_adfPipelineExecution

-- Parameter lengths match the table columns; varchar(20) would silently
-- truncate run IDs (GUIDs are 36 characters) and trigger timestamps.
create procedure dbo.usp_adfpipelineExecu
(
@adfname varchar(100),
@pipelinename varchar(100),
@triggername varchar(100),
@runid varchar(100),
@triggertime varchar(100)
)
as
begin
insert into tbl_adfPipelineExecution
(adfname, pipelinename, triggername, runid, triggertime)
values
(@adfname, @pipelinename, @triggername, @runid, @triggertime)
end

😀

barrivikram

Love you, sir. Lots of thanks and love from Jharkhand Infosys employees.

sanatkumar

Great video.. very informative. Keep up the good work.

vannitecmediastudio

You are making such wonderful videos. Could you please add the query details and source files for practice?

mvkr

Hi sir,
I need to capture the table name, row count (how many rows were loaded), and load date in one table, and the start and end datetimes in another table in the SQL database.
Can we capture those details using the procedure you showed above? Please help me.
Thanks in advance.
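A sketch of one possible shape for this requirement, with hypothetical table and column names: extend the same stored-procedure pattern with two tables, and pass the Copy activity's output row count (e.g. the expression @{activity('Copy data1').output.rowsCopied}, assuming the activity is named 'Copy data1') together with @{utcnow()} as procedure parameters.

```sql
-- Hypothetical tables for the row-count / load-date requirement.
create table tbl_loadRowCounts
(
    tablename varchar(100),
    rowsloaded int,
    loaddate datetime
)

create table tbl_loadTimes
(
    tablename varchar(100),
    startdatetime datetime,
    enddatetime datetime
)
```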

Nishal_Goud

Hi,
I have a requirement where I need to write to a Snowflake table the number of records processed in each path of a data flow. The data flow checks multiple business conditions. The Snowflake table will have columns like BusinessRuleName, PassCount, FailCount, ExecutionTime. How do I achieve this through an Azure Data Flow? Please note that a single data flow has multiple business rules being checked.

smithamurthy

Hey.. You are doing a wonderful job. Congratulations.

I have some questions here.
1. Is there a way to get the entire JSON of the output?
2. Other than the system variables, if I have to get values like dataRead, dataWritten, rowsRead, etc., how can I get them without manually entering the code in the value?

I am trying to create one Stored Procedure activity and mimic it across all the pipelines without changing the parameters (just like how you used the system variables to get the details). I don't want to manually enter the copy activity name in the Stored Procedure activity if I have to do this for about 100 pipelines reading several tables and writing all of them into a single table that calls the same stored proc.

Could you help me find a solution?

Thanks in advance
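For what it's worth, both questions can be sketched as stored-procedure parameter expressions, assuming a Copy activity named 'Copy data1' (the activity and parameter names here are illustrative): string() serializes the whole activity output as JSON, and the metrics are individual fields on that output.

```json
{
    "fulloutput":  { "value": "@{string(activity('Copy data1').output)}", "type": "String" },
    "dataread":    { "value": "@{activity('Copy data1').output.dataRead}", "type": "Int64" },
    "datawritten": { "value": "@{activity('Copy data1').output.dataWritten}", "type": "Int64" },
    "rowsread":    { "value": "@{activity('Copy data1').output.rowsRead}", "type": "Int64" }
}
```

Because the activity name is baked into the expression, reusing one Stored Procedure activity across many pipelines only works if every pipeline names its copy activity identically; otherwise a parameterized wrapper pipeline is a common workaround.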

slazer

Sir, could you please add a video on how to deploy pipelines and how to maintain versions using Azure DevOps? I am truly thankful for your effort.

shivanidubey

How do I write logs (runtime history and error messages) from an Azure notebook and send them to Application Insights?

swagatikatripathy

Is it possible to keep track of pipeline executions that were run in debug mode, or is it mandatory to have an automatic trigger?

kind regards

hendrik-stack

I have a requirement to add a unique ID to this table. How can I do that?
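One way, assuming the table from the video: add an auto-incrementing IDENTITY column, whose values SQL Server generates on insert. Note that the procedure's INSERT must then name its columns explicitly, since an `insert ... values` without a column list would no longer match the table.

```sql
-- Adds an auto-generated unique ID to the existing logging table.
alter table tbl_adfPipelineExecution
    add id int identity(1, 1) not null

-- Alternatively, a GUID column with a default:
-- alter table tbl_adfPipelineExecution
--     add id uniqueidentifier not null default newid()
```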

sachinv

Is it possible to move failed files to a blob container and then re-process them?

salonishetye

Is it possible to get pipeline run status?

rajeshkhannapamula

Could you share the create table and create procedure scripts?

anjanidubey

If you share the scripts, it becomes easy for viewers to get hands-on experience.

TheANKUSHVIDEOYOUTUB