Data Flow in Pega

Data flow is a rule type in Pega that can be used to ingest, process, and move data from one or more sources to one or more destinations.
Data flows are scalable and resilient, making them ideal for processing large volumes of data.
Each data flow consists of components that transform data in the pipeline and enrich data processing with event strategies, strategies, and text analysis.
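Pega data flows are configured as rules in the platform rather than hand-coded, but conceptually they behave like a streaming pipeline: one or more sources feed records through transform stages (which may filter or enrich them) into one or more destinations. A minimal Python sketch of that idea, purely illustrative (none of these names are Pega APIs):

```python
# Conceptual sketch of a data-flow pipeline: sources -> transforms -> destinations.
# Illustrative only; a real Pega data flow is a configured rule, not code.

def run_data_flow(sources, transforms, destinations):
    """Stream every record from each source through the transform stages,
    then deliver each surviving record to every destination."""
    for source in sources:
        for record in source:
            for transform in transforms:
                record = transform(record)
                if record is None:          # a stage may filter the record out
                    break
            else:                            # no stage dropped the record
                for destination in destinations:
                    destination.append(record)

# Example: ingest customer records, keep only active ones, enrich with a segment.
customers = [{"id": 1, "active": True}, {"id": 2, "active": False}]
keep_active = lambda r: r if r["active"] else None
tag_segment = lambda r: {**r, "segment": "retail"}

output = []
run_data_flow([customers], [keep_active, tag_segment], [output])
print(output)  # only the active customer reaches the destination, enriched
```

In a real data flow these stages would be shapes such as Filter, Convert, or Strategy, and the destination would be a data set or case rather than a Python list.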

If you have any doubts, please let me know. I strongly suggest you replay this content for better understanding. Practice well and happy learning!

As requested, please click the link below to download the Excel file.

Kindly let me know if you face any issue.

#CodeWithR #Pega #PegaVideos #Dataflow #dataset #dataflowinpega #CodeWithRavoof #Ravoof
Comments

It was a great explanation with great effort. Thank you, bro.
Please keep on posting more POC.

hyd_explorer_

Great Work. I appreciate your effort. Please continue the good work. This is very helpful for newbies like me. Thank you

ActionbeansSoftware

Great Explanation, eagerly waiting for a detailed video on Data flows.

karangoyal

Great. Keep posting and helping us learn. 🎉

kaleenbhaiya

If there are millions of records to be processed and a few contain BLOB data, can it be done without an RD, calling it from an activity using browse records?

nivetharajendran

Can we add a sequence number (Sno) like 1, 2, 3, etc. in the destination table? If so, how, without data integrity issues?

dharanitharanravindran

How should the implementation be done if the table has 1 million records?

vivekkale

Hi bro, what is an abstract source, and when can we use it?

everything

Can we use data flows to delete from multiple tables together?

nithinvalat

What is the fundamental difference between a data set and a report definition?
Why is a data set faster?

vivekkale