65. Flatten Transformation in Mapping Data Flow in Azure Data Factory

In this video, I discuss the Flatten transformation in Mapping Data Flow in Azure Data Factory.

Link for Azure Functions playlist:

Link for Azure Basics playlist:

Link for Azure Data Factory playlist:

#Azure #ADF #AzureDataFactory
Comments
Author

Excellent explanation, bro!!! Finally I have found a very good tutor 😊

mandavillivamsi
Author

Just adding a comment for people who have come to the comments section looking for a solution for multiple nesting in the same file, which is not shown in this video. There is a new update from Microsoft where, using one Flatten activity, you can unroll multiple arrays. I see that when this video was created, the option was not available. @Wafa, great video. Maybe create one with the new functionality for multiple unrolls.

aqshatagade
Author

Simplest explanation and demo. Keep up the good work bro.

Rafian
Author

Thank you. Very helpful to see a live demo of the denormalization of a JSON (complex data type).

sashafroyland
Author

If you're going to use PowerPoint, use more graphics in your primary content slides and align a topic point title with them; they don't have to be so verbose. Your demos are great! Great content and discussion. Thanks for sharing.

francisfriel
Author

This is a great video, thank you. How do you flatten multiple files using a dynamic Unroll by expression? All the files have "value" as the complex type with the nested columns.

InathiM
Author

Nice video. Is it possible to have the source as the output of a web service?

kitsabee
Author

Thanks for showing us this trick, cheers!

AI-Health-posts
Author

Here the address and contact columns are complex types, and there are subcolumns under address and contact. How can we see these details in CSV format?

sravankumar
Author

Hi Maheer, I tried importing the JSON file into a blob container and created a source stream in the data flow. While previewing the data, it is showing nulls in the columns. Can you please advise me what might be the issue?

purushothamnaidu
Author

Hi Wafa, I am using this ($$) to flatten a complex XML file with rule-based mapping, and I have also ticked deep column traversal to include all subcolumns. The issue is that I have many duplicate subcolumn names, which causes an error. Is there a way I can add the hierarchy level name to the subcolumn name using the this($$) function, or is there any other data flow step I should use?

nadeemrajabali
Author

Could you please tell me how to flatten multiple levels of nested JSON? I tried using a sequence of Flatten transformations, but it keeps running in debug mode and fails.

vikasswain
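As an editorial aside on the question above: the effect of chaining Flatten transformations over multi-level nested JSON can be sketched outside ADF. The Python below is illustrative only (the function and sample field names are made up, not from the video): it prefixes nested keys with their parent path and unrolls each array element into its own row.

```python
# Minimal sketch (not ADF): recursively flatten nested dicts and unroll arrays.
def flatten(record, prefix=""):
    """Return a list of flat dicts; each array element becomes its own row."""
    rows = [{}]
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            # Nested object: flatten it and merge its columns into every row.
            sub_rows = flatten(value, prefix=name + "_")
            rows = [{**r, **s} for r in rows for s in sub_rows]
        elif isinstance(value, list):
            # Array: unroll each element into its own partial row.
            unrolled = []
            for item in value:
                if isinstance(item, dict):
                    unrolled.extend(flatten(item, prefix=name + "_"))
                else:
                    unrolled.append({name: item})
            rows = [{**r, **u} for r in rows for u in unrolled]
        else:
            rows = [{**r, name: value} for r in rows]
    return rows

doc = {"id": 1, "skills": [{"name": ".NET"}, {"name": "Azure"}]}
print(flatten(doc))
# [{'id': 1, 'skills_name': '.NET'}, {'id': 1, 'skills_name': 'Azure'}]
```

Each level of nesting is handled by the recursion, which is why a single call covers what would otherwise take a sequence of Flatten steps.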
Author

Hi, I have used rule-based mapping in the Flatten activity to flatten XML and chose the deep column traversal setting. It mapped all the columns, but the columns of array data type are not populating correctly. So I used another Flatten to unroll that array, but my XML has more than 30 arrays and I am not getting correct data. Do I have to use 30 Flattens, one for each array, to unroll? Can you please help me, is there any other way to handle this easily?

rohanchawla
Author

Can you please put up a video on XML transformation using ADF?

premgcp
Author

Is it possible to get the array position too? For example: for the 1st row, "1 .NET"; for the 2nd row, "2 Azure"?

vinaysingh
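Editorial note on the position question above: ADF's data flow expression language has array functions that can carry an index (check the current docs for the exact function, e.g. mapIndex). As a language-neutral sketch in plain Python, with made-up names, pairing each unrolled element with a 1-based position looks like this:

```python
# Illustrative only: attach a 1-based position to each unrolled array element,
# giving output like "1 .NET", "2 Azure" as asked in the comment.
skills = [".NET", "Azure"]                      # made-up sample array
rows = [{"position": i, "skill": s} for i, s in enumerate(skills, start=1)]
for row in rows:
    print(row["position"], row["skill"])
# 1 .NET
# 2 Azure
```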
Author

I need to split an array, and I want to select the 4th element in the array. How can I do it?

chandandacchufan
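Editorial note: splitting and picking one element can be sketched in plain Python. Note this is not ADF expression syntax; as I recall, ADF data flow expressions index arrays differently (1-based), so verify in the docs before translating. The sample value below is made up.

```python
# Illustrative sketch: split a delimited string and take the 4th element.
# Python lists are zero-based, so the 4th element is index 3.
value = "alpha,beta,gamma,delta,epsilon"   # made-up sample data
parts = value.split(",")
fourth = parts[3]
print(fourth)  # delta
```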
Author

Hi, I have the below issue at the sink step.

Source (JSON) ==> Flatten ==> SINK (Tried CSV and SQL DB)

The following column(s) have a complex structure which can only be written to JSON, AVRO, and Cosmos DB datasets: 'Address', 'Contact'. Please remove the columns or update the sink dataset to JSON, AVRO, or Cosmos DB.

harikonka
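Editorial note on the error above: it means the CSV and SQL sinks cannot store nested objects, so the complex 'Address' and 'Contact' columns must be reduced to scalar columns before the sink (in the data flow, by flattening or mapping the subcolumns explicitly). A Python sketch of the same idea, with made-up field names:

```python
import csv
import io

# One source record with nested (complex) columns, as JSON would deliver it.
row = {
    "Name": "Alice",                                  # made-up sample data
    "Address": {"City": "Pune", "Zip": "411001"},
    "Contact": {"Phone": "555-0100"},
}

# Promote each subcolumn to a flat, scalar column: Address_City, Contact_Phone, ...
flat = {"Name": row["Name"]}
for col in ("Address", "Contact"):
    for sub, val in row[col].items():
        flat[f"{col}_{sub}"] = val

# With only scalar values, the record is now CSV-safe.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(flat))
writer.writeheader()
writer.writerow(flat)
print(buf.getvalue())
```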