#102. Azure Data Factory - Handle Lookup with more than 5000 records using Loop

Comments

You saved my job with this video. Can't thank you enough!

baladenmark

In the iteration you have a warning; it throws an error when I run my pipeline with the same logic.
for example iteration variable @range(1, add(div(int(variables('rowcount')), 1000), 1))

The function 'int' was invoked with a parameter that is not valid. The value cannot be converted to the target type

himanshubharambe
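
A minimal sketch of one way around that 'int' error, assuming the row count comes from a Lookup activity named Lookup_RowCount that returns a column called rowcount (both names are placeholders, not taken from the video): the error usually means the value being converted is empty or not a whole number at run time, so reading the count straight from the Lookup output, rather than from a variable that may still be unset, avoids it.

    @range(0, add(div(int(activity('Lookup_RowCount').output.firstRow.rowcount), 5000), 1))

The divisor (5000 here, 1000 in the comment's expression) should match the page size used in the OFFSET/FETCH query.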

Very complex pipeline. Thanks for this video;
these are the scenarios that we face in real time!

AnandKumar-dcbf

Superb video, mam!
Can we handle the same thing when the source is a file present in a blob container?

avinashkale

Hi - I am looking for a similar solution, but my source file is in a data lake, not in a database.
So can we do the same steps for the file in the data lake as well?

sangeethasreenath

Thank you. I have an additional scenario: do you have a video where the Lookup data is passed as an array into a Set Variable activity, and finally the array is written into a JSON file? If so, please let me know. Thanks.

harinim
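
A minimal sketch of the first step being asked about, assuming a Lookup activity named Lookup1 with "First row only" unchecked and a pipeline variable of type Array (both are placeholder names): the whole Lookup result set can be assigned to the variable in a Set Variable activity with

    @activity('Lookup1').output.value

Writing that array out to a JSON file is a separate step and isn't covered by this sketch.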

Mam,
can you show a scenario with the ADF U-SQL activity?

AnandKumar-dcbf

Thanks a lot!
In my case I have an XML file;
when I do a lookup it goes straight into an error.
I can't get the number of rows because the file is too big to be read,
and I don't want to use a data flow, as it's too expensive for the client.

sidneyhodieb

Nice explanation 👏 Thanks for this real-time scenario. Hope it will be useful for me. Let me try it, thanks again.

revathyvijaymukundan

Looks like it's not working. In the first iteration the offset is 0 and fetch next rows is 5000; this works fine. In the second iteration the offset is 5000 and fetch next rows is 6776, so it fetches 6776 records, but it should fetch the next set of 5k, right? Could you please check and clarify.

pavanchakilam
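
A minimal sketch of the per-page query the loop is expected to build, assuming the ForEach iterates over a 0-based range and that dbo.MyTable and Id are placeholder names: only the OFFSET should change each iteration, while FETCH NEXT stays fixed at the page size, so the second page reads rows 5000-9999 rather than 6776 rows.

    @concat('SELECT * FROM dbo.MyTable ORDER BY Id OFFSET ', string(mul(item(), 5000)), ' ROWS FETCH NEXT 5000 ROWS ONLY')

If the second iteration really issues FETCH NEXT 6776 ROWS, the fetch value is probably being derived from the total row count instead of the fixed page size.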

Hi,
very useful video.
Can you please help me with a solution for how we can overcome the limit on the Lookup activity output count when we connect to Salesforce Service Cloud, as the SOQL bulk API query doesn't support the OFFSET function? Is there any function apart from OFFSET we can use for this while connecting to Salesforce from ADF?

sreelakshmins
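
A hedged sketch of one possible workaround, not something covered in the video: when OFFSET isn't available, pagination can sometimes be done on the record Id instead (keyset paging), keeping the highest Id copied so far in a pipeline variable such as lastId (a placeholder name). Whether the bulk query path used by the Salesforce connector accepts the WHERE, ORDER BY and LIMIT clauses this relies on would need to be verified for your org and API version.

    @concat('SELECT Id, Name FROM Account WHERE Id > ''', variables('lastId'), ''' ORDER BY Id LIMIT 5000')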

Great scenarios, will be very helpful, thanks so much :)

junaidmalik

Thanks for the video, mam, very helpful!

Can you also help with how we do a unit test on table rows and columns, i.e. check whether the copied table contents are correct or not? Thanks

roshankumargupta

Thank you, this is really helpful <3

rafikgyurjyan

👌 Nice explanation and clear example. And yes, I agree with Anand Kumar that this was a real-world example.
Just one suggestion, if I may: if the variables and activities were named properly, it would be easier to understand. For example, instead of variable8, variable9, Lookup1 etc., something meaningful. This would also set a great example and help people learn how to use naming conventions in ADF.

reallifevideos

Hi, I am a huge fan of your videos on ADF. Could you please make the same for Azure Synapse and Azure Databricks too? Thanks in advance.

debasreepal

Hi, could you please help me read a 5 MB JSON file from Blob Storage through the Lookup activity?

praveenkumar-jjor

That's awesome 😊 Do keep bringing more real-time scenarios like these.

AshwaniAshish

Hi mam, thank you for this video.
I have tried this, but in my case it takes 1 hour to execute 5000 records. Please help me with how to optimize the time.

kalpanamore
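
A hedged guess at one common tuning step, assuming the slowness comes from the ForEach processing its items one at a time (the comment doesn't say what runs inside the loop): the ForEach activity's typeProperties can allow parallel batches instead of sequential execution, for example

    "isSequential": false,
    "batchCount": 10

Parallelism only helps if the iterations are independent; the real bottleneck could just as easily be the activities inside the loop or the source and sink throughput.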

Hi, I saw your videos and they're good, but
could you please provide the If Condition for the divide variable?

Ap-ki-bokka
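
A minimal sketch of the kind of check being asked about, assuming a String variable named rowcount and a 5000-row page size as in the video (the exact condition used there may differ): an If Condition expression that tests whether the row count divides evenly by the page size, so an extra iteration is only added when there is a remainder.

    @equals(mod(int(variables('rowcount')), 5000), 0)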