Azure Data Factory - Filter and Foreach activities (Part 4)

In this video you will learn about the Filter and ForEach activities: specifically, how to use the Filter activity to filter down an array, and then how to use the ForEach activity to iterate over the filtered output array and perform a series of actions on each item in that list.
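The Filter-then-ForEach pattern described above can be sketched in ADF pipeline JSON roughly as follows. This is a minimal illustration, not the exact pipeline from the video; the activity names (`GetFileList`, `FilterCsvFiles`, `ProcessEachFile`) and the `.csv` condition are assumptions for the sketch.

```json
[
  {
    "name": "FilterCsvFiles",
    "type": "Filter",
    "typeProperties": {
      "items": { "value": "@activity('GetFileList').output.childItems", "type": "Expression" },
      "condition": { "value": "@endswith(item().name, '.csv')", "type": "Expression" }
    }
  },
  {
    "name": "ProcessEachFile",
    "type": "ForEach",
    "dependsOn": [ { "activity": "FilterCsvFiles", "dependencyConditions": [ "Succeeded" ] } ],
    "typeProperties": {
      "items": { "value": "@activity('FilterCsvFiles').output.value", "type": "Expression" },
      "activities": [ ]
    }
  }
]
```

Inside the ForEach, each element of the filtered array is available as `@item()`, which inner activities (copy, stored procedure, etc.) can reference.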

Azure Data Factory - Metadata Activity (Part 1)
Azure Data Factory - Stored Procedure Activity (Part 2)
Azure Data Factory - Lookup and If Condition Activities (Part 3)
Comments

Great format! Extra plus for the interesting board games in the background!

jakobwallgren

Hi Mitchell,
Nice article, happy to read it. I just wanted to know: is there any way to get all the files from a folder based only on extension instead of file name, to process all files that have some predefined extension set in a variable or pipeline parameter? Thanks, waiting for your reply.

AshishKumar-qrzs
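The filter-by-extension request above maps naturally onto the Filter activity's condition expression: compare each file name against a pipeline parameter holding the extension. A hedged sketch, assuming a Get Metadata activity named `GetFileList` and a pipeline parameter named `fileExtension` (both names hypothetical):

```json
{
  "name": "FilterByExtension",
  "type": "Filter",
  "typeProperties": {
    "items": { "value": "@activity('GetFileList').output.childItems", "type": "Expression" },
    "condition": { "value": "@endswith(item().name, pipeline().parameters.fileExtension)", "type": "Expression" }
  }
}
```

Running the pipeline with `fileExtension` set to, say, `.csv` would pass only matching files to a downstream ForEach.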

Hi, thanks! Very well explained, and it does help me.

prashishshirsat

Please let us know where you placed these stored procedures and files. Is there any git repository for this?

sureshpallapolua

Thank you Mitchell, you kept it short. Liked it.

henrybroadcast

Hi sir, I need help: how can we add a header row and its detail rows to a CSV file from a database table?

sunrathod

Thank you for the excellent video content.

davidvinh

MitchellPearson, your videos are excellent and easy to learn from. Expecting more videos from you.

rammohankallyam

Very nicely explained 👌 I'm your new subscriber 🔔 keep making more videos like this ☺️

TheRockAbhi

Excellent content. Awaiting many more videos on Azure data.

kadamteja

Good day Mitchell, this is a nice, simple presentation of how to do the Get-Filter-Store activities. It helped. Thanks a bunch. 👍

sanjeeviraja

Hi Mitchell, this was a very useful and very nicely explained video. I am asking this question after watching the next part, on using ForEach to copy and delete the files as well. I need to make REST calls inside a ForEach activity to the Yammer API for all the ids present in a large CSV file (>10000 rows).
1. Lookup has a 5000-row limit, so what is a good strategy for handling a large volume of data to feed to the ForEach loop?
2. If one REST call inside the ForEach activity (a Copy activity with the REST connector) fails, how do I inspect the current item() value for the failed iteration?
3. The Yammer API has a rate limit. How do I make sure that I throttle and adjust the rate of API calls within the ForEach loop when I have set the loop to execute in parallel mode?
I will greatly appreciate your response and insights on these. Thank you!

SuperUtubeian
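On the throttling part of the question above: ForEach exposes `isSequential` and `batchCount` properties, and capping `batchCount` limits how many iterations run in parallel, which is one way to stay under an API rate limit. A hedged sketch, with illustrative names (`LookupIds`, `CallYammerApiForEachId`) and an assumed concurrency of 4:

```json
{
  "name": "CallYammerApiForEachId",
  "type": "ForEach",
  "typeProperties": {
    "isSequential": false,
    "batchCount": 4,
    "items": { "value": "@activity('LookupIds').output.value", "type": "Expression" },
    "activities": [ ]
  }
}
```

Setting `isSequential` to `true` instead would force one call at a time, which is the simplest (if slowest) throttle.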

Great video, thanks. How could I export the list of tables within a schema in a database to a CSV file or a table using a Lookup? Not the content of the tables, thanks again.

mrboloban

Great video! It answers every question that I have. Thanks.

stephanegagnon

Great video, I appreciate your time and effort. Azure Data Factory looks very confusing compared to SSIS.

MikeBaron

Very well detailed explanation, thanks!

josericardo

Great video, I'm learning ADF right now, and this is very similar to what I'm trying to accomplish. We have 4 different types of files coming in daily, so I have 4 different table structures to load. I want to get the names of the files from an FTP server and, based on their names, load them into a staging table on SQL Server. The file name has info in it which indicates which table it goes to. I wanted to include the actual file name as a column in the SQL table that I load, so that I know what data file that info came in on. I'm with you so far through your first 2 videos, and I am hoping to go through the next one, where you load the files to SQL tables.

JeffAffeld