Azure Data Factory - Partition a large table and create files in ADLS using copy activity

Comments

Just a small tip: for all kinds of I/O errors and connection timeout errors, you can set Connection Timeout in your database connection string; this works in ADF.

payalkalantri
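For context on the tip above: `Connection Timeout` is a standard keyword in SQL Server-style connection strings, and it can be added to the connection string of an ADF linked service. A minimal sketch, with hypothetical server, database, and timeout values:

```
Server=tcp:myserver.database.windows.net,1433;Database=mydb;User ID=myuser;Password=...;Connection Timeout=120;
```

Here the timeout of 120 seconds is only an example; pick a value that matches how long your source typically needs to establish a connection under load.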

When the source is an MPP system, it is not recommended to have more than 60 partitions.

Thegameplay
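For readers who have not seen the partition options mentioned in these comments: in the ADF copy activity, partitioned reads from a SQL source are configured with `partitionOption` and `partitionSettings` on the source. A minimal sketch of a copy-activity source, assuming a hypothetical table with a numeric `OrderId` column and example bounds:

```json
{
    "source": {
        "type": "AzureSqlSource",
        "partitionOption": "DynamicRange",
        "partitionSettings": {
            "partitionColumnName": "OrderId",
            "partitionLowerBound": "1",
            "partitionUpperBound": "1000000"
        }
    },
    "parallelCopies": 8
}
```

With `DynamicRange`, ADF splits the `[lower, upper]` range of the partition column into chunks and reads them in parallel (up to `parallelCopies` at a time); for physically partitioned SQL Server/Azure SQL tables, `PhysicalPartitionsOfTable` can be used instead. All names and values above are illustrative.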

How do we get the filename prefix maintained in the sink? Also, in your example, how did you achieve that? Can you share the sink details you used?

Kishyist

What settings did you use in the sink to do multiple writes?

mohitarora

Thank you for your awesome tutorials. Could you please share the sink parameters, as they are not shown in the video?

scarabic
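Since several comments ask about the sink configuration: when the sink is a file store such as ADLS Gen2 with a delimited-text format, the copy-activity sink can control the file name prefix and split the output into multiple files via the format write settings. A minimal sketch, assuming ADLS Gen2 and hypothetical values:

```json
{
    "sink": {
        "type": "DelimitedTextSink",
        "storeSettings": {
            "type": "AzureBlobFSWriteSettings"
        },
        "formatSettings": {
            "type": "DelimitedTextWriteSettings",
            "fileNamePrefix": "orders_part",
            "maxRowsPerFile": 1000000
        }
    }
}
```

With `maxRowsPerFile` set, ADF writes a new file whenever the row limit is reached, naming the files from `fileNamePrefix`; the prefix `orders_part` and the row limit here are purely illustrative.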

Can you please make a video for a source system like DB2, where these options do not show up in the copy activity? If we have a very big table there, what would be a better approach for that scenario?

nr

Very nice explanation; however, the partition options do not help with I/O exception errors in ADF.

payalkalantri