AWS Read CSV file data from S3 via Lambda function and put into DynamoDB

I am going to demonstrate the following:
1. How to read the contents of a CSV file from S3 in a Lambda function.
2. How to integrate S3 with a Lambda function and trigger the Lambda function on every S3 put event.
3. How to write the data to DynamoDB from the Lambda function.
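The three steps above can be sketched in a single handler. This is a minimal illustration, not the exact code from the video: `my-table` is a placeholder table name, the CSV is assumed to have a header row, and `boto3` is assumed to be available (it is preinstalled in the AWS Lambda Python runtime).

```python
import csv
import io


def parse_csv(text):
    """Turn CSV text with a header row into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))


def lambda_handler(event, context):
    import boto3  # preinstalled in the AWS Lambda Python runtime

    s3 = boto3.client("s3")
    table = boto3.resource("dynamodb").Table("my-table")  # placeholder name

    # The S3 put event tells the function which object triggered it.
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    rows = parse_csv(body)
    for row in rows:
        table.put_item(Item=row)  # one DynamoDB item per CSV row
    return {"status": "ok", "rows": len(rows)}
```

The handler is wired up by adding an S3 "ObjectCreated" trigger to the Lambda function, and its execution role needs `s3:GetObject` plus `dynamodb:PutItem` permissions.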

Comments

best, most practical tutorial I have found. Thank you.

jordanfine

Thanks a lot... you made it so easy to understand. Please explain further 😁🤘👍👍

ShahrukhKhan-nkqg

It's a great video. Very helpful and detailed. Thank you!

jorgel

You have hardcoded the column names, since there are only 3 values. What if there are hundreds of columns? How would you build the Item object dynamically? Can you please explain that?

bharathreddy
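One possible answer (a sketch, not from the video): `csv.DictReader` takes the attribute names from the header row, so nothing needs to be hardcoded no matter how many columns there are. The empty-cell filtering is an assumption based on older DynamoDB behavior, which rejected empty string attribute values.

```python
import csv
import io


def rows_to_items(csv_text):
    """Build DynamoDB Item dicts dynamically from the CSV header row,
    so the code works unchanged with hundreds of columns."""
    reader = csv.DictReader(io.StringIO(csv_text))
    # Older DynamoDB versions rejected empty string attribute values,
    # so drop empty cells instead of writing them.
    return [{k: v for k, v in row.items() if v != ""} for row in reader]
```

Each dict returned here can be passed straight to `table.put_item(Item=...)`.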

I need to do this using Node.js instead. Has anyone seen an example?

caamrb

So do we need to keep the file name the same every time we upload to S3?

engineerhoon

I have a CSV file with a text column that contains commas, say something like "name1, name2". When I try to import the CSV, that value is split into multiple columns: name1 into one column and name2 into another. Can you please help?

upendrad
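A likely cause (a sketch, not from the video): splitting each line on `","` by hand breaks quoted fields, while the standard `csv` module follows RFC 4180 quoting and keeps a quoted field intact even when it contains commas.

```python
import csv
import io

line = 'id,names\n7,"name1, name2"\n'

# Naive split tears the quoted field apart:
naive = line.splitlines()[1].split(",")  # ['7', '"name1', ' name2"']

# csv respects the quoting and keeps the field whole:
rows = list(csv.reader(io.StringIO(line)))  # rows[1] == ['7', 'name1, name2']
```

So the fix is to make sure fields containing commas are quoted in the file, and to parse with `csv.reader`/`csv.DictReader` rather than `str.split`.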

While executing the Lambda function I am getting this error: "'utf-8' codec can't decode byte 0x82 in position 16: invalid start byte".

Please help!!!

dineshvijay
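A likely explanation (a sketch, not from the video): byte 0x82 is never a valid UTF-8 start byte, which usually means the file was saved in another encoding, for example cp1252 from an Excel export. One hedged approach is to try UTF-8 first and fall back:

```python
def decode_csv_bytes(raw: bytes) -> str:
    """Try UTF-8 first; fall back to cp1252 (a common Excel export
    encoding where 0x82 is a valid byte), then replace anything left."""
    for encoding in ("utf-8", "cp1252"):
        try:
            return raw.decode(encoding)
        except UnicodeDecodeError:
            continue
    return raw.decode("utf-8", errors="replace")
```

The cleaner long-term fix is to re-save the source file as UTF-8 before uploading it to S3.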

Good content. I have multiple CSV files (uploaded from HDFS, each with a header) in an S3 bucket folder, and the columns are the same across files. How do we import these into DynamoDB? The CSV file names in S3 will not be the same every time.

jnanakshatriya
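One possible answer (a sketch, not from the video): the file name does not need to be hardcoded at all, because each S3 put event carries the bucket and key of the object that was uploaded, and every upload triggers its own invocation. The same applies to multiple buckets, as long as each bucket has the trigger configured.

```python
def objects_from_event(event):
    """Pull (bucket, key) pairs out of an S3 put-event payload, so the
    handler works for any file name and any bucket with the trigger."""
    return [
        (r["s3"]["bucket"]["name"], r["s3"]["object"]["key"])
        for r in event.get("Records", [])
    ]


# Shape of a (heavily trimmed) S3 notification event, for illustration:
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "my-bucket"},
                "object": {"key": "exports/2024/data.csv"}}}
    ]
}
```

Reading the bucket and key from the event like this means ten differently named files produce ten invocations, each importing its own file.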

Will this work if we have multiple buckets?

quamaraziz