Tutorial 4 - Build, Train, Deploy Machine Learning Model In AWS SageMaker - Uploading Files In S3 Bucket


Please donate if you want to support the channel through GPay UPI ID,

Please join my channel as a member to get additional benefits like Data Science materials, live streaming for members, and many more

Please do subscribe to my other channel too

Connect with me here:
Comments

For the past one week I was literally confused about how to get things done using AWS SageMaker, even after watching AWS beginner videos. But right now it's very much clearer than before!! Thank you so much sir!

blue_sapphire

Hi Krish and all the learners, just an update. For learners like me who are trying to execute the above code in their Jupyter instance, check the sagemaker version first with the command sagemaker.__version__. If it is "2.24.1", then sagemaker.s3_input won't work. Instead you need to write sagemaker.session.s3_input. Thanks for the video Krish.

geekypratyush
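
A minimal sketch of the version check described in the comment above, choosing an input class to match the installed SDK (this branch on the major version is an illustration, not the notebook's exact code, and assumes the SageMaker Python SDK is installed on the instance):

import sagemaker

# Print the installed SageMaker Python SDK version, e.g. "2.24.1".
print(sagemaker.__version__)

# Pick the input class that matches the installed SDK: v1 exposes
# sagemaker.s3_input, while v2 renamed it to TrainingInput.
if int(sagemaker.__version__.split(".")[0]) >= 2:
    from sagemaker.inputs import TrainingInput as data_input
else:
    data_input = sagemaker.s3_input  # only present in SDK v1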

Hi Krish, Thanks for guiding us about TrainingInput instead of s3_input.
s3_input_test = sagemaker.TrainingInput(s3_data='s3://{}/{}/test'.format(bucket_name, prefix), content_type='csv'). This is good content and it helps us with Machine Learning😀

jayasaichandrareddymandala

For the latest version of the SageMaker SDK, you need to rename 'sagemaker.s3_input' to 'sagemaker.inputs.TrainingInput' and it should work. So it becomes: s3_input_train = sagemaker.inputs.TrainingInput(s3_data='s3://{}/{}/train'.format(bucket_name, prefix), content_type='csv')

johnnyzhao
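
Pulling the snippets from the comments above together, a sketch of how both channels are typically built and handed to an estimator on SDK v2 (bucket_name, prefix, and the already-configured estimator are assumed to exist earlier in the notebook; the channel names follow the built-in XGBoost convention):

from sagemaker.inputs import TrainingInput

# bucket_name and prefix are assumed to be defined earlier in the notebook.
s3_input_train = TrainingInput(
    s3_data='s3://{}/{}/train'.format(bucket_name, prefix), content_type='csv')
s3_input_test = TrainingInput(
    s3_data='s3://{}/{}/test'.format(bucket_name, prefix), content_type='csv')

# estimator is assumed to be an already-configured sagemaker Estimator;
# the built-in XGBoost container reads the 'train' and 'validation' channels.
estimator.fit({'train': s3_input_train, 'validation': s3_input_test})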

The AWS SageMaker tutorials are excellent; even a layman like me has understood them.

bhopinderkambo

Newbie to YT. Instructional videos are so helpful and informative when there is so much to learn. 👍😀

fififinance

Man, you are such an amazing guy; you made things very clear. Thank you very much, Krishna! /\ Please, if possible, also make videos on Lambda, Redshift, RDS and

manideepgupta

Thanks a lot sir, really impressed with the teaching; I learnt many important concepts in this series.

srikaranudeepremani

Disable your ad-blocker for this channel. So much dedication for free.

tushihahahi

Everything made easier, crystal clear!

Pysummercamp

Nice video. I am confused about how the data is actually imported in the first place. We are downloading it from some URL. But what is that URL? How did the data end up there? Is that another S3 instance? Can you please explain more about that urllib function and the URL that you used to download the data in the first place?

durgeshmajeti
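
On the urllib question above: the notebook simply fetches a CSV that is hosted at a public HTTPS address and then reads it like any local file; a sketch with a placeholder URL (the actual link is the one shown in the video, not reproduced here):

import urllib.request
import pandas as pd

# Placeholder address: the tutorial's file is just a publicly readable CSV
# served over HTTPS; substitute the URL from the notebook.
DATA_URL = "https://example.com/bank_clean.csv"

# Download the file onto the notebook instance's local disk...
urllib.request.urlretrieve(DATA_URL, "bank_clean.csv")

# ...then load it as an ordinary local CSV.
model_data = pd.read_csv("bank_clean.csv", index_col=0)
print(model_data.shape)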

Hi Krish, I have a query. Suppose your train data is 1 TB or more. In that case, how can I save my train_data dataframe directly to S3? Because right now we are saving the train_data dataframe in the current instance as train.csv, and from there we are uploading this train.csv to S3. So how can I avoid this middle step, i.e. saving the dataframe to the instance as a CSV?

arijit
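
On the question above about skipping the intermediate train.csv: one option is to serialize the DataFrame in memory and put it straight into S3 with boto3, sketched below (bucket_name and prefix are assumed from the notebook; for data approaching 1 TB you would stream or multipart-upload in chunks rather than buffer everything in memory):

import io
import boto3
import pandas as pd

def upload_dataframe_to_s3(df: pd.DataFrame, bucket_name: str, key: str) -> None:
    # Serialize the DataFrame to CSV in memory and write it directly to S3.
    buffer = io.StringIO()
    df.to_csv(buffer, header=False, index=False)
    boto3.client("s3").put_object(Bucket=bucket_name, Key=key, Body=buffer.getvalue())

# Example call, reusing the tutorial's layout (train_data is the DataFrame
# built earlier in the notebook):
# upload_dataframe_to_s3(train_data, bucket_name, '{}/train/train.csv'.format(prefix))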

Small change required as per the latest release:
s3_input_train = sagemaker.TrainingInput(s3_data='s3://{}/{}/train'.format(bucket_name, prefix), content_type='csv')

ipvikas

ClientError: An error occurred (AccessDenied) when calling the CreateBucket operation: Access Denied
I'm getting this error. Does anyone have an idea how to solve this error?

RishiKumar-dvck
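
On the AccessDenied question above: that error usually means the IAM role attached to the notebook instance is not allowed to call s3:CreateBucket, so the fix is in the role's policy rather than the code; still, a defensive sketch of the bucket-creation step (bucket_name is an assumed example, and us-east-1 must be created without a LocationConstraint):

import boto3

bucket_name = 'bankapplication'  # bucket names are globally unique across S3
region = boto3.session.Session().region_name

s3 = boto3.resource('s3')
try:
    if region == 'us-east-1':
        # us-east-1 rejects an explicit LocationConstraint.
        s3.create_bucket(Bucket=bucket_name)
    else:
        s3.create_bucket(Bucket=bucket_name,
                         CreateBucketConfiguration={'LocationConstraint': region})
    print('S3 bucket created successfully')
except Exception as e:
    # An AccessDenied here points at the notebook's execution role lacking
    # the s3:CreateBucket permission, not at the Python code.
    print('S3 error:', e)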

Could you please make this playlist public? This will help beginners like me learn a lot. Please consider my request.

thalanayarswaminathansaipr

Please make a video for a deep learning model in AWS.

ArunKumar-sgjf

Sir, at what time is the live session today? Excited to meet the mentors!!

ayushgoel

I have a question: when we create an S3 bucket in order to store data, what is the size of the S3 bucket?

aayushjain

Hello Krish, I have 6 years of experience in finance + digital marketing. I got to know about AWS from my previous organization and I have done the certification. Could you please help me with how I should improve my resume to get a good job in AWS?

sujatabehera

@Krish Naik - I am not able to see the S3 bucket with the bankapplication name. It shows 0 instances. When I tried manually creating an S3 bucket with the same name, it says a bucket with the same name already exists. I refreshed and logged in multiple times. Please let me know how to sort out the issue.

rajatnipane