tf serving tutorial | tensorflow serving tutorial | Deep Learning Tutorial 48 (Tensorflow, Python)

Are you using Flask or FastAPI to serve your machine learning models? TF Serving is a tool that lets you bring up a model server with a single command. It also handles model version management and dynamic loading of models, and it supports features such as version labels and a configurable version policy. In this video, I will explain everything in very easy language.
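For a concrete picture of what "serving" means here, below is a minimal client-side sketch in Python. It assumes a TF Serving container is already running on localhost with the default REST port 8501 and a model registered under the name "my_model"; the model name, paths, and input values are placeholders, not taken from the video.

# Assuming the server was started with something like (paths and name are placeholders):
#   docker run -p 8501:8501 -e MODEL_NAME=my_model \
#     --mount type=bind,source=/path/to/my_model,target=/models/my_model -t tensorflow/serving
import requests

# TF Serving exposes a REST predict endpoint on port 8501 by default
url = "http://localhost:8501/v1/models/my_model:predict"

# Dummy input; the shape and values depend entirely on your model's signature
payload = {"instances": [[1.0, 2.0, 5.0]]}

response = requests.post(url, json=payload)
print(response.json())  # e.g. {"predictions": [[...]]}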

⭐️ Timestamps ⭐️
00:00 Introduction
00:24 What problem does tf serving solve?
04:44 tf serving installation
09:23 tf serving using model_base_path
14:05 serve different versions using a model config file (see the sketch below)
15:35 version labels
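The config-file and version-label steps above rely on TensorFlow Serving's model config file. A minimal sketch is shown below; the model name, base path, version numbers, and label names are placeholders rather than the exact values used in the video.

model_config_list {
  config {
    name: "my_model"
    base_path: "/models/my_model"
    model_platform: "tensorflow"
    model_version_policy {
      specific {
        versions: 1
        versions: 2
      }
    }
    version_labels {
      key: "stable"
      value: 1
    }
    version_labels {
      key: "canary"
      value: 2
    }
  }
}

The server is then pointed at this file with the --model_config_file flag, and a labeled version can be reached at a path such as /v1/models/my_model/labels/stable:predict (illustrative path).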

🔖Hashtags🔖

#tfservingexample #tfservingdockerfile #tfservingvsflask #tfservingmodel #tfserving #tfservingdeeplearning #tfservingmodeldeeplearning #deeplearningtfserving #deeplearningtfservingmodel #tensorflowservingtutorial

#️⃣ Social Media #️⃣

❗❗ DISCLAIMER: All opinions expressed in this video are my own and not those of my employer.
Comments

Thanks a ton to the instructor!! Very straightforward and to-the-point tutorial on TensorFlow Serving for beginners! Highly recommended!!!

jackycwwang

The best channel I have ever seen 🙏 Thanks for your wonderful content; the teaching is excellent.

KiranKumar-qexm

Me: Hey Google, which is the best channel to learn deep learning?
Google: code basics it is.

punnarahul

Holy smokes, exactly what I needed, no joke! 🤯

You're the best, thanks for the awesome material 😎👍

dec

I was not expecting this surprise, you are awesome 👍
If possible, can you make videos on TensorRT and TorchServe as well?

jaguar_akki

Hi sir, is it important to keep my files on the C: drive to make this work? My files were on the D: drive and I tried to follow the instructions, but after running the docker run command I couldn't see my saved models or any other files inside that folder. When I moved all the code to the C: drive, it started to work.

bholaprasad

Getting "'docker' is not recognized as an internal or external command, operable program or batch file" when trying to install tf serving. Does anyone have an idea why I am getting this error?

srihitharavu

Sir, can we use this method to deploy models on Heroku? And since we are using TensorFlow Serving, would we still need to include the tensorflow library in the requirements.txt file? (It is very large and slows down the whole system on Heroku.)

kanakmittal

Thanks for your tutorial. I configured everything as per your flow. When I use localhost it works, but when I use the public IP instead of localhost it does not. Can you give some suggestions? I am using an Ubuntu server.

chandruv

Sir, can you please make a video series on Docker? I just love the way you teach; it feels like I am making permanent memories in my mind.

programmingsolutions

Thanks a lot Bhai 👍.

Simple and the best explanation ever.

shrikanthpv

What about the exported BERT processor? Does it need to be served like the BERT model? Also, does it only output the model's probability? What about other responses?

pwchan

I would like to ask about inference: what if the input is an image? How do I send it, and how do I preprocess the image before inference?

AnimalLore_YT
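A possible approach for the question above, sketched with a placeholder file name, input size, and model name (the preprocessing must match whatever the model was trained with):

import numpy as np
import requests
from PIL import Image

# Resize and scale exactly as the training pipeline did (224x224 and /255 are assumptions)
img = Image.open("cat.jpg").convert("RGB").resize((224, 224))
arr = np.asarray(img, dtype=np.float32) / 255.0

# TF Serving's REST predict endpoint accepts the tensor as nested lists
payload = {"instances": [arr.tolist()]}
response = requests.post("http://localhost:8501/v1/models/my_model:predict", json=payload)
print(response.json()["predictions"])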

How do I bind the TensorFlow Serving REST API to 0.0.0.0? By default it binds to 127.0.0.1, so deploying to the cloud is problematic.

shirsendu_bairagi

Forget this; I'm sticking with Flask and FastAPI. We are looking for a customizable API serving our input and output; this is just bare-bones web serving and inflexible when things change 😅

voldemore

"localhost refused to connect" is the error I am getting. Before that, everything was exactly as you showed.

vishwarathtomar

tensorflow_text is not working, or the module is not found.

nkugnzh

Wow sir, how much time you are spending, what a hardworking person you are.

deepakharshaprogaming

Thanks for the content. My question is: if I have made the saved_model accept an image as a tf_example, how do I send the request?

cadandprogramminguniverse
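If the SavedModel's signature takes a serialized tf.Example (a string tensor), one common approach is to base64-encode the serialized example in the request, since TF Serving's REST API represents binary values as {"b64": ...}. A sketch with a placeholder feature key and model name:

import base64
import requests
import tensorflow as tf

with open("cat.jpg", "rb") as f:
    image_bytes = f.read()

# Build a tf.train.Example; the feature key "image/encoded" is an assumption, use your signature's key
example = tf.train.Example(features=tf.train.Features(feature={
    "image/encoded": tf.train.Feature(bytes_list=tf.train.BytesList(value=[image_bytes])),
}))

# Binary (string tensor) values are wrapped as {"b64": ...} in the JSON body
payload = {"instances": [{"b64": base64.b64encode(example.SerializeToString()).decode("utf-8")}]}
response = requests.post("http://localhost:8501/v1/models/my_model:predict", json=payload)
print(response.json())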

Can we install Docker on Windows 8? I am not getting any option for it.

sagarnarula