Demystifying Open Source Model Deployment At Hugging Face: Introducing Spaces & Inference Endpoints

Let's talk models, specs, and inference endpoints. We will look at Spaces and Inference Endpoints at Hugging Face, and discuss how to choose hardware for the models you want to deploy.
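As a starting point for hardware sizing, a common rule of thumb is parameter count times bytes per parameter, plus some headroom for activations and the KV cache. This is a rough sketch, not an official Hugging Face formula; the function name and the 20% overhead factor are my own assumptions.

```python
def estimate_model_memory_gb(num_params: float,
                             bytes_per_param: float = 2.0,
                             overhead: float = 1.2) -> float:
    """Rough inference-memory rule of thumb (an assumption, not a spec):
    parameters * bytes per parameter (fp16/bf16 = 2, int8 = 1, int4 = 0.5),
    scaled by ~20% headroom for activations and the KV cache."""
    return num_params * bytes_per_param * overhead / 1024**3

# A 7B-parameter model in fp16 comes out around 15-16 GB,
# so it fits a 24 GB GPU (e.g. an A10G) but not a 16 GB T4.
print(round(estimate_model_memory_gb(7e9), 1))  # → 15.6
```

Quantizing to int8 or int4 halves or quarters the `bytes_per_param` term, which is often what makes a model fit on cheaper endpoint hardware.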

The features of Spaces and Inference Endpoints are compared, followed by a practical introduction to model deployment.
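Once an Inference Endpoint is deployed, it is queried over plain HTTPS with a bearer token and an `{"inputs": ...}` JSON body. A minimal sketch follows; the endpoint URL and the `hf_xxx` token are placeholders (copy the real values from your endpoint's overview page), and `build_request` is a hypothetical helper, not part of any Hugging Face library.

```python
import json

# Placeholder URL -- replace with the one shown on your endpoint's page.
ENDPOINT_URL = "https://YOUR-ENDPOINT.endpoints.huggingface.cloud"

def build_request(prompt: str, token: str) -> tuple[dict, bytes]:
    """Build the headers and JSON body a text endpoint expects:
    a bearer token and an {"inputs": ...} payload."""
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    body = json.dumps({"inputs": prompt}).encode("utf-8")
    return headers, body

# With the `requests` library installed, you would send it like this:
#   requests.post(ENDPOINT_URL, headers=headers, data=body)
headers, body = build_request("Hello, world!", "hf_xxx")
```

The same request shape works for a Space that exposes an API, which is one reason the two deployment options are easy to compare side by side.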

The data and code are located at

This knowledge is also useful when you plan to deploy on your own hardware. Knowing what others have done, your team can tune your hardware and install the software needed for inference to work.

I hope you like this video; please subscribe to the channel. Further uploads related to Big Data, Large Language Models, and Artificial Intelligence will appear directly on your YouTube dashboard.

The supporting playlists are
Practical Projects Playlist
Huggingface Playlist
Python Data Engineering Playlist
Python Ecosystem of Libraries
ChatGPT and AI Playlist
AWS and Python AWS Wrangler

PS: Got a question or have feedback on my content? Get in touch
By leaving a Comment in the video
Twitter handle: @KQrios
Comments

Well explained Spaces and Inference Endpoints

prakharjain

Chad developer has made a comprehensive getting-started guide that is better than the official docs, thanks!

dmitrypuchkov