Mastering HuggingFace Transformers: Step-By-Step Guide to Model Finetuning & Inference Pipeline

AI models are built with the help of neural networks, which are implemented in frameworks like PyTorch and TensorFlow. The Transformers library abstracts away the repetitive aspects of building model architectures and provides an easier way to create, load, and run inference with AI models. This video gives an overview of the transformers modules.
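As a rough illustration of that abstraction (a minimal sketch, not the exact code from the video; the checkpoint name is an assumed example), a single pipeline() call downloads a pretrained model and tokenizer and runs inference:

# Minimal sketch: the Transformers pipeline hides model-architecture boilerplate.
from transformers import pipeline

# Assumed example checkpoint; the video's own model and data may differ.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("Hugging Face makes loading pretrained models easy."))
# Expected output shape: [{'label': 'POSITIVE', 'score': ...}]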

The data and the code are located at

Hope you like this video; please subscribe to the channel. Further uploads related to Big Data, Large Language Models, and Artificial Intelligence will be shared to your YouTube dashboard directly.

The supporting playlists are
Practical Projects Playlist
Huggingface Playlist
Python Data Engineering Playlist
Python Ecosystem of Libraries
ChatGPT and AI Playlist
AWS and Python AWS Wrangler

PS: Got a question or have feedback on my content? Get in touch
By leaving a comment on the video
My Twitter handle is @KQrios