BERT 05 - Pretraining And Finetuning

In this video, we will learn how to pre-train the BERT model. But what does pre-training mean? Say we have a model. First, we train it on a huge dataset for a particular task and save the trained weights. Now, for a new task, instead of initializing a new model with random weights, we initialize it with the weights of the already trained (pre-trained) model. Because the model has already learned from a huge dataset, we do not need to train a new model from scratch for the new task; we take the pre-trained model and adjust (fine-tune) its weights for the new task. This is a form of transfer learning.
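
As a rough illustration of this workflow, here is a minimal sketch using the Hugging Face transformers library. It is not code from the video; the checkpoint name, label count, example sentence, and learning rate are illustrative assumptions. The idea it shows is the one described above: load pre-trained BERT weights, attach a new task head, and fine-tune all weights on the new task.

```python
# Minimal fine-tuning sketch (illustrative; settings below are assumptions,
# not from the video).
import torch
from transformers import BertTokenizer, BertForSequenceClassification

# Load weights learned during pre-training; only the new classification
# head on top is randomly initialized.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# A toy labeled example for the new task (binary sentiment classification).
inputs = tokenizer("I really enjoyed this movie!", return_tensors="pt")
labels = torch.tensor([1])  # 1 = positive

# One fine-tuning step: both the pre-trained weights and the new head
# are adjusted toward the new task.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
model.train()
outputs = model(**inputs, labels=labels)
outputs.loss.backward()
optimizer.step()
```

In practice this loop would run over a full labeled dataset for a few epochs, but the key point is the first step: starting from pre-trained weights rather than random initialization.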

Comments

Stepping up day by day✨ Thank you for sharing the knowledge 😁

yuvarajneelagandan

Hello bro.
How can I contact you? I need your help with a machine learning related startup.

mahender

Watched all your videos on BERT; they're good.
But a core-level practical explanation of how it actually works is missing.
If you could do a video on that, it would be great.

karthik