BERT: transfer learning for NLP
In this video we present BERT, a transformer-based language model. BERT is pre-trained in a self-supervised manner on a large text corpus. We can then apply transfer learning and fine-tune the model for a new task, obtaining good performance even with a limited annotated dataset for the specific task we would like to solve (e.g., a text classification task).
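The fine-tuning step described above can be sketched with the Hugging Face `transformers` library (an assumption; the video does not name a specific library). To keep the sketch self-contained and offline, it builds a tiny randomly initialised BERT from a `BertConfig` instead of downloading a checkpoint; in real transfer learning you would instead load pre-trained weights with `BertForSequenceClassification.from_pretrained("bert-base-uncased")` and train on your labelled data.

```python
# Hedged sketch: one fine-tuning step of a BERT-style model with a
# classification head. The tiny config below is illustrative only;
# real transfer learning starts from pre-trained weights via
# BertForSequenceClassification.from_pretrained("bert-base-uncased").
import torch
from transformers import BertConfig, BertForSequenceClassification

config = BertConfig(
    vocab_size=100, hidden_size=32, num_hidden_layers=2,
    num_attention_heads=2, intermediate_size=64,
    num_labels=2,  # binary text classification head
)
model = BertForSequenceClassification(config)

# Toy batch of token ids and labels (in practice a tokenizer produces the ids).
input_ids = torch.randint(0, 100, (4, 16))
labels = torch.tensor([0, 1, 0, 1])

optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
outputs = model(input_ids=input_ids, labels=labels)
outputs.loss.backward()  # gradients flow through the whole network
optimizer.step()         # one fine-tuning step

print(outputs.logits.shape)  # torch.Size([4, 2]): one score per class per example
```

Because the classification head and the pre-trained encoder are trained jointly, even a small labelled dataset can adapt the model to the new task.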