Fine Tuning Transformer (BERT) for Customer Review Prediction | NLP | HuggingFace | Machine Learning

🔥🐍 Check out the MASSIVELY UPGRADED 2nd Edition of my Book (with 1300+ pages of Dense Python Knowledge) Covering 350+ Python 🐍 Core Concepts


#NLP #machinelearning #datascience #textprocessing #kaggle #tensorflow #pytorch #deeplearning #deeplearningai #pythonprogramming #100DaysOfMLCode
Comments
Author

Great content. Just a suggestion: whenever you write a tensor for the first time, write its dimensions as a comment. If the dimensions change (for example, after taking argmax), note the updated dimensions alongside.

thegreatlazydazz
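The shape-annotation habit suggested in the comment above can be sketched with NumPy (the sizes here are illustrative, not from the video):

```python
import numpy as np

# Batch of classifier outputs: 4 examples, 5 star classes.
logits = np.random.randn(4, 5)    # (batch_size, num_classes) = (4, 5)

# argmax over the class axis collapses that dimension.
preds = logits.argmax(axis=1)     # (batch_size,) = (4,)
```

Keeping the comment in sync after every shape-changing op (argmax, squeeze, pooling) makes dimension bugs much easier to spot.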
Author

Thanks for the tutorial. For the cross-entropy requirement we needed the range 0-4 instead of 1-5; couldn't we use just one line of code for the mapping instead of a multiline function? I mean this line: df['stars'] = df['stars'] - 1

One more question: do we need to balance the class sizes?

orkhanamrullayev
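The one-line mapping asked about above does work, since cross-entropy in PyTorch expects labels in 0..num_classes-1. A minimal pandas sketch (the sample ratings are illustrative):

```python
import pandas as pd

df = pd.DataFrame({"stars": [1, 2, 3, 4, 5, 5]})

# One-line shift from 1-5 star ratings to the 0-4 labels cross-entropy expects.
df["stars"] = df["stars"] - 1

# Quick class-balance check; heavy skew may call for resampling or class weights.
print(df["stars"].value_counts())
```

On balancing: strictly necessary only if the distribution is very skewed; class weights in the loss are a common lighter-weight alternative to resampling.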
Author

Could this be treated as a regression problem, since the review ratings are numerical in nature?

sohambasu
Author

How do you fine-tune BERT for a Roman Urdu dataset?

rashdakhanzada
Author

Could you achieve this using just the Hugging Face Trainer API instead of doing it all manually in PyTorch?

compeng
Author

Thanks for the nice, educational videos!

kuberchaurasiya