BERT 01 - Introduction

Bidirectional Encoder Representations from Transformers (BERT) has revolutionized the world of natural language processing (NLP) with promising results.

We will begin the playlist by understanding what BERT is and how it differs from other embedding models.
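As a rough preview of that difference (not taken from the video), here is a minimal sketch using the Hugging Face transformers library and the bert-base-uncased checkpoint: unlike static embeddings such as word2vec or GloVe, BERT gives the same word a different vector depending on its context. The example sentences and the embed_word helper are illustrative assumptions, not material from the playlist.

import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed_word(sentence, word):
    # Tokenize and run BERT; last_hidden_state holds one 768-dim vector per token.
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    # Take the vector for the first occurrence of `word` (it stays a single
    # WordPiece token in these example sentences).
    return hidden[tokens.index(word)]

v_money = embed_word("he deposited cash at the bank", "bank")
v_river = embed_word("he sat on the river bank", "bank")

# Static embeddings would return identical vectors for "bank" in both sentences;
# BERT's contextual vectors differ, so the cosine similarity is below 1.0.
print(torch.cosine_similarity(v_money, v_river, dim=0).item())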

#nlp #bert #datascience #machinelearning #deeplearning
Comments

Superb, need more like this on data science techniques.

Yohasree

Nice video, do more videos. Thanks for sharing your knowledge.

RAZZKIRAN

I have seen BERT videos and read articles I never understood... you made it so simple and clear. Now when I watch other videos on BERT and transformers, it's a cakewalk. Thanks Balaji, great work ❤

terryterry

Thank you for making this topic easy for me; I have to submit a thesis on this.

anam

Brother... just thank you, brother, just thank you.

themadridstar

Hello sir, can I use the transformer (BERT) architecture for financial fraud detection or credit card fraud detection?

sreebvmcreation