Understanding and Applying BERT | Bidirectional Encoder Representations from Transformers | NLP | Py
BERT is an open-source machine learning framework for natural language processing (NLP) developed by the Google AI team. It has led to state-of-the-art models that have made significant breakthroughs on common problems such as natural language inference, question answering, sentiment analysis, and text summarization.
I go through the basic theory, architecture, and implementation, and in no time you will be conversational in this brilliant architecture!
Feel free to support me! Do know that just viewing my content is plenty of support! 😍
Watch Next?
🔗 My Links 🔗
📓 Requirements 🧐
Intermediate or advanced Python knowledge
Google Account
⌛ Timeline ⌛
0:00 - BERT Importance
1:05 - BERT Architecture
1:39 - Pre-training Phase: MLM and NSP (see the MLM sketch below)
5:25 - Fine-tuning
6:58 - BERT Code Implementation: CMD or Notebook
9:51 - Create Tokenizer and Important Features (see the tokenizer sketch below)
11:45 - Transforming Text to BERT Input
12:38 - Fine-Tuning Model, Testing, and Predictions (see the fine-tuning sketch below)
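
To make the pre-training discussion at 1:39 concrete, here is a minimal sketch of the masked language model (MLM) objective using the Hugging Face transformers library. The library choice and the example sentence are my assumptions, not necessarily the exact stack used in the video.

```python
# Minimal MLM sketch (assumes the Hugging Face `transformers` library,
# which may differ from the video's exact stack).
from transformers import pipeline

# A pretrained BERT predicts the word hidden behind [MASK], which is
# exactly the masked-language-model task BERT is pre-trained on.
unmasker = pipeline("fill-mask", model="bert-base-uncased")
for pred in unmasker("The capital of France is [MASK]."):
    print(f"{pred['token_str']:>10}  score={pred['score']:.3f}")
```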
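
For the tokenizer and input-transformation steps at 9:51 and 11:45, here is a minimal sketch of loading a BERT tokenizer and turning raw text into the input_ids and attention_mask features BERT expects, again assuming Hugging Face transformers rather than the video's exact setup.

```python
# Tokenizer sketch (assumes Hugging Face `transformers`; the checkpoint
# name is the standard public one, not necessarily the video's choice).
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

encoded = tokenizer(
    "BERT reads text in both directions.",
    padding="max_length",  # pad every example to the same length
    truncation=True,
    max_length=32,
    return_tensors="pt",   # PyTorch tensors; use "tf" for TensorFlow
)
print(encoded["input_ids"])       # WordPiece ids, starting with [CLS]
print(encoded["attention_mask"])  # 1 for real tokens, 0 for padding
```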
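
And for the fine-tuning, testing, and prediction step at 12:38, here is a rough sketch of fine-tuning BERT for binary sentiment classification. The toy batch, labels, and hyperparameters are illustrative assumptions, not the video's actual data or settings.

```python
# Fine-tuning sketch (toy data and hyperparameters are assumptions).
import torch
from transformers import BertForSequenceClassification, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # binary sentiment head
)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# One toy training step on a tiny labeled batch.
batch = tokenizer(["great movie", "terrible movie"],
                  padding=True, return_tensors="pt")
labels = torch.tensor([1, 0])  # 1 = positive, 0 = negative

model.train()
optimizer.zero_grad()
loss = model(**batch, labels=labels).loss  # cross-entropy, computed internally
loss.backward()
optimizer.step()

# Prediction on unseen text.
model.eval()
with torch.no_grad():
    logits = model(**tokenizer("loved it", return_tensors="pt")).logits
print(logits.softmax(dim=-1))  # class probabilities
```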
🏷️Tags🏷️:
Machine Learning, BERT, Bidirectional Encoder Representations from Transformers, Statistics, Jupyter Notebook, Python, natural language processing, NLP, transformer, encoder, Google, AI, Google AI, tutorial, how to, code, machine, GPU, Google Colab, GitHub, pretraining, fine tuning, sentiment, Twitter, predictions, AUC, MLM, NSP, Masked Language Model, Next Sentence Prediction
🔔Current Subs🔔:
3,033