Language Processing with BERT: The 3 Minute Intro (Deep learning for NLP)

Since its introduction in 2018, the BERT machine learning model has continued to perform well on a wide range of language tasks. This video is a gentle introduction to some of the tasks that BERT can handle (in search engines, for example). The first three minutes go over some of its applications. The video then discusses how the model works at a high level (and how you might use it to build a semantic search engine that is sensitive to the meanings of queries and results).

Introduction (0:00)
You have used BERT (applications) (0:25)
How BERT works (2:52)
Building a search engine (4:30)
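The semantic search idea from the description can be sketched in a few lines of plain Python. The embeddings below are hypothetical toy stand-ins for BERT sentence vectors (real ones are typically 768-dimensional, e.g. the [CLS] output of a sentence-encoder model); the sketch only shows the ranking step, where documents are ordered by cosine similarity to the query:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy stand-ins for BERT sentence embeddings (real ones have ~768 dimensions).
doc_embeddings = {
    "how to train a neural network": [0.9, 0.1, 0.2],
    "best hiking trails near Denver": [0.1, 0.8, 0.3],
    "intro to deep learning models": [0.8, 0.2, 0.3],
}
query_embedding = [0.85, 0.15, 0.25]  # pretend embedding of a query about ML

# Rank documents by similarity to the query, most similar first.
ranked = sorted(doc_embeddings.items(),
                key=lambda kv: cosine_similarity(query_embedding, kv[1]),
                reverse=True)
for title, _ in ranked:
    print(title)
```

In a real system the toy vectors would come from an actual encoder, but the ranking logic stays the same: embed once, then compare with cosine similarity.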

------

The Illustrated BERT

BERT Paper:

Understanding searches better than ever before

Google: BERT now used on almost every English query

------

More videos by Jay:

Explainable AI Cheat Sheet - Five Key Categories

The Narrated Transformer Language Model

Jay's Visual Intro to AI

How GPT-3 Works - Easily Explained with Animations
Comments

Your videos are so enjoyable! Thanks Jay for this wonderful intro to BERT.

madhubagroy

Thank you Jay.
This is an excellent start for getting a brief understanding of BERT.

vinayakbaddi

Although we mainly see decoder-only models nowadays with LLMs, this is still super helpful, thanks Jay!

Fussfackel

Excellent video Jay! You are a natural educator.

womeninairobotics

It's a pleasure to watch your video. God bless you.

asheeshmathur

I admire you. You are the best; I always feel I learn something hard in an easy way.

anishjain

Awesome Video... I love how you explain things... Hats off 👍👍

owaisahmad

Great Explanation Jay!! Truly an awesome one!!

saibhargav

Great video, I hope you make videos on practical implementations of BERT soon.

alankarshukla

Great. You're so nice. Well explained. So easy to understand.

LearnAboutBlockchain

Great video, thanks! Do you have any idea which similarity score tends to perform best when computing similarities between sentence embeddings? I suppose cosine similarity, but are there some more exotic ones that you can recommend?

AliS-thck

Great! Are you going to explain coding for BERT?

sasna

Can I ask you which program you use to make your graphics? They look nice!

Edit: I found out it's keynote! (In case anyone was wondering).
In that case, would you be willing to make a video on how you make your blog posts? They are amazing and you have a real gift for explaining concepts! It's a bit meta, but I think it would be really interesting.

hdubbs

Great!

I have a doubt: why should you only look at the first column?

evertonlimaaleixo

I got lost when you said the article is represented as a vector or vectors at 4:34. How does it create only one (or three) vectors from an entire article?

CheeseCakes

How are the other embeddings created from the search at 5:30?

krishnachauhan

4:21 How did the [CLS] token become a representation of the entire sentence? And what about the other word embeddings?


Awesome! Please make more videos and blog posts on new algorithms. How does BERT create a vector of size 768? I have a doubt about that.

prakashkafle

I'm betting BERT even finds listening to your voice tedious with all of the high frequencies CUT OFF. I don't know who started this trend, and I don't care. It's moronic and it sounds AWFUL. Human speech has most of its differentiation around 4kHz. When you cut most of that band out, it's harder to understand and tiring to listen to. STOP already. Between that and half the video being taken up by 'Who has used BERT'... Sorry... downvoted.

chrisw