A Complete Overview of Word Embeddings

NLP has seen some big leaps over the last couple of years thanks to word embeddings. But what are they? How are they made, and how can you use them too?

Let's answer those questions in this video!
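The core idea behind word embeddings can be shown with a minimal sketch. Everything below (the toy corpus, the 2-word context window, and the count-based vectors) is an illustrative assumption, not the method from the video; real embeddings like word2vec, GloVe, and ELMo are learned, but the intuition is the same: words that appear in similar contexts get similar vectors.

```python
import math
from collections import Counter

# Toy corpus (illustrative assumption, not from the video).
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "stocks fell on the market today",
]

def context_vector(target, corpus, window=2):
    """Count the words appearing within `window` positions of `target`."""
    counts = Counter()
    for sentence in corpus:
        words = sentence.split()
        for i, w in enumerate(words):
            if w != target:
                continue
            for j in range(max(0, i - window), min(len(words), i + window + 1)):
                if j != i:
                    counts[words[j]] += 1
    return counts

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = lambda v: math.sqrt(sum(x * x for x in v.values()))
    return dot / (norm(a) * norm(b))

cat, dog, market = (context_vector(w, corpus) for w in ("cat", "dog", "market"))

# "cat" and "dog" share contexts ("the _ sat on"), so their vectors are
# closer to each other than either is to "market".
print(cosine(cat, dog))
print(cosine(cat, market))
```

Learned embeddings replace these raw counts with dense vectors optimized by a neural network, but similarity is still measured the same way.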

#MachineLearning #DeepLearning
Comments

Thank you. It is very clear and informative, though I really think you (AssemblyAI) should lose the background music; it is distracting and gives the whole thing an infomercial feel.

ozgurak

Very annoying to have background music in such videos…

amefill

Would love a further video on ELMo. Thanks for all this!

impracticaldev

Great video, but the background music is driving me crazy. Very distracting.

BrunoGarofalo-sc

Great explanation! Thank you! Please drop the music in future videos.

Arriyad

Excellent explanation. I did some study on this topic before coming here because so many terms and concepts were quite overwhelming. I generally understood them but still missed the fine-tuned clarity. After watching this video, most of what I read before started making a lot of sense. I highly recommend this video. Thank you so much.

manojjoshi

Great explanation. Please explain ELMo and other approaches. Also, please make a video about efficient ways of clustering the embeddings 👍

sajjaddehghani

Amazing video. Perfectly clear speech, good explanations, logical visualisations, and the background music makes it a lot easier to focus. Thank you!!

marten

Great explanation. Please explain ELMo and GloVe. It was really great.

hadiloghman

Awesome overview. Loved it. Waiting for videos explaining GloVe and ELMo.

TuhinBhattacharya

It is incorrect to say that German and Turkish are distinct because they are "morphologically rich languages". Roughly speaking, they are no more morphologically rich than any other language. I think what you meant is that German and Turkish are much more agglutinative than many (most?) other modern languages: they have roughly the same number of morphemes, but those morphemes are bound together into longer sequences, so their words consist of a higher number of morphemes, on average.

andrewstrebkov

Awesome content, but the background music is slightly distracting, especially when you play the video at 1.5x speed.

moeal

Would have been great without the background music. Nice video anyway. Thanks.

miriamramstudio

Great presentation. Why always the damn loud techno background music? I can hardly follow what she says...

nooorm

I would recommend making your videos without the background music; it is quite distracting. Your video helped me understand this concept, but I am trying to be patient with the music!!

zilaleizaldin

Great content, thanks. Due to a hearing problem, I would appreciate it if you could remove the background music. OK? Thanks.

joergbieri

Very interested in an in-depth explanation of ELMo.

whifflingtove

Do transformers from scratch. I heard they can be written in 50 lines. I would like to understand how BERT encodes words.

lexflow

Thank you. I want to ask: are there any techniques that use Hidden Markov Models to represent embeddings?

leo-phiponacci

Can you please cut the background music? It's not really nice when I play the video at 1.5x speed.

haralc