Lecture 5.5: ELMo, Word2Vec


Comments

Great video, this explains it perfectly

itznukeey

Thank you. Very good explanation of ELMo

abhishektyagi

I have a question: I want to do something like this:

"Fire" + "Mountain" --> "Volcano"
"Volcano" --> "Fire", "Mountain", "Lava"
"Fire" + "Steel" + "Building" --> "Forge"

Is this possible with the techniques discussed in your video? Does it come down to the corpus I train on? Can I extend these models with semantic lexica like WordNet to get relations between words other than similarity? <3

seventfour
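
Regarding the question above: this kind of composition is usually attempted as word-vector arithmetic, summing the vectors and searching for nearest neighbors, and the results do depend heavily on the training corpus. A minimal sketch using gensim's pretrained vectors (the model name here is an assumption; any KeyedVectors model containing these words would do):

    import gensim.downloader as api

    # Pretrained embeddings; the specific model is an assumption,
    # and the neighbors you get back depend on its training corpus.
    vectors = api.load("glove-wiki-gigaword-100")

    # "Fire" + "Mountain": nearest neighbors of the summed vector.
    print(vectors.most_similar(positive=["fire", "mountain"], topn=5))

    # "Volcano" -> related words: nearest neighbors of a single vector.
    print(vectors.most_similar(positive=["volcano"], topn=5))

Note that embeddings capture distributional similarity rather than typed relations, so WordNet-style relations are usually brought in as a separate resource combined with the vectors, not recovered from them.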

word "compare" is one-hot (shape 1000*1), suppose [0, 0, 1, ...0, 0] (the third element is 1), so, the embedding matrix is 300*1000 to do matrix dot product (300*1000 * 1000*1 = 300*1). Since most of the elements are 0, how to get a meaningful embedding matrix? (0 product everything is 0)

junderfitting
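
Regarding the one-hot question above: multiplying by a one-hot vector does not zero anything out; it simply selects one column of the embedding matrix, which acts as a lookup table whose entries are learned during training. A minimal NumPy sketch with the dimensions from the comment:

    import numpy as np

    vocab_size, embed_dim = 1000, 300

    # Embedding matrix E (300 x 1000); in practice its entries are
    # learned by gradient descent, random here only for illustration.
    E = np.random.randn(embed_dim, vocab_size)

    # One-hot vector for "compare": third element set to 1.
    x = np.zeros(vocab_size)
    x[2] = 1.0

    # E @ x, i.e. (300, 1000) @ (1000,), just picks out column 2 of E:
    # it is a table lookup, not a destructive multiplication.
    v = E @ x
    assert np.allclose(v, E[:, 2])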

Models are based on historical data. There's no bias.

deestort