Transformers with Knowledge Graph | K-BERT: Enabling Language Representation with Knowledge Graph

In this video, we review the paper K-BERT: Enabling Language Representation with Knowledge Graph.
This paper proposes an approach for connecting BERT models to external knowledge graphs, such as a Wikipedia-based knowledge graph, a medical knowledge graph, or any other domain knowledge graph.
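To make the idea concrete, here is a minimal sketch of the kind of knowledge injection K-BERT describes: KG triples are attached to matching tokens as branches of a sentence tree, with soft positions and a visibility matrix so the injected knowledge does not disturb the original sentence. The tiny knowledge graph and function names below are illustrative assumptions, not code from the paper.

```python
def inject_knowledge(tokens, kg):
    """Sketch of a K-BERT-style knowledge layer.

    For each token that is a subject in the KG, its (relation, object)
    tokens are appended as a branch. Branch tokens keep "soft positions"
    that continue from the subject, and the visibility matrix lets a
    branch token attend only to its subject and its own branch.
    """
    tree, soft_pos, branch_of = [], [], []
    for i, tok in enumerate(tokens):
        subj_idx = len(tree)
        tree.append(tok)
        soft_pos.append(i)
        branch_of.append(None)              # trunk (original sentence) token
        for rel, obj in kg.get(tok, []):
            for k, btok in enumerate((rel, obj), start=1):
                tree.append(btok)
                soft_pos.append(i + k)      # soft position continues from subject
                branch_of.append(subj_idx)  # branch attached to its subject
    n = len(tree)
    visible = [[False] * n for _ in range(n)]
    for a in range(n):
        for b in range(n):
            if branch_of[a] is None and branch_of[b] is None:
                visible[a][b] = True        # trunk tokens see each other
            elif (branch_of[a] == branch_of[b]
                  or branch_of[a] == b or branch_of[b] == a):
                visible[a][b] = True        # branch sees itself and its subject
    return tree, soft_pos, visible


# Illustrative KG: "Cook" is the subject of the triple (Cook, CEO, Apple).
kg = {"Cook": [("CEO", "Apple")]}
tree, pos, vis = inject_knowledge(["Tim", "Cook", "visits", "Beijing"], kg)
```

In this example the tree becomes `["Tim", "Cook", "CEO", "Apple", "visits", "Beijing"]`, the branch tokens "CEO"/"Apple" share soft positions with "visits"/"Beijing", and the visibility matrix hides them from every trunk token except "Cook".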

If you want to refresh your understanding of transformers and self-attention, check out these videos:

Some of my other popular videos on deep learning:

Music from #Uppbeat (free for Creators!):
License code: HMQG1YGQY83EWIPU
Comments

Thanks for your explanation. Good job!

ApusApus-

I want to share my experience with this approach. I am currently working with this model for my thesis project. The authors used their own dataset, which, as far as I can tell, is in Chinese, and the paper shows promising results. However, when I run the model on my own data with my own custom knowledge graph, it fails: training consistently stops with a "division by zero" error, indicating that the model fails to classify one or more labels. Because of this, I am currently exploring alternative ways of integrating a knowledge graph into a transformer-based architecture, in this case BERT.
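The comment does not show where the division by zero happens, but a common source in classification pipelines is a per-label precision or recall computed over a label the model never predicts (zero denominator). The sketch below is a hypothetical illustration of that failure mode and a guard against it, not the actual K-BERT training code.

```python
def per_label_precision(y_true, y_pred, labels):
    """Precision per label, returning 0.0 instead of raising
    ZeroDivisionError for labels the model never predicted."""
    scores = {}
    for lab in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if p == lab and t == lab)
        predicted = sum(1 for p in y_pred if p == lab)
        # Guard: with zero predictions for this label, tp / predicted
        # would divide by zero; report 0.0 precision instead.
        scores[lab] = tp / predicted if predicted else 0.0
    return scores


# The model below never predicts label 1, which is exactly the case
# that would crash an unguarded precision computation.
scores = per_label_precision([0, 1, 1], [0, 0, 0], labels=[0, 1])
```

If the metric library in use supports it, an equivalent fix is an explicit zero-division option (for example, scikit-learn's `precision_score(..., zero_division=0)`), which avoids the crash while still flagging labels the model cannot classify.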

ariefpurnamamuharram