BERT Pre-training of Deep Bidirectional Transformers @ TWiML Online Meetup Americas


This video is a recap of our May 2019 Americas TWiML Online Meetup: BERT Pre-training of Deep Bidirectional Transformers for Language Understanding.

In this month's community segment, we discuss our thoughts on the meetups, GPT-2, the availability of resources, and our TWiML Talk with Delip Rao on fake news.

In our presentation segment, Jidin Dinesh presents the paper "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding" by Jacob Devlin et al., covering deep bidirectional representations and language representation models.

Subscribe!

Let's Connect!

**SUBSCRIBE AND TURN ON NOTIFICATIONS**
Comments

Hello, can you send me the link to the paper you explain at 30:40? I have the BERT paper, but it doesn't include that section.
Thanks in advance.

HariPrabodham_Spirit