Are Transformers Good Learners? Exploring the Limits of Transformer Training [ LingMon #175 ]

📅 May 17, 2021 | 🗣 Dušan Variš (ÚFAL MFF UK) | 📝 Are Transformers Good Learners? Exploring the Limits of Transformer Training

🔖 ABOUT THE LECTURE
▔▔▔▔▔▔▔▔▔▔▔▔▔
▸In recent years, research into deep neural networks has led to significant advances in many fields, ranging from NLP and computer vision to playing games like chess and Go. Even though deep neural networks were originally inspired by biological neurons, there are still many differences between deep nets and their biological counterparts. In this talk, we focus on three potential weaknesses of neural network training: generalization, catastrophic forgetting, and knowledge composition. We demonstrate how deep neural networks struggle with these phenomena, even though they are crucial to learning in their biological counterparts. We also discuss current approaches aimed at addressing these issues.
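
▸As a rough illustration of one of these phenomena, catastrophic forgetting, consider the following minimal sketch (not taken from the talk; the toy tasks and all names are illustrative assumptions). A small network is trained on task A, then on task B with no rehearsal of A; accuracy on task A typically collapses after the second phase.

# Minimal sketch of catastrophic forgetting on two toy tasks.
# Not from the talk; the task setup is an illustrative assumption.
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(direction):
    # Linearly separable 2-D binary task defined by a fixed direction.
    x = torch.randn(512, 2)
    y = (x @ direction > 0).float().unsqueeze(1)
    return x, y

def accuracy(model, x, y):
    with torch.no_grad():
        return ((model(x) > 0).float() == y).float().mean().item()

def train(model, x, y, steps=200):
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.BCEWithLogitsLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

model = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))

# Tasks A and B use orthogonal decision boundaries, so fitting B
# overwrites the features that solved A.
xa, ya = make_task(torch.tensor([1.0, 0.0]))
xb, yb = make_task(torch.tensor([0.0, 1.0]))

train(model, xa, ya)
print(f"task A after training on A: {accuracy(model, xa, ya):.2f}")

train(model, xb, yb)  # sequential training, no rehearsal of task A
print(f"task A after training on B: {accuracy(model, xa, ya):.2f}")
print(f"task B after training on B: {accuracy(model, xb, yb):.2f}")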

📽️ ABOUT THE SERIES
▔▔▔▔▔▔▔▔▔▔▔▔
▸The history of Linguistic Mondays dates back to the 1980s. Their original aim was to make students, faculty members, and the wider research community aware of the field of computational linguistics in general, and of the results achieved by members of our team in particular. Over the years, with growing awareness of the domain, new trends appearing on the scene, and more master's and doctoral students coming in, the scope of topics has broadened correspondingly, covering all aspects of the field: from the basics of computational linguistics and its linguistic and formal background, through corpus case studies and natural language processing applications such as machine translation and information retrieval, up to the most recent trends, including machine learning. The series also offers an excellent opportunity for PhD students to present their results and receive relevant feedback from leading experts in the field.
