Arithmetic Transformers with Abacus Positional Embeddings | AI Paper Explained

In this video we dive into a recent research paper titled "Transformers Can Do Arithmetic with the Right Embeddings". The paper introduces Abacus Embeddings, a new type of positional embedding. Using Abacus Embeddings, the researchers trained state-of-the-art transformers for number addition with impressive logical extrapolation capabilities: a model trained on 20-digit numbers achieved 99.1% accuracy on 100-digit numbers!
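The core idea behind Abacus Embeddings is to give each digit a positional index based on its place within its own number, rather than its absolute position in the sequence, so that aligned digits of the two operands share the same embedding. The sketch below is my own minimal illustration of that indexing scheme (not the authors' code); the function name `abacus_positions` and the random-offset parameter are assumptions for this example.

```python
import random

def abacus_positions(tokens, max_offset=0):
    # Each digit token gets an index equal to its position within its
    # number; non-digit tokens (operators, "=") reset the counter and
    # get index 0. A random offset can be added during training so the
    # model also sees large indices, which helps it extrapolate to
    # numbers longer than those seen in training.
    offset = random.randint(0, max_offset)
    positions = []
    k = 0  # running position inside the current number
    for tok in tokens:
        if tok.isdigit():
            k += 1
            positions.append(k + offset)
        else:
            k = 0
            positions.append(0)
    return positions

# Digits written least-significant-first (as in the paper), so digits of
# the same significance in both operands receive the same index:
tokens = list("321+654=")
print(abacus_positions(tokens))  # [1, 2, 3, 0, 1, 2, 3, 0]
```

These indices would then be looked up in a learned embedding table and added to the token embeddings, in place of standard absolute positional embeddings.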

Watch the video to learn more.

-----------------------------------------------------------------------------------------------

👍 Please like & subscribe if you enjoy this content
-----------------------------------------------------------------------------------------------
Chapters:
0:00 Introduction
0:49 Abacus Embeddings
2:39 Logical Extrapolation
3:48 Input Injection
4:25 Recurrency
Comments:

vladyslavkorenyak:
This channel is amazing! I feel inspired by the simplicity of the ideas and their results. So many low-hanging fruits!