Transformers and LLM: History, Theory, Intuition and Implementation | By MIT, Purdue PhDs

"Transformers: The heart of LLMs"

The heart of LLMs like the GPT models is the transformer. If you have ever tried to learn transformers, you know how hard they can be. Each time I revisit them, I learn something new.

At Vizuara, we wanted to create a comprehensive webinar series on transformers that will serve as the go-to resource for anyone who wishes to learn transformers, including its implementation.

Sahil Pocker, Rajat Dandekar, Raj Abhijit Dandekar and I have organized this lecture, where you will learn to train LLMs from scratch.

You will also build awesome hands-on projects in this lecture.

Transformers were introduced in the 2017 paper "Attention Is All You Need". This beautiful 15-page paper is, however, very difficult for most people to understand.

The singular goal of our lecture is to give you a clear idea of what exactly transformers are, how they work as a neural network, and how they are used in LLMs like ChatGPT.

We cover the following points:

1️⃣ History of Large Language Models (LLMs)
2️⃣ What are transformers?
3️⃣ Intuition behind transformers
4️⃣ Transformer architecture
5️⃣ Applications of transformers for LLMs
6️⃣ Building awesome hands-on projects!

The amount of blood, sweat, and research that went into making an almost beginner-friendly workshop was enormous.

We strongly believe that this lecture will tremendously benefit you if you wish to learn about LLMs, transformers, attention mechanisms, the intuition behind positional encoding, etc. in detail.
Comments

Large Language Models and transformers. A good introduction.

padmanabhannamboothiri

Can you please make a video on TabTransformer?

vishalakshi

Just a friendly suggestion: please change the thumbnail to include better photos.

vikramganesh