MAMBA from Scratch: Neural Nets Better and Faster than Transformers

Mamba is a new neural network architecture that came out this year, and it performs better than transformers at language modelling! This is probably the most exciting development in AI since the transformer itself in 2017. In this video I explain how to derive Mamba from the perspective of linear RNNs. And don't worry, there's no state space model theory needed!
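For a taste of the starting point, here is a minimal sketch of the linear RNN recurrence the video builds on (variable names and sizes are illustrative, not the video's code):

```python
import numpy as np

def linear_rnn(A, B, C, xs):
    """Sequentially evaluate h_t = A @ h_{t-1} + B @ x_t, y_t = C @ h_t."""
    h = np.zeros(A.shape[0])
    ys = []
    for x in xs:
        h = A @ h + B @ x   # no nonlinearity on the recurrence: this is
        ys.append(C @ h)    # what makes the whole thing parallelizable
    return np.stack(ys)

rng = np.random.default_rng(0)
state_dim, in_dim, out_dim, T = 4, 2, 3, 8
A = 0.3 * rng.normal(size=(state_dim, state_dim))  # small scale keeps the recurrence stable
B = rng.normal(size=(state_dim, in_dim))
C = rng.normal(size=(out_dim, state_dim))
xs = rng.normal(size=(T, in_dim))
print(linear_rnn(A, B, C, xs).shape)  # (8, 3)
```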

#mamba
#deeplearning
#largelanguagemodels

00:00 Intro
01:33 Recurrent Neural Networks
05:24 Linear Recurrent Neural Networks
06:57 Parallelizing Linear RNNs
15:33 Vanishing and Exploding Gradients
19:08 Stable initialization
21:53 State Space Models
24:33 Mamba
25:26 The High Performance Memory Trick
27:35 The Mamba Drama
Comments

As someone actively working on this stuff, this channel has the best explanations on the internet, and the 'tuber actually understands what is going on.

jamescamacho

About peer review: as one comment noted, there may be many more candidate papers than a venue can accommodate. However, as this video argues, the justification given for rejecting this paper is inadequate at best. Some comments ask whether the rejection matters; for academics, the answer is yes, because presentations and publications count toward tenure, promotions, and raises, as well as continued funding of the research. Since several comments (and the video) indicate that the algorithm has already received a lot of publicity, the rejection may not matter for the project itself, provided it can continue to be funded, especially if commercial implementations are successful.

What is interesting in any case is that the paper exists; in effect, it has been published. The authors may not get the desired credit for formal publication, but their work and the reviewer comments are out there now. A couple of decades ago that would not have been the case; most people in the field would have been unaware of the algorithm.

On peer review more generally, outside of AI: in my field, one of the natural sciences, a paper I submitted encountered an editor and two reviewers who were well qualified in the field; after they asked for two revisions of the manuscript, the third version was rejected. Interestingly, all three scientists had published research that my paper undermined; had my manuscript been published, they might well have lost funding for their research or even their positions (I speculate here). Peer review cuts both ways. While iterating with the editor and reviewers, I continued to expand my research project and made some additional discoveries. After the rejection, I wrote a completely different paper that combined my initial work with the new discoveries; happily, it was published a few months ago (in a different journal). I'm formally retired now, but I continue to do research.

To young researchers: never give up. Learn from rejection, refine your work, be humble, exercise integrity and honesty, and take pride in your accomplishments, even if only a few know about them. Peer review (by humans) is a necessity and will continue to be one. There is no such thing as a perfect filter, but without it, science and technology would be overwhelmed by irrelevancy, dishonesty, and duplication of effort. AI may become a useful filtering tool, but science is a human endeavor.

RexPilger

Wow, you've made some difficult, I mean extremely difficult, algorithms look easy. Thank you.

jawadmansoor

Brutal. I'm going to have to watch this about 30 times. Love it.

peterdemore

I do hope you'll soon reach a six-figure subscriber count. The quality of your videos (both in education and presentation) is top notch; people need you to become popular (at least within our small tech bubble).

Levy

Please open your community tab.
Your content is incredible.

EkShunya

During my Ph.D. days, a paper of mine was rejected at ICASSP for not citing a certain paper (I suspect the reviewer was one of its authors) which had absolutely NOTHING to do with what my paper was about... So yes, a lot in the reviewing process seems to be (a) personal and (b) "you must cite this and that" even if it is not related to your paper at all. And it has been like this for years...

andreasbeschorner

One small note on RNNs: reservoir computing is a very high-dimensional random RNN with a linear regression readout. Since the recurrent weights are fixed and only the readout is trained, there are no exploding or vanishing gradients. Reservoir computing is currently the standard for nonlinear dynamical time series prediction.
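A minimal echo-state-network sketch of the idea (the hyperparameters and the toy task here are illustrative, not a tuned setup):

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, T = 200, 500

# Fixed random reservoir, rescaled so its spectral radius is < 1 (echo state property).
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
w_in = rng.normal(size=n_res)

# Toy task: one-step-ahead prediction of a sine wave.
u = np.sin(0.1 * np.arange(T + 1))
states = np.zeros((T, n_res))
h = np.zeros(n_res)
for t in range(T):
    h = np.tanh(W @ h + w_in * u[t])   # reservoir update; W and w_in are never trained
    states[t] = h

# Train only the linear readout by least squares (ridge regression would be typical).
w_out, *_ = np.linalg.lstsq(states, u[1:T + 1], rcond=None)
print("train MSE:", np.mean((states @ w_out - u[1:T + 1]) ** 2))
```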

jarib

Wow this is a great video. I've been having a lot of trouble understanding and getting an intuition of how Mamba works, and this video just made it make sense. The visuals were a massive help and the explanations are super simple and easy to understand.

shirenlu

Wow, excellent explanation. It covers all the essence of the paper with just enough math and algorithms. Thank you so much! If you don't mind, please make a video on RWKV (v6 has some new modifications), which is another strong linear RNN model. I am curious how it compares to Mamba.

honglu

Nice video! I just wanted to point out that the parallel scan can also be implemented with O(n) work (instead of the O(n log n) version presented in the video), and this is the version that Mamba uses.
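Concretely (toy numbers and my own variable names, not the paper's code), what every scan variant exploits, whether the O(n log n) one in the video or the O(n)-work Blelloch-style one, is that composing recurrence steps is associative. A sketch with a scalar recurrence h_t = a_t * h_{t-1} + b_t:

```python
import numpy as np

def combine(e1, e2):
    """Compose step e1 followed by e2: h -> a2*(a1*h + b1) + b2."""
    a1, b1 = e1
    a2, b2 = e2
    return (a2 * a1, a2 * b1 + b2)

rng = np.random.default_rng(0)
T = 8
a = rng.uniform(0.5, 1.0, size=T)
b = rng.normal(size=T)

# Sequential reference: h_t = a_t * h_{t-1} + b_t with h_{-1} = 0.
h, hs = 0.0, []
for t in range(T):
    h = a[t] * h + b[t]
    hs.append(h)

# An inclusive scan with the associative operator yields the same prefixes;
# because combine() is associative, it can be evaluated in parallel.
elems = list(zip(a, b))
acc, scanned = elems[0], [elems[0]]
for e in elems[1:]:
    acc = combine(acc, e)
    scanned.append(acc)

print(np.allclose(hs, [bb for _, bb in scanned]))  # True
```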

rikkathemejo

Absolutely amazing vid. Just subbed after this channel got recommended to me. Never stop making videos dude <3

kamdynshaeffer

I like how we now call 1 billion parameters small.

InfiniteQuest

Currently testing it on molecular generation; excited to see where its strengths hold and where they falter :)

anrilombard

A+++ for OpenReview. Transparency is so valuable!
Also, many thanks for the excellent video!

davidespinosa

I love how you nail the level of detail in the explanations. Perfect for me at least.

kalkhasse

This was really concise and easy to understand.

AndrewAnderson-hd

Just wondering if you could make a video on how GNNs work? There aren't really many videos about GNNs on YouTube.

danverzhao

Absolutely love the quality and information in this video!!! Please keep up the good work, this is amazing.

ithaca

Very good explanation, and kudos for exposing the broken peer review system. Subscribed

TheParkitny