Is the Future of Linear Algebra... Random?

"Randomization is arguably the most exciting and innovative idea to have hit linear algebra in a long time." - First line of the Blendenpik paper, H. Avron et al.

SOURCES

Source [1] is the paper that prompted me to create this video. [3], [7], and [8] provide a broad and technical view of randomization as a strategy for NLA. [9] and [12] informed me about the history of NLA. [2], [4], [5], [6], [10], [11], [13], and [14] provide concrete algorithms demonstrating the utility of randomization.

[1] Murray et al. Randomized Numerical Linear Algebra. arXiv:2302.11474v2, 2023.

[2] Melnichenko et al. CholeskyQR with Randomization and Pivoting for Tall Matrices (CQRRPT). arXiv:2311.08316v1, 2023.

[3] P. Drineas and M. Mahoney. RandNLA: Randomized Numerical Linear Algebra. Communications of the ACM, 2016.

[4] N. Halko, P. Martinsson, and J. Tropp. Finding Structure with Randomness: Probabilistic Algorithms for Constructing Approximate Matrix Decompositions. arXiv:0909.4061v2, 2010.

[5] Tropp et al. Fixed-Rank Approximation of a Positive-Semidefinite Matrix from Streaming Data. NeurIPS Proceedings, 2017.

[6] X. Meng, M. Saunders, and M. Mahoney. LSRN: A Parallel Iterative Solver for Strongly Over- or Underdetermined Systems. SIAM, 2014.

[7] D. Woodruff. Sketching as a Tool for Numerical Linear Algebra. IBM Research Almaden, 2015.

[8] M. Mahoney. Randomized Algorithms for Matrices and Data. arXiv:1104.5557v3, 2011.

[9] G. Golub and H. van der Vorst. Eigenvalue Computation in the 20th Century. Journal of Computational and Applied Mathematics, 2000.

[10] J. Duersch and M. Gu. Randomized QR with Column Pivoting. arXiv:1509.06820v2, 2017.

[11] Erichson et al. Randomized Matrix Decompositions Using R. Journal of Statistical Software, 2019.

[12] J. Gentle et al. Software for Numerical Linear Algebra. Springer, 2017.

[13] H. Avron, P. Maymounkov, and S. Toledo. Blendenpik: Supercharging LAPACK's Least-Squares Solver. SIAM, 2010.

[14] M. Mahoney and P. Drineas. CUR Matrix Decompositions for Improved Data Analysis. Proceedings of the National Academy of Sciences, 2009.

TIMESTAMPS
0:00 Significance of Numerical Linear Algebra (NLA)
1:35 The Paper
2:20 What is Linear Algebra?
5:57 What is Numerical Linear Algebra?
8:53 Some History
12:22 A Quick Tour of the Current Software Landscape
13:42 NLA Efficiency
16:06 RandNLA's Efficiency
18:38 What is NLA doing (generally)?
20:11 RandNLA Performance
26:24 What is NLA doing (a little less generally)?
31:30 A New Software Pillar
32:43 Why is RandNLA Exceptional?
34:01 Follow-Up Post and Thank-Yous
COMMENTS

Great video! I want to add a couple of references related to the neural-network ideas you mentioned:
1. Ali Rahimi received the NeurIPS 2017 Test of Time Award for a method called random kitchen sinks (a kernel method with random features); see the sketch below.
2. Choromanski (from Google) made a variation of this idea to alleviate the quadratic memory cost of self-attention in transformers (it also works like a charm; I tried it myself, and I'm still perplexed that it didn't become one of the main efficiency improvements for transformers). Check "Rethinking Attention with Performers".

Thank you for the great work on the video - keep them coming please! :)

charilaosmylonas
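
A minimal numpy sketch of the random kitchen sinks idea from point 1 above, assuming the standard RBF-kernel setup; the feature count D and kernel width gamma are arbitrary illustrative values, not numbers from either paper:

    import numpy as np

    def random_fourier_features(X, D, gamma, rng):
        # Approximate the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)
        # with D random features (Rahimi & Recht's "random kitchen sinks").
        d = X.shape[1]
        W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, D))  # kernel's Fourier spectrum
        b = rng.uniform(0, 2 * np.pi, size=D)                  # random phases
        return np.sqrt(2.0 / D) * np.cos(X @ W + b)

    rng = np.random.default_rng(0)
    gamma = 0.5
    X = rng.normal(size=(5, 3))
    Z = random_fourier_features(X, D=5000, gamma=gamma, rng=rng)
    K_approx = Z @ Z.T  # inner products of random features approximate the kernel
    K_exact = np.exp(-gamma * ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1))
    print(np.abs(K_approx - K_exact).max())  # shrinks as D grows

The point of the trick is that downstream models only ever touch Z, so an n x n kernel matrix never needs to be formed.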

Reminds me of that episode of VeggieTales where Larry was like, "In the future, linear algebra will be randomly generated!"

octavianova

As a mathematician specializing in probability and random processes, I approve this message. N thumbs up, where N ranges between 1.99 and 2.01 with 99% confidence!

BJ

The part about matrix multiplication reminded me of studying cache hit and miss patterns in university. Interesting video.

TimL_
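
That cache effect is easy to reproduce in a few lines of numpy (the array size below is an arbitrary demo choice): with a C-ordered (row-major) array, row-wise traversal reads memory contiguously, while column-wise traversal strides across rows and misses cache far more often.

    import time
    import numpy as np

    n = 4000
    A = np.random.default_rng(0).random((n, n))  # C order: rows are contiguous in memory

    t0 = time.perf_counter()
    row_total = sum(A[i, :].sum() for i in range(n))  # contiguous reads, cache friendly
    t_rows = time.perf_counter() - t0

    t0 = time.perf_counter()
    col_total = sum(A[:, j].sum() for j in range(n))  # strided reads, many cache misses
    t_cols = time.perf_counter() - t0

    print(f"row-wise {t_rows:.3f}s, column-wise {t_cols:.3f}s")  # column-wise is noticeably slower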

As a developer at AMD, I feel somewhat obligated to note that we have an equivalent to cuBLAS called rocBLAS, as well as an interface layer, hipBLAS, designed to compile code to make use of either AMD or NVIDIA GPUs.

laurenwrubleski

Another tidbit about LINPACK: one of its major strengths at the time it was written was that all of its double-precision algorithms were truly double precision. Other packages of that era often had double-precision calculations hidden inside their single-precision routines, whereas their double-precision counterparts had no quad-precision parts anywhere inside. The LINPACK folks were extraordinarily concerned about numerical precision in all routines. It was a great package.

It also provided the basis for MATLAB.

charlesloeffler

Brunton, Kutz et al., in the paper you mentioned here, "Randomized Matrix Decompositions Using R," recommended using Nathan Halko's algorithm, developed at the CU Math department. B&K give some timing data, but the time and memory complexity were already computed by Halko, who had implemented it in MATLAB for his paper; B&K ported it to R. Halko's paper from 2009, "Finding Structure with Randomness: Probabilistic Algorithms for Constructing Approximate Matrix Decompositions," laid this all out 7 years before the first draft of the B&K paper you referenced. Halko's office was a mile down the road from me at that time, and I implemented Python and R code based on his work (it was used in medical products, and my employer didn't let us publish). It does work quite well.

scottmiller
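
For reference, the core of the Halko et al. range-finder described above fits in a few lines of numpy. This is a sketch, not the paper's reference implementation; the oversampling p and power-iteration count q are common defaults, and the test matrix is an arbitrary low-rank example.

    import numpy as np

    def randomized_svd(A, k, p=10, q=2, rng=None):
        # Rank-k truncated SVD via a randomized range finder (Halko et al. [4]).
        # p: oversampling; q: power iterations for slowly decaying spectra.
        if rng is None:
            rng = np.random.default_rng()
        m, n = A.shape
        # Sketch the column space of A with a Gaussian test matrix.
        Q, _ = np.linalg.qr(A @ rng.normal(size=(n, k + p)))
        for _ in range(q):  # power iterations sharpen the basis
            Q, _ = np.linalg.qr(A.T @ Q)
            Q, _ = np.linalg.qr(A @ Q)
        # Solve the small (k+p) x n problem, then lift back to m dimensions.
        U_small, s, Vt = np.linalg.svd(Q.T @ A, full_matrices=False)
        return (Q @ U_small)[:, :k], s[:k], Vt[:k]

    rng = np.random.default_rng(0)
    A = rng.normal(size=(2000, 300)) @ rng.normal(size=(300, 500))  # rank <= 300
    U, s, Vt = randomized_svd(A, k=50, rng=rng)
    print(np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A))

The expensive full SVD happens only on the small sketched matrix, which is where the speedup over a dense SVD of A comes from.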

Golub and Van Loan’s textbook is goated. I loved studying and learning numerical linear algebra for the first time in undergrad.

richardyim

I have seen a very similar idea in compressed sensing. There we also use a randomized sampling matrix, because the resulting errors can be treated as white noise; a denoising algorithm can then recover the original data. In fact, I know Philips MRI machines use this technique to speed up scans, because fewer measurements have to be taken. Fascinating.

pietheijn-vogt
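
A toy numpy version of that pipeline: random Gaussian measurements of a sparse signal, recovered with iterative soft thresholding (ISTA) for the LASSO. The problem sizes and regularization weight are arbitrary demo choices, and real MRI reconstruction is considerably more involved.

    import numpy as np

    rng = np.random.default_rng(0)
    n, m, k = 400, 120, 10  # signal length, measurements, nonzeros

    # A sparse signal and a random Gaussian sampling matrix: the
    # undersampling artifacts behave like white noise, which is what
    # lets a denoising-style solver recover the signal.
    x = np.zeros(n)
    x[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)
    A = rng.normal(size=(m, n)) / np.sqrt(m)
    y = A @ x  # m << n measurements

    # ISTA: gradient step on ||A xh - y||^2, then soft threshold.
    lam = 0.01
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    xh = np.zeros(n)
    for _ in range(2000):
        z = xh - step * (A.T @ (A @ xh - y))
        xh = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)

    print(np.linalg.norm(xh - x) / np.linalg.norm(x))  # small relative error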

I'm writing a paper on a related topic. Didn't know about many of these papers, thanks for sharing! I really enjoyed your video

danielsantiagoaguilatorres

Dang, I absolutely love videos and articles that summarize the latest in a field of research and explain the concepts well!

zyansheep

As always, this is BRILLIANT. I've been following your videos since I saw the GP regression video. Great content! Thank you very much.

charlesity

You discussed all the priors incredibly well. I didn't even understand the premise of randomness in this context, and now I leave with a lot more.

Keep it up man, your videos are the bomb.

noahgsolomon

Damn... your videos are getting beyond excellent!

marcegger

My first thought was "this is like journal club with DJ"! Great stuff - well researched and crisply delivered. More of this, if you please.

bluearctik

I'm finally far enough in education to see how well made your stuff is. Super excited to see a new one from you. Thanks for expanding people's horizons!

makapaka

It's always a pleasure to watch this channel.

aleksszukovskis

I started reading this paper when you mentioned it on Twitter, forgot it was you I got it from, and was so happy to see a video about it!

mgostIH

It feels like this video was made to match my exact interests LOL

I've been interested in NLA for a while now, and I recently studied more "traditional" randomized algorithms at uni for combinatorial tasks (e.g., Karger's min-cut). It's interesting to see how people have recently found ways to combine the two paradigms. I'm excited to see where this field goes. Thanks for the video and for introducing me to the topic!

deltaranged

Outstanding content, instant sub. Keep up the good work!

bnws