Lecture 36: Alan Edelman and Julia Language

MIT 18.065 Matrix Methods in Data Analysis, Signal Processing, and Machine Learning, Spring 2018
Instructors: Alan Edelman, Gilbert Strang

Professor Alan Edelman gives this guest lecture on the Julia Language, which was designed for high-performance computing. He provides an overview of how Julia can be used in machine learning and deep learning applications.

License: Creative Commons BY-NC-SA
Comments

Prof Gilbert Strang, I was supposed to attend your courses - Linear Algebra and Matrix Methods for ML - as a prerequisite for my graduate program in Electrical Engineering at UBC, Canada. I must say, I lost track of time attending your lectures, and you changed my perspective on thinking about mathematics. Many thanks, and I owe you a lot for inspiring me. I wish to become a professor like you in the future. Wish me well if you happen to see this message! Lots of love, from Simran Suresh, India

simrans

Watched 18.065 right after 18.06; finally finished both playlists now. Thank you MIT and Prof. Strang for these two informative and engaging courses!

michaelmuller

“Everything is better when written in linear algebra”, so true

jjpp

I followed every lecture, thanks Prof Strang, the course changed my view of Linear Algebra

agustinlage


Thank you Prof. Strang, I wish you all the best

MIT

Professor Alan Edelman, thank you for a beautiful lecture and a powerful introduction to Julia. I watched and took notes on all 36 lectures, and they are awesome. I learned a tremendous amount of Linear Algebra from the Grandfather of all mathematics. Thanks to MIT and all the wonderful people behind the scenes for putting these lectures together.

georgesadler

finished!!! But I probably need to watch them all over again

terrylu

What an amazing Course! Thank you professor Strang for 18.065!

leandrolopes

Thank you Prof Strang. Wonderful course. Have learned to love Linear Algebra - something I never managed in the 60's. ;¬D All the very best

anthonyhopkin

Thank you Prof Strang! See you at lecture one again pretty soon xD

tian

This is the last one; I'm sad, because these lessons are great. Thank you! I proceed ...

archibaldgoldking

Thank you, Professor Strang! I've learned a lot from your lectures! Stay healthy!

minyuhan

I just realized that the dual number (x, 1) is x together with the derivative of x as a linear function Ax, where A = 1.

j
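
To spell out the observation above (a worked note; the generic operation g is my notation, not the lecture's): the seed pair (x, 1) is the identity function together with its derivative, and every elementary operation pushes such a pair forward by the chain rule,

\[
f(x) = x \implies f'(x) = 1, \qquad (u,\, u') \;\longmapsto\; \bigl(g(u),\; g'(u)\, u'\bigr),
\]

so the second slot always carries the linear map h -> g'(u) h applied to the original seed, which is exactly the "derivative as the matrix A of a linear function" picture, with A = 1 for the identity.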

You are the best professor, thank you.

junhuichi

After seeing the Julia language, I think it is best suited to the Machine Learning Researcher or AI Researcher job title.

ravikumar-vrzm

I don't know Julia, so I'm having a hard time following, but from what I can gather, all Professor Edelman did in order to get "forward diff" was to define a "tuple" that holds both f(x) and f'(x). Then he defined some operator overloads so that any iterative algorithm that calculates/approximates f(x) will also calculate f'(x). Is this right? If so, it doesn't seem that Julia would be the only language you can do this in. All you need is polymorphism/overloading, which I think any modern language would support (e.g. C++, Python, which were both "ruled out").

allyourcode
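
That is essentially the mechanism. Below is a minimal, self-contained Julia sketch of it; the type name D, the field names f and df, and the Babylonian square-root iteration are illustrative choices in the spirit of the demo, not a transcription of the lecture's code.

# A dual number: the value f(x) and the derivative f'(x), carried together.
struct D <: Number
    f::Float64
    df::Float64
end

import Base: +, -, *, /, convert, promote_rule

# Each overload applies the corresponding differentiation rule.
+(a::D, b::D) = D(a.f + b.f, a.df + b.df)
-(a::D, b::D) = D(a.f - b.f, a.df - b.df)
*(a::D, b::D) = D(a.f * b.f, a.df * b.f + a.f * b.df)             # product rule
/(a::D, b::D) = D(a.f / b.f, (a.df * b.f - a.f * b.df) / b.f^2)   # quotient rule

# Plain numbers mix in as constants, whose derivative is 0.
convert(::Type{D}, x::Real) = D(float(x), 0.0)
promote_rule(::Type{D}, ::Type{<:Real}) = D

# The Babylonian square-root iteration, written with no knowledge of D.
function babylonian(x; n = 10)
    t = (1 + x) / 2
    for _ in 1:n
        t = (t + x / t) / 2
    end
    t
end

y = babylonian(D(2.0, 1.0))    # seed with derivative dx/dx = 1
# y.f ≈ sqrt(2) ≈ 1.41421..., y.df ≈ 1/(2*sqrt(2)) ≈ 0.35355...

Any language with overloading can express this; where Julia's claim to combine generic, high-level code with high-performance computing comes in is that this dual-number version compiles to specialized code and runs at essentially the speed of the plain floating-point version.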

This type of differentiation seriously blew my mind! :-))) Something I think can be seriously, seriously useful. A bit difficult, though, to understand it just from this presentation....

Where this technique should really score is when you need derivatives of numerical functions that you cannot realistically differentiate by calculus to get high-accuracy derivatives, for example calculating derivatives with respect to parameters when using ODE solvers and the like.

For those you would otherwise have to do finite-difference approximations, which are quite bad (and mean simulating the ODE at least twice, thus effectively doing the same amount of computation), or else, how on earth would you calculate an analytical derivative for the whole chain of operations? That would seem outright crazy, yet maybe not impossible ... ?

Is there any comparison of how well this differentiation method performs with the log function, which is calculated as a continued fraction? I mean, how accurate is the dual component of log(x_dual) compared to 1/x? And in case 1/x is better, would it then make sense to implement a custom derivative part returning 1/x for the dual component?

sschmachtel
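
On the last question: one way to probe it is to attach an explicit rule for log to the dual type, so the derivative slot carries the analytic value u'/u instead of whatever a series or continued-fraction approximation would propagate, and then compare against 1/x. A self-contained sketch (the name Dual and the test point are illustrative, not from the lecture):

struct Dual <: Number
    f::Float64    # value
    df::Float64   # derivative
end

# Custom rule: d/dx log(u) = u'/u.
Base.log(a::Dual) = Dual(log(a.f), a.df / a.f)

x = 3.7
d = log(Dual(x, 1.0))          # seed derivative dx/dx = 1
println(d.df, "  vs  ", 1 / x, "  diff = ", d.df - 1 / x)
# With an explicit rule the two agree to machine precision. If log were
# instead evaluated through dual-overloaded +, *, / on an approximating
# expansion, the derivative slot would be the exact derivative of that
# approximation, so its accuracy would track the accuracy of the expansion.

This is also what automatic-differentiation libraries generally do: elementary functions like log get hand-written derivative rules, and the chain rule only has to stitch those primitives together.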

Mister Alan Edelman, if you can hear me: would it be possible to implement "physical datatypes" that contain a value as well as a unit, where, for example, the units would be multiplied along with the values (in multiplications)? I think this would also be a great relief. I myself work with quantum mechanics, and units are always a problem. I am trying to implement the concept explained above in Python.

qqqqqqqw
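
The same operator-overloading trick extends to units. A minimal sketch of such a "physical datatype" in Julia, where a value carries exponents for metres, seconds, and kilograms and multiplication adds the exponents (the Quantity type here is made up for illustration; for real work the registered Julia package Unitful.jl already provides unit-checked arithmetic):

struct Quantity
    value::Float64
    m::Int     # exponent of metres
    s::Int     # exponent of seconds
    kg::Int    # exponent of kilograms
end

# Multiplication multiplies the values and adds the unit exponents.
Base.:*(a::Quantity, b::Quantity) =
    Quantity(a.value * b.value, a.m + b.m, a.s + b.s, a.kg + b.kg)

# Addition is only defined when the units match.
function Base.:+(a::Quantity, b::Quantity)
    (a.m, a.s, a.kg) == (b.m, b.s, b.kg) ||
        error("cannot add quantities with different units")
    Quantity(a.value + b.value, a.m, a.s, a.kg)
end

velocity = Quantity(3.0, 1, -1, 0)   # 3 m/s
elapsed  = Quantity(2.0, 0,  1, 0)   # 2 s
distance = velocity * elapsed        # 6.0 with exponents (1, 0, 0), i.e. metres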

Prof. Edelman, are you saying that generic programming and support for types and operators (+, -, /, *, etc.) results in these benefits? Or are there other language features in Julia that make this possible? Thank you for making the lecture available to the world.

muppalaneninitin

Alan Edelman is unmistakably erudite in laying out the concordance between Linear Algebra and the applied topics of Machine Learning and Neural Networks.

willjennings