How big does a neural network need to be? | Oriol Vinyals and Lex Fridman


GUEST BIO:
Oriol Vinyals is the Research Director and Deep Learning Lead at DeepMind.

Comments
Author

Shouldn't the question be how dynamic does a neural network need to be? Deterministically computing a fixed bunch of weights seems to preclude higher-order dynamic feedback, such as groups of parameters that can bias groups of weights in real time. It would be a bit like the system having an introspective awareness that can gainfully generate complex algorithmic strategies above the weights, so that outputs can be made to conform to certain requirements more efficiently, rather than needing screeds of training to fill out the solution space. The AI could then take inputs, reimagine them according to some meta-interpretation, re-input the newly transformed meta-state, process it again to make further decisions based on other meta-criteria that internally bias the model, then output, re-input, or perhaps self-modify portions of the model, so the AI is in a constantly evolving quest for some sense of meaning about the whole shebang lol...

blengi
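The idea in the comment above, parameters that inspect the input and bias groups of weights at run time, resembles what the literature calls fast weights or hypernetworks. A minimal sketch of that flavor of architecture, assuming numpy and an entirely hypothetical choice of modulation (a scalar gate per output row):

```python
import numpy as np

rng = np.random.default_rng(0)

# Base network: a single linear layer y = W x.
W = rng.standard_normal((4, 8))

# "Meta" parameters: given the input, they generate a bias applied
# to the base weights on the fly (a toy fast-weights scheme; the
# specific gating form below is an illustrative assumption).
M = rng.standard_normal((4, 8)) * 0.1

def forward(x):
    # The meta-parameters look at the input and produce one gate
    # value per output row of W.
    gate = np.tanh(M @ x)                # shape (4,)
    # Bias each group (row) of weights in real time before using them.
    W_fast = W * (1.0 + gate[:, None])
    return W_fast @ x

x = rng.standard_normal(8)
y_static = W @ x          # ordinary fixed-weight forward pass
y_dynamic = forward(x)    # input-conditioned weights
print(y_static.shape, y_dynamic.shape)  # both (4,)
```

This only illustrates one level of the feedback the comment imagines; the re-input and self-modification loop would wrap `forward` in an outer iteration that feeds its own outputs and updated `M` back in.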
Author

Oriol Vinyals for some reason looks like an absolute gigachad

polyloly
Author

Stuart Hameroff would be a great guest. Orch OR is a compelling theory, and it raises the goalposts for the number of connections needed to emulate human cognition.

deadlevelled
Author

It's strange: we've essentially built the perfect billiards table, racked the balls, and set the cue. The question becomes, how do you make the game play itself with the only input being your break?

compositestechbb
Author

How does mathematics view neural networks? Until now there has been no rigorous study putting a foundation and theorems under them. It's not my field and this is my first look, but I don't like the computer-science view; the neural-network field is too lacking in pure mathematics.

__hannibaalbarca__