The future of AI looks like THIS (& it can learn infinitely)

Liquid neural networks, spiking neural networks, neuromorphic chips. The next generation of AI will be very different.
#ainews #ai #agi #singularity #neuralnetworks #machinelearning

Thanks to our sponsor, Bright Data:
Train your AI models with high-volume, high-quality web data through reliable pipelines, ready-to-use datasets, and scraping APIs.

Viewers who enjoyed this video also tend to like the following:

Here's my equipment, in case you're wondering:

0:00 How current AI works
4:40 Biggest problems with current AI
9:54 Neuroplasticity
11:05 Liquid neural networks
14:19 Benefits and use cases
15:08 Bright Data
16:22 Benefits and use cases continued
21:26 Limitations of LNNs
23:03 Spiking neural networks
26:29 Benefits and use cases
28:57 Limitations of SNNs
30:58 The future
Comments


Seems to me like a lot of people compare the "learning" a human does during its lifetime to the training process of an LLM. I think it would make more sense to compare the training process of a neural network to the evolutionary process of the human being, and the "learning" a human does during its lifetime to in-context learning in an LLM.

enzobarbon

I built a custom spiking neural network to solve a last-mile logistics efficiency problem.

I agree with your assessment:

Very efficient.

Complex logic.

kevinmaillet

How does the human brain actually work? It feels like all the research out there is still incomplete.

wmk

The opening statement is so true. As a student of this field, I think that this is not said enough and anyone not well versed in machine learning just does not get how bad the current situation is.

minefacex

Absolutely would love that video about the Neuromorphic Chips!!

keirapendragon

The human brain developed over a time span of millions of years. How much energy did that process use?

eSKAone-

8:33 Not only is this "human brain" computer more efficient, but I heard the first stages of creating a new instance are pretty fun. Can't confirm, never done it, but they say it is.

olhoTron

1. It's funny that we are trying to create something (AGI) that replicates something else that we do not understand (the human brain).

2. Any neural network that truly emulates the human brain won't need to be trained in the sense you discuss. It would just be. It would learn and be trained by its design. It would start training immediately and continue to train throughout its existence. I don't see us creating something like this anytime soon (see statement #1).

High-Tech-Geek

I was just waiting for somebody to point out the tremendous energy problems of current AI. Thank you.

vladartiomav

First and foremost, I am a biologist, but I have quite an extensive background in computer science as well. I have some fundamental concerns with the efforts to develop AI and the methodologies being used. For these models to have anything like intelligence, they need to be adaptable, and they need memory: some temporal understanding of the world. These efforts with LNNs strike me as attempting to reinvent the wheel.

Our brains are not just a little better at these tasks than the models. They are exponentially better. My cats come pre-assembled with far superior identification, and decision-making systems. Nevertheless, that flexibility and adaptability require an almost innumerable set of 'alignment' layers to regulate behavior, and control impulses. To make a system flexible, and self-referential is to necessarily make it unpredictable. Sometimes the cat bites you. Sometimes you end up with a serial killer.

dmwalker

HOLD ON A SECOND!!! Did you think we wouldn't notice? ☠☠ 3:52

benfrank

We currently simulate neural networks programmatically, which is why they are so inefficient. The problem is, people are so impatient for AGI that they have concentrated all their efforts on achieving it rather than developing an actual neural network.

WJ

Definitely make a video on neuromorphic chips.
And I think the other neural networks outside the scope of this video should have their separate videos as well.

Ding

Spiking neural networks, and whatever neural networks come after them, sound like where AGI and ASI are actually at.

kairi

Glad to hear this. Back in 2010 I was looking for an alternative to the network-graph systems still in use today. Basically, we have a scaled version of decades-old tech that we now have the horsepower to run. I will say current neural matrices are but a partial model of a brain. I studied a portion of grey-matter neural matrices; see the book Spikes by Fred Rieke, David Warland, Rob de Ruyter van Steveninck, and William Bialek. The human brain is exponentially more complex than any scaled multi-million GPU, TPU, CPU system. Good video.

alabamacajun

8:40 "The human brain only uses 175 kWh in a year" - since the human brain cannot work without the body, you have to treat [brain + body] as one entity (which is ~4 times more), unless it's a brain in a jar... but yeah, I guess it's still very efficient.

TheCategor
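The arithmetic behind both figures checks out as a back-of-envelope sketch, assuming the commonly cited ~20 W continuous power draw for the brain and a ~80 W basal metabolic rate for the whole body (both rough approximations, not from the video):

```python
# Back-of-envelope check of the "~175 kWh per year" brain figure.
# Assumption: the brain draws roughly 20 W continuously (a commonly
# cited estimate; the exact value varies by source).
BRAIN_WATTS = 20
HOURS_PER_YEAR = 24 * 365

brain_kwh_per_year = BRAIN_WATTS * HOURS_PER_YEAR / 1000
print(round(brain_kwh_per_year, 1))  # 175.2 kWh

# The commenter's point: the whole body's resting draw is roughly 4x
# that (~80 W basal metabolic rate), so brain + body lands near
# 700 kWh per year.
BODY_WATTS = 80
body_kwh_per_year = BODY_WATTS * HOURS_PER_YEAR / 1000
print(round(body_kwh_per_year, 1))  # 700.8 kWh
```

Either way, the comparison in the video survives: even ~700 kWh/year for an entire human is tiny next to data-center-scale training runs.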

I wrote a spiking neural network from scratch. It can learn, but it's not as efficient at learning as a typical NN, since you can't do gradient descent effectively; instead you need to adjust the neurons based on a reward.

You can backtrace and reward the last neurons and synapses that led to the output you want, but that is limited. It works better when you don't just reward the last ones, but reward according to the desired output. Still, it's pretty cool to run, and it makes nice visualizations.

alkeryn
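The reward-based adjustment this commenter describes can be sketched in a few dozen lines. This is a toy illustration, not the commenter's code: it assumes leaky integrate-and-fire output neurons and a simple "reward according to the desired output" rule; all names, sizes, and constants are made up for the example.

```python
import random

random.seed(0)

N_IN, N_OUT = 4, 2
THRESHOLD = 1.0
DECAY = 0.9          # membrane leak per step (matters when potentials persist)
LEARNING_RATE = 0.05

# weights[i][j]: synapse from input neuron i to output neuron j
weights = [[random.uniform(0.2, 0.8) for _ in range(N_OUT)] for _ in range(N_IN)]

def step(input_spikes, potentials):
    """Advance the output neurons one timestep; return their spikes."""
    out_spikes = []
    for j in range(N_OUT):
        potentials[j] = DECAY * potentials[j] + sum(
            weights[i][j] for i in range(N_IN) if input_spikes[i]
        )
        if potentials[j] >= THRESHOLD:
            out_spikes.append(1)
            potentials[j] = 0.0   # reset membrane potential after a spike
        else:
            out_spikes.append(0)
    return out_spikes

def train_step(input_spikes, target):
    out_spikes = step(input_spikes, [0.0] * N_OUT)
    # Reward according to the desired output: strengthen active synapses
    # into neurons that should have fired but didn't, weaken those into
    # neurons that fired but shouldn't have. No gradient descent involved.
    for j in range(N_OUT):
        reward = target[j] - out_spikes[j]   # +1, 0, or -1
        for i in range(N_IN):
            if input_spikes[i]:
                weights[i][j] += LEARNING_RATE * reward
    return out_spikes

# Teach the net: pattern [1,1,0,0] -> neuron 0, pattern [0,0,1,1] -> neuron 1.
for _ in range(200):
    train_step([1, 1, 0, 0], [1, 0])
    train_step([0, 0, 1, 1], [0, 1])

print(step([1, 1, 0, 0], [0.0, 0.0]))  # after training: [1, 0]
print(step([0, 0, 1, 1], [0.0, 0.0]))  # after training: [0, 1]
```

This also shows the limitation the commenter mentions: the local reward rule works here only because the two patterns use disjoint inputs; credit assignment through deeper spiking layers is exactly where such rules struggle compared to backpropagation.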

Awesome content, as always. I would love to know more about neuromorphic chips. Thanks.

wellbishop

This is my first video from your channel, and I am already impressed!

saurabhbadole