Why the Future of AI & Computers Will Be Analog



Video script and citations:

Get my "Achieve Energy Security with Solar" guide:

Follow-up podcast:

Join the Undecided Discord server:

👋 Support Undecided on Patreon!

⚙️ Gear & Products I Like

Visit my EnergySage Portal (US):
Research solar panels and get quotes for free!

And find heat pump installers near you (US):

Or find community solar near you (US):

For a curated solar buying experience (Canada):
EnergyPal's free personalized quotes:

Tesla Referral Code:
Get 1,000 free supercharging miles
or a discount on Tesla Solar & Powerwalls

👉 Follow Me
Mastodon

X

Instagram

Facebook

Website

📺 YouTube Tools I Recommend
Audio file(s) provided by Epidemic Sound

TubeBuddy

VidIQ

I may earn a small commission for my endorsement or recommendation to products or services linked above, but I wouldn't put them here if I didn't like them. Your purchase helps support the channel and the videos I produce. Thank you.
Comments
Author

(circa 1960) "It would appear that we have reached the limits of what it is possible to achieve with computer technology, although one should be careful with such statements, as they tend to sound pretty silly in 5 years." — John von Neumann

ShawnHCorey

The biggest drawback of analogue circuits is that they are quite specific to a problem, so making something that will work for most generic problems is difficult, whereas a digital computer is trivially easy to repurpose in comparison. But when you need something specific, the analogue computer can be significantly faster and more energy efficient. I look forward to the hybrid components that will be coming out in the future.

billmiller

My analog computer has ten fingers and ten toes.

mikesheahan

Oof. No offense, but as an analog electrical engineer: you should have consulted experts instead of marketing press releases before making the video. Analog engineering is about tradeoffs; you can minimize power consumption, but you will lower almost every other figure of merit. You should also at least mention electrical noise, since it is one of the biggest downsides of analog circuits compared to digital.

santawashere

1:32 No. It may seem like an infinite set, but in practice it's limited by the signal-to-noise ratio. The SNR is effectively the number of "bits" of an analog computer; the rule of thumb is 6 dB ≈ 1 bit. Each component also adds its own noise on top of the signal, so you lose "bits" as your computation becomes more complex. BTW, that is also kinda true of digital computers: if you use floating-point numbers you lose some precision with each rounding. However, on digital it's easier to just use more bits if you need them; on analog, decreasing noise is not so trivial.
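The 6 dB ≈ 1 bit rule of thumb above comes from the ideal-quantizer relation SNR ≈ 6.02·N + 1.76 dB. A minimal sketch of how effective "bits" shrink as a computation deepens; the 3 dB noise cost per analog stage is a made-up illustrative figure:

```python
# Effective number of bits (ENOB) implied by an SNR, via the
# ideal-quantizer relation SNR = 6.02*N + 1.76 dB ("6 dB per bit").
def enob(snr_db):
    return (snr_db - 1.76) / 6.02

snr = 60.0                        # starting SNR in dB, ~9.7 "bits"
for stage in range(1, 5):
    snr -= 3.0                    # assume each analog stage costs 3 dB
    print(f"after stage {stage}: {snr:.0f} dB ≈ {enob(snr):.1f} bits")
```

Four stages in, the same signal path carries roughly two fewer usable bits, which is the "losing bits as the computation gets deeper" effect described above.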

olhoTron

I built and used analogue computers (although I didn't call them that) to control the temperatures in a multi-layered cryostat for my Ph.D. work in the mid '70s. I did the number crunching on a digital computer that filled the basement of the maths dept building, using punch-card input. 😮 Twenty-odd years later I found myself working with an engine management computer for a helicopter that was pure analogue. When I approached the system engineer at a large, well-known aerospace company who had design control authority for the system to ask some fundamental questions about it, he didn't have a clue; he was purely digital. I'm retired now, but if I drag my knowledge out of the closet along with my flared jeans and tie-dyed T-shirts, perhaps I'll come back into fashion. 😁

Legg

I studied analog computers/computing in the 1970s as part of my electrical engineering education. At one time (after that) I worked for a company in Ann Arbor, Michigan that made some of the most powerful analog computers in the world. (I was in marketing by then.) They were used, among other things, to model nuclear reactors and power plants. Incredibly powerful.

brucefay

I think the best way to sum it up is: analog is fast and efficient, but it is hard to design (or at least to formulate the problem for). Once you build it, it is only good at solving that specific problem.
Digital, on the other hand, is much more flexible. You have an instruction set and can solve any problem you can write an algorithm for using those instructions, so you can solve multiple problems with the same machine. The tradeoff is the slower speed and lower efficiency mentioned above (compared to analog).
My favourite story about digital vs. analog is the Iowa-class battleships. They were built in the 1940s and reactivated in the '80s. The fire-control computers (electromechanical analog computers using cams, differentials and whatnot) were state of the art back in the day, but given the invention of the transistor and everything since, the navy did look at upgrading them to digital. What they found is that the digital system did not offer greater accuracy than the analog one. While technology advanced quite a bit over those 40 years, the laws of physics remained the same, so the old analog computers worked just as well.

keresztesbotond

I'm skeptical about the broad assertion - 'Why the Future of AI & Computers Will Be Analog' - that analog computing will dominate in the future. Analog computing clearly has its niche, particularly with tasks involving continuous, real-world signals—such as audio and visual processing, or interpreting sensor data—where this technology presents a clear advantage. However, framing this niche strength as 'the future' and implying a universal superiority over digital computing seems a bit overstated to me.

BenGrimm

This video takes the cake for the most buzzwords and the least substance in a video that isn't solely about 'AI'.

ligerstripe

Great video, thanks for sharing. The biggest problem with analog computers is that there are so few people who know how to work on them. I'm reminded of a hydroelectric plant I toured once that had an electromechanical analog computer controlling the units. At the time I visited, it was already considered ancient, and they were actively attempting to replace it simply because nobody knew how it worked. They only knew how to turn it on and off and wind the clock spring once a shift, the exact number of spins it needed to keep running. They had been trying to replace it with a new computer, but none of the many attempts could match its precision in operating the plant and maintaining proper water flows. They were in constant fear that it might break. I checked back maybe 20 years later to ask how it was, and no one working there knew what I was talking about. Sad that it was long forgotten by everyone at the plant. I thought it should have been retired to a museum, and I still hope that possibly it was.

tbix

I started my career with analogue computers in the 1970s, when they were still being used in industrial automation for motor control (PID: Proportional, Integral and Differential). I worked in a repair centre and built some test gear to allow me to calibrate them. It's no surprise to me that they have come back; within certain niche applications they are very powerful, although not particularly programmable, unless you count circuit design as programming :-)
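The PID loop mentioned above is easy to sketch digitally; this is the same Proportional + Integral + Derivative computation an analogue PID module performs with op-amps. The gains, timestep, and toy motor model here are illustrative, not taken from any real controller:

```python
# Minimal discrete-time PID controller. On an analogue PID module the
# integral and derivative terms are op-amp integrator/differentiator
# stages; here they are finite differences with timestep dt.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)

# Drive a crude first-order motor model toward a speed setpoint.
pid = PID(kp=2.0, ki=1.0, kd=0.1, dt=0.01)
speed = 0.0
for _ in range(3000):                 # 30 simulated seconds
    u = pid.update(setpoint=100.0, measured=speed)
    speed += (u - speed) * 0.01       # plant: speed chases drive u
```

The integral term is what removes the steady-state offset a pure proportional controller would leave; by the end of the run the speed has settled at the setpoint.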

offbeatinstruments

Like, the first 75% of this video talks about what analog computers are not. Add some snappy comments and a twist, and you still don't come close to what the video headline promises.

Hippida

Analogue computing is analogous to the P vs NP problem in pure mathematics: it is fantastic at anything which is hard to calculate but quick to check. In this case, anything that is hard to figure out how to express, but that has solidly defined parameters.

It works by shunting a good deal of the difficulty of solving the problem up front, onto the designers of the machine. It can take years of incredibly hard work to figure out how to express a single problem in analogue form, but once you DO, computing the answer for any combination or variant of the problem is virtually instantaneous.
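A classic example of that up-front design work is patching integrators together to solve a differential equation. A digital re-creation of a two-integrator analogue patch for the harmonic oscillator x'' = -ω²x might look like this (ω, dt, and the initial conditions are illustrative):

```python
import math

# Digital re-creation of a two-integrator analogue patch solving
# x'' = -omega^2 * x. On a real analogue computer the two op-amp
# integrators run continuously; here we step them with a small dt.
omega, dt = 2.0, 1e-4
x, v = 1.0, 0.0                      # initial position and velocity
for _ in range(int(1.0 / dt)):       # integrate one second
    v += -omega**2 * x * dt          # first integrator: accel -> v
    x += v * dt                      # second integrator: v -> x

print(abs(x - math.cos(omega * 1.0)))  # error vs. the exact solution
```

The hard part, as the comment says, is the wiring diagram, i.e. expressing the equation as a feedback loop of integrators; once patched, the machine produces the trajectory continuously instead of step by step.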

icedreamer

Gave a talk at DataVersity on context. Explained that in the '70s we could build a nuclear power plant instrument panel with gauges of different ranges; this allowed the techs to scan the whole wall and immediately see if anything looked out of normal range (normal was usually a straight vertical needle). However, everyone wanted digital, not realizing that with that change each number had to be individually read, consciously scaled and then thought about (compared with 'normal'). With digital came the necessity for alarms, because it took too much mental effort to scan the wall. Something few consider to this day...
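The scaled-gauge idea above can be sketched in a few lines: map each reading into its own normal band so every instrument reads on the same 0-to-1 scale, which is what the wall of differently-ranged gauges did implicitly. The instrument names, ranges, and readings here are invented for illustration:

```python
# Normalize each reading against its own normal operating band so
# "all needles vertical" becomes one glance at a common 0..1 scale.
# Values are hypothetical, chosen only to show one out-of-range case.
instruments = {
    "coolant_temp_C":    (280.0, 320.0, 301.0),   # (low, high, reading)
    "loop_pressure_MPa": (14.0, 16.0, 15.2),
    "flow_rate_kg_s":    (4500.0, 5200.0, 5600.0),
}

for name, (low, high, value) in instruments.items():
    pos = (value - low) / (high - low)   # 0..1 inside the normal band
    flag = "" if 0.0 <= pos <= 1.0 else "  <-- OUT OF RANGE"
    print(f"{name:18s} {pos:5.2f}{flag}")
```

Digital readouts drop this normalization, so each operator must redo it mentally per number, which is why alarms became necessary.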

johnmiglautsch

An analog clock is not continuous; its movement is a function of the internal gear ratios, and the hands move in discrete, quantifiable steps.

DantalionNl

I wish there was a betting pool for every time Matt Ferrell was proven incorrect because he believed some marketing hype. I would be a billionaire.

nitehawk

Personally I think the Order of Mentats is the future of computing

sigis

lol no. A nightmare to debug this thing.

youtubehandlesux

Whenever I design electronics, I often use analog preprocessing, since it takes much less energy to amplify, integrate or filter signals with op-amps (using summers, PT1 and PT2 models, integrators and differentiators) than to use convolutions or FFTs to build FIR or IIR filters, which need a lot of processing power.
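A PT1 (first-order lag) is the simplest case of the analog preprocessing described above: one RC stage or a single op-amp in hardware, and a one-line IIR filter in the digital domain. A minimal sketch, with an illustrative cutoff frequency and sample rate:

```python
import math

# PT1 (first-order low-pass): a resistor and capacitor in analogue
# hardware, a single multiply-accumulate per sample in digital.
fs = 1000.0                                    # sample rate, Hz
fc = 10.0                                      # cutoff frequency, Hz
alpha = 1 - math.exp(-2 * math.pi * fc / fs)   # smoothing factor

def pt1(samples, alpha):
    y, out = 0.0, []
    for x in samples:
        y += alpha * (x - y)   # y[n] = y[n-1] + alpha*(x[n] - y[n-1])
        out.append(y)
    return out

# Step input: the output climbs toward 1 with time constant 1/(2*pi*fc).
step = [1.0] * 1000
y = pt1(step, alpha)
```

Digitally this costs one multiply-accumulate per sample per filter; the analog RC stage does the same job passively, which is the energy argument the comment is making.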

debrainwasher