Analog computing will take over 30 billion devices by 2040. Wtf does that mean? | Hard Reset


About the episode: This model of computing would use 1/1000th of the energy today’s computers do. So why aren’t we using it?

What if the next big technology was actually a pretty old technology? The first computers ever built were analog, and a return to analog processing might allow us to rebuild computing entirely.

Analog computing could offer the same programmability and performance as the digital standard while using roughly 1/1000th of the energy.

But would switching from digital to analog change how we interact with our technology? Aspinity is tackling the major hurdles to optimize the future landscape of computing.

◠◠◠◠◠◠◠◠◠◠◠◠◠◠◠◠◠◠
Read more of our stories on future technology:
Inventions that are fighting the rise of facial recognition technology
The technology we (or aliens) need for long-distance interstellar travel
3 emerging technologies that will give renewable energy storage a boost
◡◡◡◡◡◡◡◡◡◡◡◡◡◡◡◡◡◡◡

Watch our original series:

◠◠◠◠◠◠◠◠◠◠◠◠◠◠◠◠◠◠◠
About Freethink
No politics, no gossip, no cynics. At Freethink, we believe the daily news should inspire people to build a better world. While most media is fueled by toxic politics and negativity, we focus on solutions: the smartest people, the biggest ideas, and the most groundbreaking technology shaping our future.
◡◡◡◡◡◡◡◡◡◡◡◡◡◡◡◡◡◡◡

Enjoy Freethink on your favorite platforms:
Comments

Noise is by far the biggest issue with analog computing: it accumulates at every step, and getting rid of it is very complicated. I would have loved to hear how they deal with it, but I guess they wanted to attract investors with buzzwords instead.

lbgstzockt
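The accumulation this commenter describes can be sketched as a toy random walk (my own illustration, with made-up noise and quantization figures, not anything from the video): each analog stage adds a small error that is never removed, while a digital path snaps the value back to a clean grid after every stage, so per-stage errors never pile up.

```python
import random

random.seed(0)

N_STAGES = 100
NOISE = 0.01   # assumed per-stage noise amplitude
STEP = 0.05    # assumed quantization step for the digital path

signal = 0.5
analog = signal
digital = signal
for _ in range(N_STAGES):
    # Both paths pick up the same kind of per-stage error...
    analog += random.uniform(-NOISE, NOISE)   # analog: error is never removed
    digital += random.uniform(-NOISE, NOISE)
    # ...but the digital path re-quantizes, resetting the error each stage.
    digital = round(digital / STEP) * STEP

print(f"analog drift:  {abs(analog - signal):.4f}")
print(f"digital drift: {abs(digital - signal):.4f}")
```

Because the per-stage noise (0.01) stays below half the quantization step (0.025), the digital value is restored exactly every stage, while the analog value random-walks away from the true signal.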

That was pretty low on content and very high on marketing talk. Next time, please be more specific and in-depth and provide factual information. Otherwise viewers will mistake your video for an advertisement.

stupidPresident

I love the certainty with which they said "30 billion units in 10-ish years" and the vagueness about how it actually works: "Yeah, a bit of analogue and a bit of digital." My degree is in robotics and mechatronics engineering, so I'm interested in this topic, but I learnt nothing from this video.

Martian

We've had programmable analog arrays for a while now. They're similar to FPGAs, but with analog building blocks, and they generally haven't seen much market adoption, since most mechanisms that need analog electronics can be made more cost-effective, efficient, and easy to produce by just building the necessary analog circuits as purpose-built systems. There are definitely interesting applications for field-programmable analog arrays, but I think they're largely overestimating their general usefulness - understandable, since they want to make money!

Taskulare

Great, now make it available to the hobbyists so we can start exploring the possibilities.

devrim-oguz

This is yet more proof that you can sell "nothing" to the majority with a good presentation.

mehmetedex

Okay, but how does it work? This is just a long marketing video

tyleri.

I want my 11.75 minutes back. There's far too much "let me sell you a dream" and too little actual information in this video.

CH.LSDuigi

"We have solved the voltage offset problem, but we won't tell you anything on how we do it" Great. That was the the most important thing to know.

ryudragon

Every time they say "wake up a digital chip to do communication", you should know that they are selling snake oil.
Yes, everything in the real world is analog, and so is all communication, including WiFi and Ethernet; the physical layer is always analog.
If there was any substantial info in this video, I would have hoped for them to show a minimal sensor reading transmitted wirelessly (433 MHz or similar). It doesn't have to be WiFi, but that's a great buzzword that everyone knows...
And simply repeating "it will be more efficient" doesn't strengthen the claim at all.

All the stuff about analog co-processors that wake up larger digital ones, e.g. voice activation, is already done with digital coprocessors.

raw_

I don't know enough to call them out on any specifics, but the whole way through the video a bunch of red flags were going off in the back of my mind, and the fact that at no point did they explain how this thing works just makes me doubt that it works at all.
You can't simply say that you addressed the issue of analog signals degrading over time without any further insight into how. And what about the general noisiness of analog? I remember being taught that the problem with doing calculations on analog measurements is that small errors accumulate into much larger errors very quickly, hence why we convert values into fixed (digital) numbers first. So this would only be useful for simple tasks like comparing input against predefined patterns, and it would need to call a digital system to do anything even remotely complex.
I'd love to be wrong and see this actually take over the world in a decade or so; it'd be really cool. But for now I'm going to be cautiously skeptical about the whole deal.

axelprino

CEO: "AI and machine-learning algorithms"

Investors: Shut up and take our money!

joeschembrie

As an analog design guy, I know very well how hard it is to tune an ASIC for each use case. Even with a lot of programmability, it will require a lot of designers, and those designers aren't available, because the training is hard.

hellsing

I've been intrigued by the potential of analog computing since I played around with classic analog computers in college. That said, I'm not so sure power savings for sensing will be the driving force in adoption of this new version of the technology. Perhaps the most promising opportunity is actually in AI inference engines, which rely on comparison and relative closeness rather than absolutes.

johnhorner

The board, the chip, the battery: looks normal. The speaker (the CEO?) is maybe overselling the revolution?

jonr

I watched the whole thing and heard a bunch of "this technology". What is it, though? How does it work? Is there a paper I can read that tells me how they are doing this? The power consumed by an analog-to-digital converter isn't significant enough to claim this much power savings. I had to search for details on their AML100 chip to figure out what they are talking about. Adding a bit of technical detail or providing links for those interested would be a good idea in the future. Without that info, it is difficult to decide whether this is pathbreaking technology or just another Turbo Encabulator.

frustratedalien

Not that I don't believe analog could be more power-efficient, but it would have helped if you did some back-of-the-napkin calculations to illustrate by how much.

ropro
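A quick sketch of the kind of back-of-the-napkin calculation this commenter asks for, where every number is my own rough assumption (the video gives none): compare an always-on low-power digital listener against an analog front end that only wakes the digital chip on candidate events.

```python
# All figures below are assumed ballpark values, not measurements.
DIGITAL_ALWAYS_ON_MW = 1.0   # assumed: low-power digital DSP listening 24/7
ANALOG_LISTEN_MW = 0.02      # assumed: analog front end, ~50x lower draw
WAKE_DUTY_CYCLE = 0.01       # assumed: digital chip active 1% of the time
BATTERY_MWH = 1000           # assumed: ~1 Wh battery budget

# Average power for each architecture.
digital_only = DIGITAL_ALWAYS_ON_MW
analog_first = ANALOG_LISTEN_MW + DIGITAL_ALWAYS_ON_MW * WAKE_DUTY_CYCLE

hours_digital = BATTERY_MWH / digital_only
hours_analog = BATTERY_MWH / analog_first

print(f"digital-only: {hours_digital:.0f} h (~{hours_digital / 24:.0f} days)")
print(f"analog-first: {hours_analog:.0f} h (~{hours_analog / 24 / 365:.1f} years)")
print(f"improvement:  {hours_analog / hours_digital:.0f}x")
```

Under these assumptions the battery life gain is about 33x, not 1000x: once the digital chip wakes even 1% of the time, its duty-cycled draw dominates, which is why the headline figure deserves exactly this kind of scrutiny.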

Why do I get the sense I'm watching an infomercial?

WickerDuck

I do believe analog is the right choice for things like machine learning, since it has a probabilistic foundation. But for practically any other case, digital is far better; the rise of digital was mostly due to its help in the discrete computation of hard numbers in banking and offices, and that is never going away.

ChileanRaccoon

Analog never went away, don't worry :-). Every tech company has someone who knows their way around analog filters, oscillators, and control systems as a bare minimum. The market for discrete components is still growing. Digital is simply quite often the cheaper solution, as it takes less time to develop and is highly configurable, but wherever cost for mass production, reliability, and longevity play a role, analog is still there: some industrial PI(D) controllers, low-noise power supplies, RF amplifiers, calibration equipment... In some cases only the interface is digital; the part that does the thinking is fully analog.

I'm absolutely of the opinion that most problems in electrical engineering can be solved with analog systems; as a hobby, I enjoy designing circuitry to tackle these things. But working for a company where electronics design has to cover the paychecks of me and my colleagues, you need to pick whatever is cheapest and fastest to produce, and a digital system is usually preferred if it's nothing too critical. Digital systems are very robust as well: you can effectively store values as binary data rather than voltages, which is very resistant to electrical noise! With analog systems you're constantly thinking about shielding, ground loops, ground planes, crosstalk from parallel wires or traces, manufacturing tolerances of components, power consumption, non-linearities, parasitic capacitance and inductance, trace resistance, bandwidths, temperature drift... Digital circuitry just works straight out of the box and is resistant to almost all of these things.

Analog technology is big in footprint, too. You're not working with components that just switch on or off; you work with components optimized for linearity, tolerance, matching, logarithmic behaviour, or temperature stability, so strictly speaking, for active components, the surface area such a component needs is much bigger. The more accurate your computation has to be, the more parts you need to add to your topology to stay within your tolerances. We sometimes ovenize parts for stability, or slap a heating element right next to them (a resistor or a carefully "shorted" transistor), simply to fight these things. Digital systems behave the same over a very wide temperature range without losing accuracy, so in many applications digital systems may be much more power-efficient.

I think the rise of "analog computing" is mainly a terminology thing. Yes, an analog control system is actively computing in real time, solving complicated differential equations, but since it is inherently meant to do that, you don't say it is computing; it is stabilizing the system. You don't refer to it as a computer, but simply as a controller. We don't call microcontrollers computers either, anyway. "Analog computing" is nowadays used as a buzzword. Real analog computers ARE highly configurable instruments, but they are also very impractical, and they were competed out of the market many decades ago for that reason.

Besides this, education tailors its programs toward the demands of research and industry. Digital systems have way more practical use cases, so when you study electrical engineering, most of the courses will be centered on digital systems. The obvious result is that most people, by percentage, will specialize in digital electronics. This is not a bad thing; there are still many people whose first projects were building guitar amplifiers or radios and who just sit through their education in order to go back to analog afterwards.

Analog and digital signal processing is an age-old, established marriage; it was and still is a big field of research and education, resulting in an insanely large number of DACs and ADCs being pushed to the market to facilitate it.

Although I am very passionate about analog electronics design myself, I think it is incredibly important to also understand its limitations and impracticalities. There are a lot of benefits, but also a BIG set of impracticalities you do not put forward. Videos like these leave me very divided. On one hand, I love seeing analog represented outside of the audiophile world in a community dominated by microcontrollers; on the other hand, I feel like you are wearing blinders and underestimating the number of analog specialists still walking around today. There are literally conventions for this stuff being held everywhere, and breakthroughs are still being made by companies pushing better and better analog components to the market.


Analog will never leave the tech industry. As long as the world we live in remains analog, the specialization will exist, and the number of people working in it will grow or shrink with demand and practicality.

BeesKneesBenjamin