Is Moore's Law Broken?

For 50 years, Moore's Law has set the pace of computer innovation. Now it's breaking down. Will a leap forward in technology save Moore's Law or is it forever broken?

Super special thanks to our Patreon Patrons:
Colin Young, Joseph Hegeman, Andrew Arrabaca, Dan Goodes, Jeff Brice, Matt Altieri, Cristina Quiroz, Torstein, Jeremy Nauta, Chris Hicks, Lars Hermann, Max, Yoselin Gallegos, and Ashley Beraneck.

►Follow us on Instagram: goodstuffshow

Music by

Todd Umhoefer (Old Earth)

Oscillator Bug

Chris Zabriskie

Kevin MacLeod

Jason Shaw

REFERENCES:
Comments

The background music in the past few episodes has been *amazing*.

heathmccasland

So... a few really weird things about following the tech industry and metrics like performance, efficiency and (of course) Moore's Law.
1) While mainstream consumer x86 performance has slowed down dramatically over the last 5 years, the same is not true of all computing technologies. GPU performance is still improving dramatically, and ARM-based processors (used in everything from microwaves to cars to smartphones and tablets) improved very slowly in the past, but are now moving at a much faster, Moore's-like rate.
2) Mentioned, but not explored much here, is core count. While going to 6+ cores makes little to no difference in day-to-day computing, it brings HUGE improvements in the server market. Last year, at the school district I work for, I consolidated 14 physical servers down to 4: one is a backup server, one is a file server, and two host VMs (virtual machines) standing in for the other 12 boxes. Sure, individual core performance is not improving much over time, but massively parallel cores let you consolidate the workloads of multiple machines and users onto a single box with little to no impact on performance. This means end-user devices can get smaller, lighter, and more gutless as their workloads are off-loaded to 'the cloud', and that cloud can keep getting physically smaller as density improves, even while the load increases dramatically.
3) While a lot of Moore's Law was self-fulfilling prophecy, a lot of it was also driven by a specific need for more power. Angry Birds and Slither.io do not drive the overall economy; MS Office and Google Chrome are what move it. About 5 years ago we hit a wall of 'good enough'. It no longer took 3-5 minutes to open a Word document; it took a mere 2-5 seconds. And while shrinking that to 1-2.5 seconds is a doubling in speed, it is no longer a practical difference: whether something takes 1-2 seconds or 5 seconds barely matters to the user. With very little practical efficiency to be gained from CPU improvements in these bulk-user tasks, there is less and less motivation to make improvements, especially as they become more and more difficult to make.
4) In more recent years the 'new' definition of Moore's Law seems to be measured (at least by Intel) in performance per watt. Now, I am not always inclined to follow this line of thinking, but they are not entirely wrong either. While raw per-core performance is at a standstill, efficiency (under load and at idle) has been improving very quickly. A lot of this is due to the impending collision between the x86 and ARM architectures. ARM has always been very good at drawing little power, but not particularly efficient with heavy workloads, which is why it has been great for phones and tablets that need minimal power draw but don't have large, heavy workloads (and why, when you do give a phone a heavy workload, it drinks your battery so quickly). x86, on the other hand, has generally been very efficient with heavy workloads, but is by its nature power hungry. But as phones do more and more work, ARM has been getting more and more efficient with big tasks, which has been giving Intel (and poor, poor AMD) a run for their money in more and more markets. Intel has reacted by trying to get x86 into smaller, lower-power packages. Overall, this means end-user performance has stood still, but the machines doing these workloads have shrunk over the last 5 years from an inch thick to mere millimeters!
5) Lastly, improvements in performance appear to be in mixed architecture setups. Soon we will see computers with a mix of ARM, x86, and GPUs all working together on the same pool of data as a single intelligent system. This is already happening where the x86 CPU does all of the general compute, and ARM chips run the various drives and add-in cards in the system. The difference is that all of these will be on the CPU, running your OS on the ARM part of the chip, and then farming out jobs to x86 and GPU cores as needed.

Anywho, just food for thought :D

CaedenV

As a computer engineer I must say this video was very well informed. Well done on your research....

mormegil

If technological advances plateau, would that mean YouTube content creators would stop lusting after more advanced equipment and have to focus more on (gasp!) the content they were creating? If it happens, they can start by looking at The Good Stuff, because you guys always rock, one way or another.

juststeveschannel

Back in the days when computers had RAM measured in kilobytes, computer scientists took every step possible to reduce the size of a program. The larger the program, the more expensive the computer needed to run it.
I'm thinking that, while consumer programs will likely stay about as they are now, programs meant for industry will start advertising how many cycles per computation they take. Because if you have trillions of data points to compute, reducing a program's run time by a few hundred cycles could mean saving hundreds of hours of computer time.
I also think cloud computing will continue to grow. We might need a version of Moore's law for the cost of buying time on a cloud based server.

tylerpeterson
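The cycles-per-computation point above can be made concrete with a back-of-the-envelope calculation. All the numbers below (clock speed, data volume, cycles saved) are invented for illustration, not taken from any real workload:

```python
# Hypothetical: core-hours saved by shaving cycles off an inner loop
# that runs once per data point. All numbers are illustrative.

CLOCK_HZ = 3e9        # assume a 3 GHz core
DATA_POINTS = 1e12    # one trillion computations
CYCLES_SAVED = 300    # assume ~300 cycles saved per computation

seconds_saved = DATA_POINTS * CYCLES_SAVED / CLOCK_HZ
hours_saved = seconds_saved / 3600

print(f"{hours_saved:.1f} core-hours saved")  # 27.8 core-hours saved
```

Even a modest per-item saving compounds into days of machine time once the workload is spread across a cluster, which is exactly the industrial incentive the comment describes.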

This is a great topic to cover. Please post more like this in the future. Excellent!!!

BeyondBorders

This deserves many more views. I love you guys' channel, keep it up!

stuffandthings

Legend has it that if you comment fast enough, The Good Stuff will respond

macec

Although I am worried about where things are going to go, I'm also really excited to experience the future of computing and technology!

misterirontoe

Very interesting look on Moore's Law. Thanks, Good Stuff.

benaaronmusic

I have been noticing the shift from processing speed to flashier design and apps as well. But as they say, when one door closes another one opens, so maybe once we finally reach the end of Moore's Law, computer experts can focus on making technology even more useful and "smart".

vasilsimeonov

That second limitation of Moore's Law, the one related to transistor size and the quantum effects that would make a transistor's behavior unpredictable, should have been clarified further in the video. But yeah, in general it's an informative video. Thanks, The Good Stuff.

ghassensmaoui

The issue with the current tech is that the threshold voltage for dielectric breakdown of the gate oxide is very near the transistor's switching voltage. So there are maybe one or two more steps down in gate width before MOSFETs hit their limit.

This is also what has limited clock speeds, as the gate width determines the maximum switching speed.

The odd thing with Moore's Law is that every time it looks like there is a roadblock and it is about to fail, a new way to improve performance has been found.

First, BJTs were the transistors, and they reached their limits; then architectures improved for a few years before MOSFETs replaced them. Now MOSFETs are at their limits, and architectures and better manufacturing are driving the gains. Ten years ago a company could have tried to make an eight-core chip, and maybe 1 in 10,000 to 10,000,000 would have worked.

The difference this time is that we don't know what will replace the MOSFETs: single-atom switches, graphene, optical switches, even a transistor based on tunnelling. Nobody knows, but we will know in 5 years' time and be using them in 10.

hart-of-gold
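The "maybe 1 in 10,000 would have worked" yield problem above can be sketched with a toy independence model. The per-core probabilities here are invented for illustration, not real process data: if each core on a die works with probability p, a chip that needs all n cores functional works with probability p^n.

```python
# Toy yield model: probability that all n cores on a die are functional,
# assuming each core independently works with probability p.
# The probabilities below are illustrative, not real fab data.

def chip_yield(p: float, n: int) -> float:
    return p ** n

# A poor per-core yield makes an 8-core chip almost hopeless:
print(chip_yield(0.30, 8))   # ~6.6e-05, roughly 1 working die in 15,000
# A mature process changes the picture entirely:
print(chip_yield(0.99, 8))   # ~0.92
```

Real fabs soften this by selling partially working dies as lower-core-count parts, but the exponential scaling is why high core counts had to wait for mature processes.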

Really wish these guys had at least 300,000+ subscribers.

NovaGN

There are a few things that can be done to improve CPU speed without shrinking the transistors. One is 3D stacking (multiple layers of transistors), but it is incredibly expensive: creating a mask (think of it as a mold for the CPU) already costs more than 100 million dollars, not counting the design work itself, it has to be replaced for each new generation, and twice the layers means twice the masks.
Another is a fresh start: a brand new architecture. Windows and Intel have always considered backward compatibility very important, so old functionality is kept around. But that means software companies have to take that functionality into account; even when a feature is unnecessary by now, it must stay, because newer software is built around the fact that it exists. This might be surprising, but your 64-bit CPU still has some 16-bit features. Intel tried to clean this up when moving past x86 to 64-bit CPUs: they created a brand new architecture known as EPIC (Explicitly Parallel Instruction Computing). It worked great, but it didn't sell because it wasn't compatible with old software. Then AMD came along, took the x86 design, added more bits, and kept all the old baggage, and people bought it because it was compatible.
I'm running out of time now, but if anyone is interested I can continue later.

samramdebest

I know a replacement for Moore's law!
The observation that the number of transistors in a dense integrated circuit doesn't double approximately every two years.
I know, I'm a genius.

cyancoyote
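For reference, the doubling that the original law describes is plain exponential growth with a roughly two-year period. A quick sketch, starting from the Intel 4004's ~2,300 transistors (1971):

```python
# Moore's Law as an exponential: transistor count doubles every ~2 years.

def projected_transistors(initial: float, years: float,
                          doubling_period: float = 2.0) -> float:
    return initial * 2 ** (years / doubling_period)

# Project the Intel 4004's 2,300 transistors forward 50 years:
print(f"{projected_transistors(2300, 50):,.0f}")  # 77,175,193,600 (~77 billion)
```

That ~77 billion projection lands in the same ballpark as the largest chips actually shipping five decades later, which is why the law held up for as long as it did.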

Hmm, the animation at 3:58 doesn't have anything to do with transistor size. It's a visualization of MOSFET conduction at threshold voltage...

auwdioslave

One thing to keep in mind when it comes to simply making flashier apps or features: the consumer market is not the whole pie. Commercial, industrial, and server markets are all big money makers, and they are less influenced by shiny new stuff.

neeneko

Why does this channel not have more views? Decent blokes, decent topics.

RNRCLEEDS

Funny how I was aware of the resource crisis and of Moore's Law breaking down, but never put the two together. I think it's high time we all sat down and had a good talk about making a future for mankind in which everything doesn't automatically get X% more/better each year.

the_neutral_container