Build a PC while you still can - PCs are changing whether we like it or not.

Arm CPUs are taking over. Apple Silicon showed us that desktop computers need not be power hogs - so why haven't AMD, Intel, and Nvidia done the same, and would you want them to?

MUSIC CREDIT
---------------------------------------------------
Intro: Laszlo - Supernova

Outro: Approaching Nirvana - Sugar High

CHAPTERS
---------------------------------------------------
0:00 Intro
0:53 PCs are great, but...
2:05 They still run on old tech
2:53 Intel and AMD are in trouble, but that's a good thing
5:27 Contracts, monopolies, and Qualcomm's failure to compete
6:32 PCs have already taken the first steps to change
8:14 Arm's advantages are integrated
10:10 Arm chips can go bigger
11:19 Conclusion - This is going to suck for enthusiasts
COMMENTS
---------------------------------------------------

This all sounds like absolute hell for consumers' rights, repairability, upgradability, and overall variety in the PC space.

How

Hearing this, I imagine a dark future where PCs are handled like phones today. "Sorry, your 2-year-old PC is now irrelevant because we're not giving it any more updates."

Lossy

My sole complaint about SoC systems is how frequently their makers seem to not give a shit about long-term maintenance, expecting you to just buy a new model rather than maintain or reconfigure your existing machine.

HunterDrone

I feel like the goal for them is to turn computer ownership into phone ownership. You buy a new box every year that you don't own, just lease from that manufacturer.

Also this is unrelated to the content but man this guy is a good speaker. Great to listen to.

HonkeyKongLive

While I can see this change as an inevitability, I just hope serviceability and upgradeability won't be hit as hard as I think they will be.

nabusvco

Sometimes I feel a little old-fashioned for still using a tower for my main computing when I'm not much of a gamer. But I still really like having one because of how I can customize it. I love being able to swap out individual components, or even add new ones. You really can't mix and match with a laptop, and especially not a phone.

BorlandC

I'm watching this on a gaming PC I built literally 10 years ago. It was probably low/mid range even at that point, costing me roughly $700 (in 2012 dollars, mind you lol). I upgraded the GPU about halfway through that time for ~$200. A couple years ago I put an aftermarket CPU cooler in for another ~$50. I've only just now started to run into games that my system *can't* run. I'll confess a lot of newer games I have to run on low to minimum settings, but it can run them. Some of the newest games will start to cause heat problems after a couple hours of play even on low settings. But come on, in that same time period I've gone through 5 laptops that I've used for little more than word processing. I'm on my third Roku stick in two years because the first one died and the second one just wasn't being supported anymore. I'm personally terrified of the PC market going the way of consoles or all-in-ones.

hourglass

I believe that regardless of how efficient ARM and its related technologies can be, there will always be a demand for individual components.

GamebossUKB

My issue with the idea of the everything-chip: say you're 2 generations down the line and want to upgrade your graphics, but the CPU side is still chugging along fine. Having to replace the whole thing is not only wasteful, it's more expensive. The same goes for a component dying.

Aefweard

And the great thing about a small, integrated system is that when it breaks you get to buy a whole new system!

Wait...

Fatty

Linus, I've been in the tech field since 2006. Been watching you since 2007 or so. You have a lot of great staff, but Anthony is special. His ability to articulate facts while staying concise, his pool of knowledge in the market space, his understanding of the competitive analysis - these are all very strong assets. Treat this man well.

sonoftherooshooter

Has to be one of the best presenters on LTT (apart from Linus, that is).
Give him more screen time; he's so clear and well spoken.

MrDJHarrison

I won't ever give up the ability to modify my computer.

DickWaggles

5:30 Anthony, this is a very well put together video, but Apple fails to disclose one key point when talking about efficiency, and most reviewers miss this VERY key fact:
Apple is using TSMC 5nm, whereas Nvidia is using Samsung 8nm and AMD is using TSMC 7nm (TSMC says 5nm offers a 20% improvement in performance per watt over 7nm), and Intel is using Intel 10nm, now branded 'Intel 7' (almost as good as TSMC 7nm).
To put your comparison of the 3090 and the M1 Ultra into perspective: if the M1 Ultra were built on the same Samsung 8nm silicon, the die would be over 4x the size of the 5nm M1 Ultra and would use as much as 12x the power to reach the same performance (edit: most likely it would only use 6-8x more power).
Samsung 8nm has roughly 44 million transistors/mm², whereas TSMC 5nm has ~186 million/mm².
To put it another way, the 3090, ported to TSMC 5nm, would be less than 1/4 the size, might use as little as 80 W, and would be only ~50% larger than the base model M1, as it only has ~50% more transistors.
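A quick back-of-the-envelope check of those density figures (a sketch in Python using only the numbers quoted above, plus an assumed ~628 mm² die area for the 3090's GA102, which isn't stated in this comment):

samsung_8nm_density = 44e6   # transistors per mm^2, as quoted above
tsmc_5nm_density = 186e6     # transistors per mm^2, as quoted above

density_ratio = tsmc_5nm_density / samsung_8nm_density
print(f"5nm vs 8nm density ratio: {density_ratio:.1f}x")   # ~4.2x, hence the "over 4x the size" claim

ga102_area_8nm = 628.0                                      # mm^2, assumed GA102 die area (not from the comment)
ga102_area_5nm = ga102_area_8nm / density_ratio
print(f"Hypothetical 3090 die on 5nm: ~{ga102_area_5nm:.0f} mm^2")   # ~149 mm^2, under 1/4 the original size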

ARM is only "vastly more efficient" than Intel processors that were stuck on 14nm for 6 years.

Apple, on paper, is less efficient than AMD. But we'll have to wait for 5nm Zen4 to get official numbers.

I fell into this trap with my M1 Mac Mini. I thought I was getting something I could dedicate to transcoding recorded TV shows from MPEG2 to H265. But it turns out I'd have been better off getting a cheaper AMD-based system with a 4750U/4800U.
My work laptop (ThinkPad L15 Gen 1 with a 4750U) is not only slightly faster at transcoding, it is also more efficient than my M1 Mac Mini.
Here are the numbers for transcoding an hour-long news segment using the same settings on both devices, and yes, I was using M1-native applications.
M1 Mac Mini: 1 hour 9 minutes, 33Wh from the wall
4750U ThinkPad: 1 hour 4 minutes, 28Wh from the wall
What is crazy is that not only is the 4750U more efficient, it does this while having to power a display and a loud micro-fan, and it doesn't have any of the high-efficiency integrated parts, instead using a removable M.2 drive and removable RAM.
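Working those measurements through (a sketch in Python, using only the run times and wall energy figures quoted above):

m1_minutes, m1_wh = 69, 33      # M1 Mac Mini: 1 h 9 min, 33 Wh from the wall
tp_minutes, tp_wh = 64, 28      # 4750U ThinkPad: 1 h 4 min, 28 Wh from the wall

for name, minutes, wh in [("M1 Mac Mini", m1_minutes, m1_wh),
                          ("4750U ThinkPad", tp_minutes, tp_wh)]:
    avg_watts = wh / (minutes / 60)
    print(f"{name}: ~{avg_watts:.1f} W average at the wall, {wh} Wh per job")

# M1 Mac Mini: ~28.7 W average; 4750U ThinkPad: ~26.3 W average and ~15% less energy for the same job.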

Remember, TSMC has stated that the 5nm process used for the Apple M1 has a 20% performance-per-watt increase over the 7nm node of the 4750U. So Apple should have used less than two-thirds of the power, and realistically should have used half the power because the Mac Mini has fewer peripherals to power, but instead Apple used more power to complete the same task.

denverag

Yesss. Let's put the entire system on a single PCB so we need to throw everything away when one part goes bad.

lec_R

It sounds a little like going back to the Commodore 64 era, where, for the most part, everything was on the CPU (with some exceptions). It's obviously a big problem for system builders and the general PC market as it will restrict consumer choice - although it doesn't have to be that way. I think the PC market still requires some custom builds and adaptability, so if you could replace the SoC without having to change everything else, that would be a big win. Although I really see it as replacing the entire board/system - so your PC would be little more than an SoC board (which you must fully replace) in a custom case.

From the chip makers' point of view this has nothing to do with power or economy. They want to lock you into an SoC, keep you there, and then just drop support a couple of years later to force you to upgrade and spend more money - this is what it comes down to: restricting choice and forcing upgrades, exactly the same as mobiles and tablets. If you can't afford to upgrade your SoC, you just have to suffer or be cut off when support is dropped, that's it. Along with the fact that you can't replace any individual failed components. No, this is nothing more than rampant capitalism and milking consumers for every last penny.

x86 could be redesigned to be more efficient and to use a reduced instruction set too. There's no need to go to an SoC, aside from $$$!

Rasterizing

Anthony has honestly been my favorite addition to the LTT crew, a pleasure to watch. I'd take a computer class if he taught it.

LongHaulRob

They've been saying this for like 20 years now. First when laptops went mainstream, then with tablets and smartphones, then when the original NUC came out.

Hotrob_J

At 47, I'm used to hearing that the traditional desktop form factor is dead. I don't think so. I'd also be careful with assuming closely coupled system modules (aka MCMs posing as SoCs) are the sole optimization route. That's true for Apple and the Arm universe, as load-store RISC-style ISAs are highly sensitive to memory subsystem latency issues. CPU core-wise, they achieve great efficiency, but flexibility is highly limited, and scaling exotic architectures gets expensive and difficult. But Apple Silicon and the other mobile SoC-style producers are stuck in a "when all you have is a hammer" situation.

Apple's main business is mobile; the Mac business represents ~12% of their revenue, versus mobile devices at 60%, services at 20% and accessories at 8%. The desktop portion of that Mac business is minuscule. Through simple necessity, their desktops are going to follow the patterns established by the rest of the company's hardware designs. That's their business model driving design decisions, but don't assume those same decisions work for Intel, AMD, etc., because they probably don't.

Also, the Mac as a platform has always been defined as a sealed box, no tinkering allowed, especially when Steve Jobs or his acolytes have been in charge of the platform. The expandable "big box" Macs have been the exception, not the rule. The Mac and the PC (defined by its open, slotted-box roots) are two very different platforms philosophically. I don't think you'll see closely coupled system modules replacing big honking discrete GPUs, sockets dedicated to big discrete CPUs, and slotted RAM and PCIe slots for the desktop and workstation workloads that demand "bigger everything" (gaming, workstation, etc.).

IMHO, you'll see more chiplets (more closely coupled) in each of the compute buckets (CPU & GPU) and you'll see a lot more cache, but the basic box with slots isn't going anywhere. Where you will see more SoC-style system module solutions is in the laptop space. However, that's just an extension of a pattern that's existed for a long time; it's just that Intel's iGPUs and interest in closely coupling memory have been limited by their being generally lazy. Keep in mind, the vast majority of all x86 "PCs", in both laptop and desktop form, already (poorly) implement a closely coupled GPU (mostly on-die), memory controller, cache hierarchy, etc.

TL;DR: I doubt the desktop form factor, sockets, slots and all, is going away. This all seemed a bit click-baity.

smakfu

I am sure there will forever be a huge audience for modular PC builds.

ekdavid