Why do COMPUTERS get SLOWER with age?

Dave explores why computers slow down the longer we use them.
Comments

Also, security patches often make things slower.
The variant of the Heartbeat extension that was vulnerable to Heartbleed was _definitely_ faster than the patched variant, because the fix was an extra check.
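For illustration, a minimal sketch (my own toy version, not the actual OpenSSL code) of the kind of length check the fix added:

```python
# Toy illustration of a Heartbleed-style bounds check; not the real OpenSSL code.
def handle_heartbeat(payload: bytes, claimed_length: int) -> bytes:
    """Echo back `claimed_length` bytes of the received payload."""
    # The vulnerable code trusted claimed_length and read past the buffer.
    # The patched code adds this one extra comparison, which costs a little time:
    if claimed_length > len(payload):
        return b""  # drop malformed requests instead of leaking memory
    return payload[:claimed_length]

# A malicious request claims far more data than it actually sent.
print(handle_heartbeat(b"bird", 16384))  # b"" rather than 16 KB of server memory
```

One comparison per request is cheap, but it is exactly the kind of overhead patches keep piling on.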

christopherg

In my opinion, Wirth's law / Andy and Bill's law also plays a major role. When I look at the code my colleagues write, performance seems to be the last thing on their minds, because modern hardware can often compensate for bad code.

Fiyaaaahh

I find even web browsing gradually gets slower as browsers are updated and web technology changes. It really does feel like we are not getting any more for the extra processing required; in fact, my view is that we are getting less functionality.

Error_

OSs are not supposed to be giant octopuses, with every tentacle wrapping around your computer, like accessing the disk every second. Everybody wants to use all the resources to show off with huge graphical interfaces! I remember in the nineties, mouse drivers used to be a few KB in size; now they are tens of megabytes. And don't get me started on all the massive printer installations I've used over the years.
Computers don't get slower, OSs get more rubbish. It's all designed to make you buy more hardware.😃

DaveHoskinsCG

In the Windows XP days, my solution to the problem was simply wiping the hard drive about once every year or two and reinstalling Windows. That always left things fresh and clean. I still do that now with my Windows 10 and 11 machines, and it works like a charm.

Linux I don't have to worry about because, well... Linux.

opticalghost

As computers become more powerful, later software versions get slower and use up the extra power. It's the Parkinson's Law of computers: software expands to consume the available cycles.

ihbarddx

Intel has had some security problems that required patches that have been devastating to performance: Downfall, Meltdown, Rowhammer. AMD has a bunch too. The patches do slow down your computer by a good amount. 30%?
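If you're curious which mitigations your machine is actually running with, on Linux the kernel reports them under /sys; a minimal sketch (assumes a reasonably recent kernel):

```python
# Minimal sketch: print the kernel's CPU vulnerability mitigation status (Linux).
from pathlib import Path

vuln_dir = Path("/sys/devices/system/cpu/vulnerabilities")
for entry in sorted(vuln_dir.iterdir()):
    status = entry.read_text().strip()  # e.g. "Mitigation: PTI" or "Not affected"
    print(f"{entry.name:<24} {status}")
```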

jeffkili

Windows 95/98 suffered from what some folks called "Registry Rot". The more software you installed and uninstalled, the more unstable and slow Windows became. And why does the Windows volume shoot up to 100% every time you install new audio hardware??

KabukeeJo

Disk retries increase over time, mostly due to physical wear of the disk coating or flash cells, but partially due to aging of the circuits that deal with extremely low voltages. NIC retransmits increase over time due to signal quality loss from aging of the power components. Moisture ingress into the PCB will mess up impedance matching, causing echoes of data values to increasingly reflect around, which can cause PCIe retries.

douggale

Don't forget general software bloat. Not that a guy from Microsoft would know anything about that.

harrkev

As someone who lived through the dark days before SSDs, I rarely complain about slow speed. If someone tells you they cried with joy the first time they booted up their machine with an SSD, they're probably not joking. Turning on your computer was usually the time you went to get yourself some coffee or use the restroom. There were debates about whether it was okay to just leave computers running, but honestly the debate didn't matter, because we'd leave them on anyway so we wouldn't have to wait to turn them back on again.

MrBrassporkchop

Too many comments to check if someone mentioned it, but some reasons for a gradual decrease in performance are ever-growing log files that don't get rotated, software updates that add new features and thus take longer to load, and in some cases disk fragmentation. In most cases the issue is increased disk I/O, as that is noticeable, but installing more and more programs over time that keep background processes running can also slow the computer enough for a sluggish user experience. It's quite difficult to reach 100% CPU load that way, though, so it's usually virus scans, search indexing, and poorly written software that end up maxing out the CPU.
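If you want to see what's quietly eating resources, a rough sketch like this works (it assumes the third-party psutil package is installed):

```python
# Rough sketch: list the ten processes using the most resident memory.
# Assumes the third-party psutil package is installed (pip install psutil).
import psutil

procs = []
for p in psutil.process_iter():
    try:
        procs.append((p.memory_info().rss, p.name()))
    except (psutil.NoSuchProcess, psutil.AccessDenied, psutil.ZombieProcess):
        continue  # process vanished or is off-limits; skip it

for rss, name in sorted(procs, reverse=True)[:10]:
    print(f"{rss / 2**20:8.1f} MiB  {name}")
```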

VeniceInventors

You should look into issues with SSDs too, with multi-level storage of bits: when the drive is fairly empty, the storage uses one bit per 'storage cell' and can read and write quickly, but once the drive reaches about half capacity, it starts storing multiple bits in the same cell (not a simple on/off, but varying voltages corresponding to four different binary values), and that makes reading and writing those bits slower. (A recent episode of Security Now with Steve Gibson dives into this.)
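The basic arithmetic behind that (just the level counts, not measured timings):

```python
# Illustration: voltage levels a flash cell must distinguish per bit stored.
# More levels means tighter margins, so slower programming and more read retries.
for name, bits in [("SLC", 1), ("MLC", 2), ("TLC", 3), ("QLC", 4)]:
    print(f"{name}: {bits} bit(s)/cell -> {2 ** bits} voltage levels")
```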

We used to have similar issues with spinning disks getting slower over time: sectors failing and needing multiple read/write attempts (below the OS level), as well as fragmentation leading to slower overall performance.

There are actually quite a few ways in which the hardware itself can lose performance. Another: you could have RAM that fails, and instead of having all the memory available, the PC might only 'see' some of it and have to cope with less. Yes, usually this would be detected, but RAM can fail in ways where the PC simply doesn't see it plugged in at all, so it fails silently as far as the user is concerned.

sputukgmail

This is a very simplified explanation.

Honestly, we all know that before NTFS became the standard, a defrag could make a huge difference, not least on the HDDs of the time, as opposed to the SSDs that are the norm today.

That's the hardware side... software side is also guilty.

I do microcontroller programming, and today's SoCs have vast amounts of resources compared to my first PC, an Amiga 500.

Having an environment where you don't need to count memory in KB and clocks in kHz makes a programmer lazy.

Those who still do the counting are usually hobbyists who challenge themselves. Back in the day, a programmer who could save one out of every three bytes of memory by doing clever bitwise operations was king on a system with 256 KB of RAM.
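A trivial example of that mindset (my own, nothing from the video): packing several flags into one byte with bitwise operations instead of using separate variables:

```python
# Toy example: pack several boolean flags into a single byte.
FLAG_POWER, FLAG_ERROR, FLAG_READY = 0x01, 0x02, 0x04  # one bit each

status = 0
status |= FLAG_POWER           # set a flag
status |= FLAG_READY
status &= ~FLAG_READY & 0xFF   # clear a flag again
print(bool(status & FLAG_POWER), bool(status & FLAG_ERROR))  # True False
```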

Today it's more like "what does 100 MB matter if we can save 30% on development time and 50% on maintaining the code?"

This slack approach to memory, storage and CPU cycles does have an impact on whole systems over time.

Remember when it was possible to squeeze an emergency/recovery Linux onto a single 1.44 MB floppy, so you could try to repair a broken partition table or remove a faulty entry in the fstab file? The kernel alone can't fit on a floppy today.

BenjaminVestergaard

I wrote some ATA and SCSI drivers. Sector read performance becomes terrible once the firmware starts using spare sectors.

Also, I've noticed tons of lazy programmers don't clean up temp files when they are done, which slows things down. I wish Windows had a "cleared on boot" temp folder I could use.
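In the meantime, the language can clean up after you; a generic Python sketch (not tied to any particular OS feature):

```python
# Sketch: a scratch directory that is deleted automatically when you're done,
# instead of leaving stale files behind in the system temp folder.
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory(prefix="myapp_") as scratch:
    work_file = Path(scratch) / "intermediate.dat"
    work_file.write_bytes(b"\x00" * 1024)  # ...do the real work here...
# Leaving the `with` block removes the directory and everything inside it.
```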

Username

Also, your device might be underclocking to stretch battery life (Apple...). Updates and software are tested on the most modern components, and they do not typically optimize for older systems (developers have to balance their time against the computing resources of client devices). Some components (like hard disks, as another commenter explained) do wear out, and working around those issues can also slow the computer down. The biggest issue, though, is that software isn't made targeting your device.

bobthemagicmoose

If you need episode ideas, I'm curious how BASIC is turned into machine code, or just "stuff that works". It's related to the early computers you have; it all seems like magic how such a small machine runs it. When the memory is measured in kilobytes, how do you interpret it into useful code?

notmarhellnem

There is a psychological aspect to it, though. Using an old PC back when it was new felt fast, but as tech got better it felt slower and slower. We used to turn PCs on and leave the room while they started up, or click a game and then leave the room, and those were PCs we thought were fast at the time. Today you would never put up with that, so it now feels crazy slow.

theendofit

A problem pointed out by IBM some decades ago: OS/2 initially spread files around on disk, leaving gaps around each file, so as old files expanded they could grow into the empty space nearby. In Windows, files were allocated adjacent to each other. It's not clear to me that defragmenting the disk was good: you got a big lump of fully allocated space and then a big lump of unallocated space, and when an existing file expanded, the next free space was quite some distance away.
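A toy model of that difference (my own illustration, not IBM's or Microsoft's actual allocator):

```python
# Toy model: grow file "A" by one block under two allocation layouts.
def grow(layout, name, extra=1):
    """Give `name` its next block(s) in the first free slots found."""
    for i, owner in enumerate(layout):
        if owner is None and extra > 0:
            layout[i], extra = name, extra - 1
    return layout

packed = ["A", "A", "B", "B", "C", "C", None, None]  # files packed tightly
gapped = ["A", "A", None, "B", "B", None, "C", "C"]  # gaps left after each file

print(grow(packed, "A"))  # A's new block lands after C: the file is fragmented
print(grow(gapped, "A"))  # A's new block fills the gap right next to A
```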

Not a problem with SSDs, I expect.

I don't think software updates, new features, or even new software caused much slowdown. Firefox (and probably Thunderbird) long had memory leaks that my use pattern exacerbated, but the Mozilla crowd wasn't very interested. That could bring a Pentium III to its knees.

oneeyedphotographer

There's nothing like a fresh Windows install every now and then. It's an easy job when the libraries with documents, videos, pictures, etc. are stored on a drive other than C:\.

UmVtCg