Slumping Computer Sales: I Guess I’m Part of the Problem

A look at this sales chart on Statista.com makes it clear that computer sales have slid since I last upgraded my desktop PC back in 2013. The uptick in sales this year is probably due to AMD becoming competitive in the CPU industry again, as they have dramatically increased the number of cores/threads per dollar, especially in the mainstream desktop and laptop market. I mean, there’s very little the average home user can’t do with a $100 USD AMD Ryzen 3 2200G APU. Indeed, for everything except RAM, Nvidia graphics cards, and many Intel CPUs, value for the dollar has gone through the roof. 500GB of storage on a solid state drive for less than $200 CAD? Crazy! 1TB desktop hard drives for $50? How I cry thinking of the $250 I once paid for a 20GB hard drive… (that works out to roughly $12.50 per gigabyte then versus about five cents per gigabyte now).

Laptops also break, and people tend to buy less powerful laptops than they end up needing down the road, both of which help drive the sales in this chart. Looking at the chart, we can see that laptops outsell desktops 2:1, which is an eye-opening divide considering that these sales figures include office PCs.

All things considered, it’s not a bad time to buy a new computer.

So as a “semi-nerdly” computer enthusiast, why haven’t I? Obviously, the whole “I’m a grown-up with responsibilities” and “I’m not going to go into debt to do it” factors are at play, but the major underlying reason I haven’t upgraded ye ol’ desktop is because… I don’t need to.

When I look at everything I actually do with my desktop, it turns out it does all that stuff just peachy keen. And when it comes to things I would like it to do, “MAKE BIGGER PICTURE!” was about all I felt it needed. Having already replaced my aging keyboard, my broken mouse, and my disk drives as part of regular maintenance, and having picked up 4GB more RAM when it was on sale, I felt the smallish screen was about the only aspect of my desktop that needed attention. Now that my lovely wife has given me a 24″ 1080p IPS monitor for our 13th anniversary, I can thank my old 20″ 900p Samsung monitor for its decade of excellent service and gleefully frolic in the land of “full HD” for years to come.

Tangent: I tried a 27″ 1080p screen, but it was strangely “too large”. I think 27″ would have been fine at 1440p, but upgrading to a 1440p monitor would also mean either buying an Nvidia GTX 1070-class graphics card or running games below the native 1440p (which tends to make user interfaces in games blurry). Honestly, I would rather go the other way: downgrade my GPU to one that doesn’t require a fan, or to an APU, so the “graphics card” can share a giant, quiet aftermarket cooler with the CPU. I value silence more than frame rates in the few games I still play.

Update: I returned the monitor today. At the risk of sounding old, “they just don’t make them like they used to”. From my research after noticing the problem (the instant the desktop loaded), it turns out the panel of the 24″ Acer SA240Y was manufactured (by LG) with red and blue sub-pixels that are thinner than the green sub-pixels, which causes visible vertical lines in all shades of orange, yellow, and blue. Not only was this distracting, but as a content creator it made it impossible to correctly judge the colours I was creating. Can’t have that! Ah well, maybe I’ll use this perfectly good-looking 20″ Samsung Syncmaster 2033sw for another decade. Honestly, I just wish it were 1080p and a little bigger…
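If you want to check a panel for this kind of sub-pixel quirk before the return window closes, one quick way is to fill the screen with solid swatches of the problem shades and look closely for vertical lines. Here’s a minimal Python sketch of that idea (it assumes Pillow is installed, and the exact colour values are just illustrative picks for orange, yellow, and blue):

# Sub-pixel sanity check: show full-screen solid colours and eyeball
# each one for vertical striping. Assumes Pillow (pip install Pillow).
from PIL import Image

test_colours = {
    "orange": (255, 128, 0),
    "yellow": (255, 255, 0),
    "blue": (0, 0, 255),
}

for name, rgb in test_colours.items():
    # 1920x1080 matches the monitor in question; use your native resolution.
    img = Image.new("RGB", (1920, 1080), rgb)
    img.show(title=name)  # opens in the default image viewer; maximize it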

“If it ain’t broke, don’t fix it.”

I think that’s just the plain truth of the matter for a lot of people. Most people have a smartphone that lets them accomplish a significant amount of their computing tasks. A good number of homes already have a computer that is less than ten years old, which is good enough for pretty well all productivity tasks as well as most games, even the latest games at low detail settings. There just hasn’t been enough growth in the software industry to necessitate a more powerful computer in every home than what’s already there. Is faster better? Sure, provided that “faster” is being achieved by simply throwing more hardware and electricity at the issue (as we see with graphics cards and high-end desktop CPUs). But how much of a practical improvement does that “faster” actually make? Is it worth the effort and the money? For me, it’s not, because my old machine is still roughly on par with a new entry-level gaming desktop.

Happily coasting through the digital cosmos with my ancient PC,

[Photo: my semi-nerdly hovel…]

For some much needed context, here are the specs of my desktop and a list of things I use it for…

System:
CPU: AMD FX-8320 @ 4GHz
GPU: AMD R9 270 2GB GDDR5
RAM: 12GB DDR3 1600MHz
Motherboard: ASUS M5A97 R2.0
SSD1: 120GB SanDisk SATA (Linux, Devuan 2.0)
SSD2: 240GB SK Hynix SATA (Windows 10 Home)
HD1: 500GB Western Digital (Windows 7 Home)
HD2: 1TB Western Digital (Linux Storage)
CD/DVD: Samsung DVD-RW
A/V Input: KWorld PCI TV Tuner card
Power Supply: NZXT 650W
Monitor: Acer 24″ IPS LCD
Keyboard: Razer Blackwidow Ultimate 2016
Mouse: Logitech M510
Case: Heavily modified AT server tower

Purposes:

  • Boring computer stuff, like reading, web browsing, word processing, spreadsheets, media playback, etc.
  • Playing games like Guild Wars 2, Torchlight II, Elite Dangerous, Star Wars Galaxies, Banished, WoW Mania, AstroMenace, Alien Arena, SuperTux2, SuperTuxKart, Frogatto…
  • Programming games using C/C++, Lua, JavaScript, Python, virtual machines (VirtualBox), Blender, Tiled…
  • Creating and editing raster graphics using GIMP…
  • Making songs and sound effects using SunVox, Audacity, ReBirth, and my sound board / TV tuner setup for recording guitar, etc…
  • Managing our family’s picture and video archive…
  • File and software management, zipping/unzipping/installing stuff…
  • Some basic video editing (not really my cup of tea)…

Of the things I do regularly, about the only noticeably poor experiences are my frame rate tanking in Guild Wars 2 when the local area is crowded with other characters (which happens even on better machines) and the slow boot of Windows 7 from the hard drive it is activated on. Transcoding/encoding video and applying filters to very large images in GIMP are also slower than I’d like, but I do those things so infrequently that it doesn’t matter. I’d have to buy a $210 CAD CPU (plus new RAM and motherboard) to see a real improvement in the editing, and a $350 CAD GPU to improve the performance of 3D games, but I don’t feel I need to do either. The downsides of this computer simply don’t bother me enough to make me feel like upgrading.

Sure, I have spent countless hours poring over tech websites and online shops, looking at ways to upgrade my desktop, but the reality is that I don’t need to. Yes, the large core counts and excellent performance for the dollar of the AMD Ryzen line (especially the R5 2600 CPU and R5 2400G APU) are tempting, but it’s money I don’t need to spend, because ultimately I don’t need the extra performance either. The breakneck pace of computer hardware and software growth of the 70s, 80s, and 90s is over. Today we live in a time of “samey” software and incrementally improved hardware that does little to entice people to upgrade their existing systems.

I used to think that one day something would come along that my computer couldn’t do and that I just couldn’t live without, because that’s how it always used to be. However, I am starting to think my next big “computer purchase” will end up being a gaming console like the Wii, with all its crazy family-exercise accessories. Wait a minute… we sold our Wii because we were only using it for Netflix… OK, ya got me: I don’t know if I’ll ever feel the need to buy a computer better than the one I already have!