Now! Slower than 1GHz!
2014 was the year the computer world began running backwards.
Those of us who grew up with personal computers in the 1980s and 1990s experienced ever-increasing speeds. Some of us remember desktops with TurboBoost! buttons that sped up the PC from 4.77MHz to 10MHz. (The reason for the button: some software was written with 4.77MHz in mind, and would not work properly at other speeds.) My policy was to buy a new desktop computer anytime a newer CPU was at least 3x faster.
The last time I bought a new desktop was quite a few years ago. Intel can't get its chips to run faster than just over 4GHz. The silver lining: no need to buy a new computer any more!
To make up for Intel's failure to meet our expectations, ReadyBoost helped speed things up on Windows computers, as does replacing hard drives with solid-state drives on any computer. Ha: for the price of a new computer, I could put in a memory-based drive.
Last fall, I needed to replace my Mac mini. (Short version of the story: I tried replacing the hard drive with an SSD, and thoroughly screwed up the old Mac mini.) Since I need a Mac only occasionally, like running a CAD package for a few screen grabs, I got me the cheapest one.
This latest model has a multi-core i5 CPU running at around 1.4GHz and Turbo Boost up to 2.7GHz, supposedly. (Heh: the TurboBoost button turns into software.) Decent specs, I'd've thot. But slow! It feels like it has a 486 running in the hundreds of MHz. No sense of Turbo Boost ever kicking in. Starting OS X and then a CAD program is like the old days, when engineers went on a coffee break to wait out the launch sequence.
Here's the thing. I bought the cheapest one, which also happens to be the Mac mini model promoted by the big box stores in their weekly flyers. You know, "Give Mom the gift she loves: a new computer." So this painful experience is what Apple expects new users to suffer through? I guess it makes sense to somebody at One Infinite Loop.
It could be worse, I suppose, but a baulking computer is one you don't want to use. Remarkably, Apple has introduced even slower computers: new MacBooks that run slower than 1GHz. If the Mac mini is nearly unusable at 50% faster than this, how bad must the single-ported MacBook be, in terms of UX?
Why Laptops Are Being Made to Slow Down
The dirty secret is this: computer makers are making laptops slower to juice battery life. High CPU speeds drain the battery. For the marketing department to land their claims of "all day battery life," the engineering department is being forced to slow things down.
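Here is the rough arithmetic behind that, as I understand it: the dynamic power a chip burns scales roughly with capacitance x voltage-squared x frequency, and a lower clock also permits a lower supply voltage, so slowing down pays off twice. A back-of-envelope sketch in Python, with the voltage and frequency figures made up purely for illustration:

# Rough illustration: CMOS dynamic power scales roughly as C * V^2 * f.
# The voltage and frequency figures are invented for illustration only.
def dynamic_power(capacitance, voltage, frequency_ghz):
    return capacitance * voltage ** 2 * frequency_ghz

full_speed = dynamic_power(capacitance=1.0, voltage=1.2, frequency_ghz=2.7)
slowed = dynamic_power(capacitance=1.0, voltage=0.9, frequency_ghz=1.4)
print(f"Relative power at 2.7GHz, 1.2V: {full_speed:.2f}")
print(f"Relative power at 1.4GHz, 0.9V: {slowed:.2f}")
print(f"Slowed-down chip draws about {slowed / full_speed:.0%} of the power")

By these made-up numbers, halving the clock cuts power draw to well under a third; that is the trade the marketing department is buying its all-day battery life with.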
Battery life has stalled, just as CPU speeds stalled. We can't compute longer, and we can't compute faster. There is no need for consumers to buy new computers. Hence, apocalypse nearing.
That's interesting. I heard that in 2014, sales of iPads fell while sales of desktops and laptops increased, specifically at Apple:
http://time.com/3532882/people-arent-buying-tablets/
Posted by: John Burrill | May 23, 2015 at 09:05 AM
While I would agree that the speed-up has slowed down, it is certainly not going backwards. For mobile devices battery life is important, but the IPC of the new chips has improved with every generation; GHz figures don't really compare to each other if you don't account for the actual performance per clock tick.
Skylake will add another 15% at the same clock speeds.
Posted by: Ym | Jun 08, 2015 at 01:42 PM
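Ym's point about IPC versus clock speed can be put into rough numbers: effective throughput is approximately clock speed times instructions retired per clock, so a lower-clocked new chip can out-run a higher-clocked old one. A minimal sketch in Python, with the clock and IPC figures invented purely for illustration:

# Very rough per-core throughput: clock speed (GHz) x instructions per clock.
# Both sets of figures below are invented for illustration, not measured.
def throughput_gips(clock_ghz, instructions_per_clock):
    return clock_ghz * instructions_per_clock

old_chip = throughput_gips(clock_ghz=3.5, instructions_per_clock=1.0)
new_chip = throughput_gips(clock_ghz=2.8, instructions_per_clock=1.4)
print(f"Old chip at 3.5GHz: ~{old_chip:.1f} billion instructions/s")
print(f"New chip at 2.8GHz: ~{new_chip:.1f} billion instructions/s")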
There are many reasons why speeds are going down but one of them is certainly the end of Moore's Law, which, effectively, has been dead and gone for years.
Moore's Law is about the doubling of the number of transistors in a processor every 18 months or so (the original version was doubling every 12 months). In years past, in a world of desktop machines, that extra capacity was primarily used to cover the many sins of sloppy software through faster machines. Don't worry about pigging out that code, because the machines will be twice as fast in a year.
But doubling the number of transistors also means the same number of transistors can be put onto a smaller and smaller die over time, and thus consume less and less power while operating faster and faster. When Moore's Law ended, so did that automatic reduction in power consumption (or, for the same power, the automatic increase in speed).
What you see today is just the fun beginning, for three reasons. First, programming today has gotten sloppier and piggier than ever before, as script kiddies turn to massively inefficient ways of writing code. Second, today's billions of abjectly clueless users tend to hand over their devices to whatever software Google, or its "carefully selected" cast of millions of advertising carpetbaggers, wants to load onto their smartphones; that means bloatware, malware, and endless other programs will be eating alive whatever processing capacity there is.
The third reason is that with Moore's Law dead the only practical way of keeping speed up is through parallel processing, and writing real parallel code is very difficult. Simply dispatching four different tasks to four different cores ain't writing parallel code, it's just a different way to time-share existing resources in a non-parallel way. Automatically chopping up a single program to run simultaneously on four cores and then re-assembling the results at the end is parallelism, and that's way more difficult.
Posted by: Dimitri Rotow | Jun 10, 2015 at 07:25 AM
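Dimitri's distinction between dispatching separate tasks and truly parallelizing one job can be sketched in a few lines of Python: the example below chops a single computation into chunks, runs the chunks simultaneously on separate cores, and re-assembles the partial results at the end. The workload and chunk count are made up for illustration.

# Sketch: split one job into chunks, run the chunks on separate cores,
# then re-assemble the partial results -- as opposed to merely handing
# four unrelated tasks to four cores.
from concurrent.futures import ProcessPoolExecutor

def partial_sum_of_squares(chunk):
    # Work done on one chunk; a toy stand-in for a real computation.
    return sum(n * n for n in chunk)

def parallel_sum_of_squares(numbers, workers=4):
    # Chop the single job into roughly equal chunks, one per worker.
    size = max(1, len(numbers) // workers)
    chunks = [numbers[i:i + size] for i in range(0, len(numbers), size)]
    # Run the chunks simultaneously on separate cores.
    with ProcessPoolExecutor(max_workers=workers) as pool:
        partials = list(pool.map(partial_sum_of_squares, chunks))
    # Re-assemble the partial results at the end.
    return sum(partials)

if __name__ == "__main__":
    print(parallel_sum_of_squares(list(range(1_000_000))))

Even in this toy case, the splitting, the worker management, and the re-assembly are extra code that the sequential version never needs, which is Dimitri's point about why real parallel code is hard.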
Anyone who thinks otherwise is deluded.
My notebook, bought in winter 2011, is still more powerful than anything easily available in local stores today. For the same money I spent back then, at most I get a 50% bump in performance, even when buying online. Whereas the jump in performance when upgrading my 2007 device was 200-300%.
Progress has slowed to a crawl if what you are looking for is raw processor power. The same goes for desktop processors; it's not worth upgrading. You get less bang for your buck now than three years ago. The retail stores are full of underpowered, low-energy devices, whereas a few years ago they were full of higher-performing, less efficient devices.
However, the same cannot be said for GPUs and mobile devices. They have come on in leaps and bounds.
Posted by: Daniel | Sep 02, 2015 at 01:53 PM