These are the facts:
1. CPU speeds are stalled.
2. Vista will run software slower.
4. The current workaround for speeding up most software is multi-core CPUs.
4. Most software cannot be re-written for multi-core.
5. Even if it can, memory access will probably stall the software.
6. Going to 64-bit CPUs only helps software that works with extremely large datasets.
7. Going to high-speed GPUs (graphics boards) only works with graphics-intensive software.
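Facts 4 and 5 have a well-known quantitative form: Amdahl's law caps the speedup extra cores can deliver by whatever fraction of a program stays serial. A minimal sketch (the 50/50 workload split is an invented assumption, not a measurement of any real program):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: overall speedup when only part of a
    program's work can be spread across multiple cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Hypothetical program where half the work is inherently serial:
# even an unlimited number of cores can never make it more than 2x faster.
print(round(amdahl_speedup(0.5, 4), 2))     # 1.6
print(round(amdahl_speedup(0.5, 1000), 2))  # 2.0
```

This is why "just rewrite it for multi-core" helps far less than the core count suggests: the serial fraction, not the hardware, sets the ceiling.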
Read it and weep: What's the story with Photoshop & multi-core? by John Nack of Adobe.
In Why Software Will Run Faster I explain why I think your 7 facts are correct, but the conclusion isn't.
Posted by: Philippe Guglielmetti | Dec 28, 2006 at 01:30 PM
You forgot one thing as to why software won't run faster anytime soon:
Dumb software design. Particularly in regards to new whiz-bang features of questionable value.
Case in point: Autodesk's insistence to use plain-text XML to store data when it probably should not. For example, with AutoCAD's CUI files, which are orders of magnitude larger than the tiny, fast, compiled binary MNC files which they replaced.
This leads to doggedly slow performance in the program and CUI editor as it has to load and parse these multi-megabyte files. Worse, the schema design promotes bloat, as it is simple to add massive amounts of data to a CUI (by simply dragging a toolbar from one CUI to the next) but there is no "purge unused command" facility.
XML is also used in large doses in the other parts of the UI, particularly Tool Palettes. They are very slow to respond, not to mention how easy they are to break.
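Matt's point about plain-text XML versus compact binary formats is easy to demonstrate in miniature. A hedged sketch (the command records and fixed field widths are invented for illustration; this is not AutoCAD's actual CUI or MNC layout):

```python
import struct
import xml.etree.ElementTree as ET

# Invented stand-ins for menu command records.
commands = [("cmd%d" % i, "^C^C_line") for i in range(1000)]

# Plain-text XML representation, one element per command.
root = ET.Element("Commands")
for name, macro in commands:
    ET.SubElement(root, "Command", Name=name, Macro=macro)
xml_bytes = ET.tostring(root)

# Fixed-width binary records: 16-byte name field + 16-byte macro field.
binary = b"".join(
    struct.pack("16s16s", n.encode(), m.encode()) for n, m in commands
)

print(len(xml_bytes), len(binary))  # the XML is noticeably larger
```

And that is before parsing cost: the binary records can be read with a single `struct.unpack` per record, while the XML must be tokenized, validated, and built into a tree on every load.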
If we can solve the dumb design aspect of how software is written, focusing more on performance and tuning rather than the prevalent "It's new so it must be good" mentality, I think we would be a lot better off.
Posted by: Matt Stachoni | Dec 28, 2006 at 11:15 PM
I have to laugh, but please note, this is from a 'dumb user' point of view, not a technical one: Last week, I was using one of the old, nasty Pentium 6 Windows machines in my office (with a 20 GB hard drive, please note) and clicked on Photoshop 5.5. Imagine my surprise when Photoshop appeared literally within 5 seconds. I was cropping images and saving them to the server within seconds.
It was massively fast, even compared to Photoshop CS2 on a pretty nifty (Intel) Mac with 2 GB of RAM.
Then I called up Acrobat 5.x, which is still on this system. Again, it appeared in no time at all (compared to the 30-or-so seconds required for Acrobat 6).
Staff at the office ask why I don't throw the old machine out. After all, the hard disk is essentially full. There are no spare resources on this thing! But I actually find it fun to use the old machine, with old software, that launches at my command. It's a huge novelty.
But is that too much to ask? Why put up with the software bloat, when all we really want is to have applications at our fingertips at every turn? When you see how fast the old applications are (on old and admittedly nasty machines), you have to wonder whether the developers are taking us for a ride.
rach xx
Posted by: Ratchbrat | Jan 03, 2007 at 12:06 AM
Rach xx may have picked a bogus example above. Comparing that speedy old Pentium 6 machine with Photoshop CS2 on a speedy new Intel Mac is extremely misleading; it gives a completely false impression.
CS2 is written only for PowerPC-based Macs, so on Intel Macs it runs under Rosetta (Apple's PowerPC emulator) at about 85% of native speed at most.
That's why the CS3 beta is such a big deal to Intel Mac owners. Anyway, yes older machines running older software are often surprisingly speedy. Mac OS 9 on some of the newest hardware is incredibly quick outside of the OS X environment.
Posted by: Anthony Frausto-Robledo | Jan 03, 2007 at 03:09 PM
Actually, there is an 8th factor in computer speed, and it is probably one of the most significant: hard drive speed.
Both Windows itself and most current applications seem to be making more and more use of the hard drive, presumably for things like memory swapping and Undo history. Yes, CPU speeds have stalled, but hard drive speeds have stalled even more, due to the basic laws of physics governing their moving mechanical parts. I would guess that doubling CPU speed might only give a 10-15% overall performance improvement.
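Bill's hunch is easy to probe: one crude way to see how slow the disk is relative to everything else is to time a large sequential read. A rough sketch (the file size and scratch path are arbitrary choices, and the OS file cache makes the result optimistic; real figures vary wildly by machine):

```python
import os
import tempfile
import time

# Write a 16 MB scratch file, then time reading it back in one pass.
size = 16 * 1024 * 1024
path = os.path.join(tempfile.gettempdir(), "disk_probe.bin")
with open(path, "wb") as f:
    f.write(os.urandom(size))

start = time.perf_counter()
with open(path, "rb") as f:
    data = f.read()
elapsed = time.perf_counter() - start
os.remove(path)

mb_per_s = (size / (1024 * 1024)) / elapsed
print("sequential read: %.0f MB/s" % mb_per_s)
```

Even this flatters the disk: random access, the pattern swapping and Undo files actually produce, is slower still because every seek pays the mechanical head-movement cost.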
A while ago I ran a copy of the speed analysis program from a very old copy of the Norton Utilities on my current dual-core Pentium machine. I don't know how much faster the CPU performance was, because it literally blew off the top of the scale.
The hard drive, on the other hand, was only about twice as fast as the rating standard, which was a 20 MB Seagate (note "M", not "G").
On the other, other hand, I also have a Toshiba T1000 laptop. It is a 4.77 MHz (note "M", not "G") XT-class machine, with 512 KB of RAM and a 768 KB (note "K", not "M") battery-sustained virtual RAM drive.
DOS 3.1 is burned onto a ROM chip, so it doesn't need to load from a drive. I can go from power on to running WordPerfect 5.0 in under 5 seconds (note "seconds", not "minutes").
Posted by: Bill Fane | Jan 04, 2007 at 01:29 PM
Bill:
I agree that hard drive speed is a factor. The change was noticeable when I installed 10,000 RPM drives a couple of years ago.
In contrast, notebook hard drives sometimes run as slow as 3,500 RPM -- too slow to edit video and keep the audio synched.
I installed a RAM drive on my first computer, a diskette-drive Victor 9000, from which I ran WordPerfect's spelling and thesaurus checkers -- no waiting, and that was 20 years ago! (Of course, the 1 MB of RAM for the RAM drive cost me over $500.)
Posted by: ralphg | Jan 04, 2007 at 02:21 PM
Well, someone pointed out that the link above isn't valid anymore. Why Software Will Run Faster is now here.
Posted by: Philippe Guglielmetti | Apr 03, 2008 at 11:16 PM
The "next step" seems to be web apps - that'll set you back a decade or so ;)
I often see a pattern I call "tick box programming": you may be able to perform a certain operation with a given piece of software, but it may not be anywhere near the smartest way. I believe performance comes behind usability. The fundamental question is: "what sells?"
Have you ever seen a benchmark comparing different programs' performance at different numbers of cores? If not, how about comparing AutoCAD and the latest IntelliCAD clones?
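The kind of benchmark Henrik asks about can be sketched with Python's multiprocessing module: run the same CPU-bound job with different pool sizes and compare wall-clock times. (The workload below is a toy stand-in; benchmarking AutoCAD or IntelliCAD would require driving them through their own scripting interfaces.)

```python
import time
from multiprocessing import Pool

def busy(n: int) -> int:
    """Toy CPU-bound task: sum of squares below n."""
    return sum(i * i for i in range(n))

def run(workers: int, jobs: int = 8, n: int = 200_000) -> float:
    """Time the same batch of jobs with a given pool size."""
    start = time.perf_counter()
    with Pool(workers) as pool:
        pool.map(busy, [n] * jobs)
    return time.perf_counter() - start

if __name__ == "__main__":
    for workers in (1, 2, 4):
        print("%d worker(s): %.2fs" % (workers, run(workers)))
```

If the timings barely improve from 1 worker to 4, you are looking at a mostly serial program -- exactly the situation facts 4 and 5 in the original post describe.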
Posted by: Henrik Vallgren | Apr 07, 2008 at 05:30 AM