Gambling On Multimedia
My two-part blog post late last month on the Spring Intel Developer Forum gave lots of details on Intel's upcoming multi-core processor plans. What it didn't provide, though, was my swag on what I saw, at least to the degree I'd planned. Back then I said, "I'm frankly skeptical of the widespread impact of multi-core, at least in the near term. Why requires a blog post all its own; stay tuned." Well, here's that post.
I figured that with review systems based on the Pentium Extreme Edition 840 and 2.8 GHz Pentium D beginning to show up on folks' doorsteps (and with mine set to arrive sometime this week), it was time to prioritize this particular soapbox rant. AnandTech (see here for part II) and ExtremeTech's reviews are particularly solid, as usual. Unfortunately for both Intel and AMD (which has aggressive multi-core aspirations of its own), they confirm the gut feeling I had when I sat in Moscone Center in early March and heard the IDF pitches for the first time.
I remember way back in early January 1997, when Intel formally unveiled its first Pentium processors with MMX instructions. I'd just joined EDN, and I'd heard lots of internal buzz about MMX the prior year or so at Intel, where I previously worked. Intel's demos at that year's debut IDF and other public events were full of multimedia-centric applications which harnessed the instruction set enhancements and corresponding hardware acceleration function blocks within the CPU: audio, still image and video encoding and decoding (remember, this was back in the time when MP3 playback could bring a PC to its knees), voice recognition, games, and the like.
MMX2 ... SSE (Streaming SIMD Extensions) ... SSE2 ... SSE3 ... each time a new instruction set tweak was unveiled, Intel would trot out the same applications as justification, and as the processors grew more powerful, the demos attempting to rationalize the need for the generational jumps got ever more bizarre and improbable. This wasn't, by the way, an Intel-only phenomenon; I've seen plenty of head-scratching AMD 3DNow! demos, too.
Now we have dual-core CPUs and, as with the Hyper-Threaded processors before them, Intel's giving us lots of glitzy exhibits. Once again, they're heavy on multimedia. Some of them portray heavy single-user multi-tasking environments. Others envision simultaneous access to the PC by a person sitting in front of the keyboard and by networked PCs and other gear across the LAN and WAN. Both are realistic scenarios for servers. Neither is realistic for the average business or home client system. And I can't even begin to explain Craig Barrett's bizarre demo at January's CES (replicated at March's spring IDF), involving a motion-activated display as the sole system interface. It looked kludgy, and it worked even worse. Who does he think he is, Captain Kirk?
Quick show of hands: how many of you regularly play Doom 3 while simultaneously recording a high-quality version of an NTSC television broadcast and doing background antivirus, spam and spyware scans? Thought so. None of this should be a big surprise if you consider how quiet Intel's been about Hyper-Threading in its consumer-focused promotion activity these past few years, especially given that it's an area where AMD's processors don't currently compete, and in comparison to the heavy hype that Centrino has received.
Here's another clue: go into the BIOS setup menus of your systems, and many of you will find an option to disable Hyper-Threading. For the traditional single-threaded applications that currently dominate the market, multi-thread-capable hardware can actually decrease overall system performance. This is especially true if it runs at a clock speed deficit compared to a single-core equivalent (as current dual-core CPUs do, in order to reduce power consumption and heat dissipation).
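The arithmetic behind that clock deficit is worth spelling out. The sketch below isn't from the post itself; it applies Amdahl's law, scaled by relative clock speed, with illustrative frequencies loosely matching 2005-era parts (a 2.8 GHz dual-core versus a 3.6 GHz single-core). The point: a fully single-threaded workload sees only the slowdown, while a heavily threaded one comes out ahead despite it.

```python
def effective_speedup(parallel_fraction, cores, clock_ratio):
    """Amdahl's-law speedup, scaled by relative clock speed.

    parallel_fraction: share of the workload that can use extra cores (0..1)
    cores: number of cores available
    clock_ratio: multi-core clock / single-core clock (< 1 at a clock deficit)
    """
    serial_fraction = 1.0 - parallel_fraction
    return clock_ratio / (serial_fraction + parallel_fraction / cores)

# Hypothetical but era-plausible clocks: dual-core at 2.8 GHz,
# single-core at 3.6 GHz.
clock_ratio = 2.8 / 3.6

# A fully single-threaded app gets nothing from the second core,
# so it only feels the clock deficit: a net slowdown (~0.78x).
print(effective_speedup(0.0, 2, clock_ratio))

# A heavily threaded app (say, 90% parallelizable) wins anyway (~1.41x).
print(effective_speedup(0.9, 2, clock_ratio))
```

Run the two cases and the asymmetry is obvious: until mainstream software actually threads, the second core is paid for in clock speed that single-threaded code never gets back.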
Games are fundamentally not multi-thread friendly, as another recent AnandTech writeup containing an interview with Tim Sweeney at Epic Games reminds us. Ironically, as the earlier-mentioned ExtremeTech review points out, the DivX encoder isn't (currently) multi-threaded, either. After digital audio ripping (note: nowadays, pulling the data off the audio CD is the performance bottleneck, not encoding it) and playback, ripping DVDs to the DivX format might be the next most common multimedia application!
My issue isn't with the enhancements that AMD and Intel are bringing to the table in and of themselves. Hardware capabilities have always preceded software demand, something that the graphics folks also know well. But I'm concerned by the ever-increasing percentage of overall silicon die area (and therefore die cost) devoted to these features, as well as the increasing power consumed by all those transistors, as compared to the return-on-investment for the average computer user.
What do you think, Brainheads, am I being overly pessimistic? Or are AMD and Intel being overly optimistic? Is pervasive use of multimedia and, more generally, multi-threading just a slow-to-arrive but inevitable outcome, or does their allure represent AMD and Intel's version of Homer's Sirens? There is, after all, that saying about human beings being as a rule notoriously over-optimistic in the short term and over-pessimistic in the long term. As always, I welcome your thoughts.