Perceived value: ally or antagonist?
By Brian Dipert, Technical Editor - September 20, 2001
At some point in your career, have you been on the receiving end of a pitch from the marketing department trying to convince you to add some seemingly insignificant feature to your next system design? Or, should I more accurately ask, how many times have you been in this situation? Has marketing gone mad? Or have you overlooked something? Perhaps it's a little bit of both.
Take, for example, the various approaches that prevent buffer-underrun errors when writing to CD-Rs and CD-RWs. Sanyo's Burn-Proof (www.burn-proof.com) first found use in drives such as Plextor's (www.plextor.com) 121032A (Reference 1). After initiating a write, the Burn-Proof-enhanced CD-R/RW drive monitors the status of its buffer and, when it judges that an underrun error is imminent, switches off the laser and records the location's EFM (eight-to-fourteen-modulation) pattern. After the buffer refills, the drive accesses the previously written area of the disc, compares the data with the buffer contents, and, when it finds the last recorded point, restarts the laser.
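The monitor-suspend-relocate-resume loop described above can be sketched in a few lines. This is a simplified simulation of the behavior, not Sanyo's firmware; the class, the watermark threshold, and all method names are illustrative assumptions.

```python
LOW_WATERMARK = 2  # assumed threshold: suspend when fewer sectors are buffered


class WriteSimulator:
    """Toy model of a Burn-Proof-style underrun-protected write loop."""

    def __init__(self):
        self.buffer = []        # sectors delivered by the host, awaiting burn
        self.disc = []          # sectors already written to the disc
        self.laser_on = True
        self.suspend_count = 0  # how many times an underrun was averted

    def feed(self, sectors):
        """Host delivers more data; a sufficient refill resumes writing."""
        self.buffer.extend(sectors)
        if not self.laser_on and len(self.buffer) >= LOW_WATERMARK:
            # Locate the last recorded point (here, simply the length of
            # what's on the disc) and restart the laser there.
            resume_at = len(self.disc)
            self.laser_on = True
            return resume_at
        return None

    def burn(self):
        """Write one sector, unless an underrun is imminent."""
        if self.laser_on and len(self.buffer) < LOW_WATERMARK:
            # Underrun imminent: switch off the laser, keep position noted.
            self.laser_on = False
            self.suspend_count += 1
            return None
        if self.laser_on and self.buffer:
            sector = self.buffer.pop(0)
            self.disc.append(sector)
            return sector
        return None
```

Feeding the simulator slower than it burns forces a suspend; a refill then returns the resume position instead of ruining the "disc," which is the whole point of the technique.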
Even in its initial implementation, Burn-Proof works great; I'm unable to create a "coaster" (ruined disc) no matter how many other system-resource-intensive applications I run while I'm writing. It also offers drive manufacturers a potential cost reduction, because they can now include lower-density buffer memories in their products. So, why is Burn-Proof technology in its third iteration? Credit Ricoh's JustLink (www.ricoh.co.jp/cd-r/e-/e_asia/drive/justlink.html). As a result of its laser disable and re-enable, first-generation Burn-Proof created an approximately 40-micron sector-to-sector gap in the written data stream. Although well within Orange Book CD error-detection and -correction capabilities, that gap is nonetheless longer than JustLink's 2- to 4-micron gap. In response, Sanyo (www.sanyo.com) shrank the gap to 15 microns in second-generation Burn-Proof, and the third generation essentially eliminates it.
Or take AGP (Accelerated Graphics Port), which the computer industry developed to move graphics data off PCI and onto a dedicated high-speed link between core logic and the graphics subsystem. AGP advocates envisioned a near-future era in which the entire operating system and application GUI would be 3-D, high-resolution, and color-rich. In reality, the GUI is still 2-D, and, especially with LCDs, you probably don't have much motivation to run the display at anything higher than 16-bit color. Few users configure their screens for resolutions higher than 1024×768 pixels, because if they did, icons and text would shrink to imperceptible dimensions.
Applications for 3-D technology, beyond games and niches such as digital-content creation and CAD, haven't emerged. And, except in high-end software and system configurations with high levels of colliding network and peripheral traffic, plenty of average and instantaneous bandwidth is available on PCI for graphics. Reality aside, AGP has essentially obsoleted PCI-based graphics. The Internet newsgroups I monitor are full of participants overclocking AGP and struggling to enable its optional features, such as sidebanding and pipelining, chasing ever-higher frame rates whose benefits, ironically, their eyes and brains can't perceive. And the AGP Implementors Forum (www.agpforum.org) is busy working on the AGP 8× specification, with the goal of again doubling the sustained bandwidth of today's AGP 4×.
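The bandwidth gulf between PCI and the successive AGP generations is easy to quantify with back-of-the-envelope arithmetic. The figures below are the standard theoretical peaks (32-bit buses, 33.33- and 66.66-MHz clocks, 1/4/8 transfers per clock); real-world sustained throughput is lower, and the helper function is purely illustrative.

```python
def bus_bandwidth(clock_mhz, bytes_per_clock, transfers_per_clock=1):
    """Theoretical peak bandwidth of a parallel bus, in Mbytes/sec."""
    return clock_mhz * bytes_per_clock * transfers_per_clock


pci    = bus_bandwidth(33.33, 4)        # 32-bit, 33-MHz PCI: ~133 Mbytes/sec
agp_1x = bus_bandwidth(66.66, 4)        # AGP 1x: ~266 Mbytes/sec
agp_4x = bus_bandwidth(66.66, 4, 4)     # AGP 4x: ~1.1 Gbytes/sec
agp_8x = bus_bandwidth(66.66, 4, 8)     # proposed AGP 8x: ~2.1 Gbytes/sec
```

Even AGP 1× doubles PCI's peak, and AGP 4× offers roughly eight times PCI's bandwidth; the irony the column notes is that 2-D desktop workloads rarely come close to saturating the smaller number.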
What's the point of these and other seemingly useless features? In some cases, they're simply the result of a less-than-perfect crystal ball; folks will eventually need them, but marketers have overestimated how quickly hardware and software developments will create that need. In other cases, they're the result of marketing's need to list more features on the spec sheet for this year's widgets than for last year's, both to convince last year's customers to continue their patronage and to differentiate your company from its competition. Is this tactic lying? Sometimes, yes, and eventually customers will figure out the ploy, to your company's detriment. But if the feature is genuinely useful to even a small subset of your potential customers, now or in the future as a hedge against obsolescence, I'd call it "exaggeration" rather than "untruth." At that point, the decision of whether to add the feature comes down to who's the better negotiator: you or marketing?
Remember, product sales are the ultimate source of your paycheck. Sometimes, a feature adds more cost than it's worth and can't justify itself in terms of increased sales or profit margins. Sometimes, though, you might have to bite your tongue and do a little more work than your analytical mind believes is required. Ally or antagonist? Which one is the latest feature request to hit your desk? Good luck determining the difference!