Think for a minute about the thousands of equations, both mathematical and conceptual, that define the natural events you harness every day in your lives as engineers. The simple ones are the first that pop into my mind, not just because their conciseness makes them easy to remember, but because they also convey a sense of elegance. I'm amazed, even decades after first learning equations such as V=IR, E=mc², F=ma, and C=2πr, that such simple expressions accurately capture and explain such complex real-life behavior.
This year, I heard another such equation for the first time. Theo A.C.M. Classen, chief technology officer of Philips Semiconductors, quoted the expression Usefulness = log(Technology) in a keynote talk he gave on the need for, and ways of achieving, higher computing power in silicon. He named this equation the "Logarithmic Law of Usefulness" and also stated its inverse: Technology = exp(Usefulness).
What does this equation mean, and how does it affect your jobs? Usefulness is the perceived value of a new function, application, service, or combination of these, compared with its alternatives and measured by its daily-life importance, interfacing ease, and entertainment value. Technology, of course, comprises the circuits and algorithms you develop, with such metrics as Moore's Law measuring progress. How have you been so readily able to absorb exponential technology improvements, in many cases wishing for even more rapid progress?
Classen claims, and I concur, that a linear improvement in usefulness requires that you throw an exponential amount of technological advancement at the problem. He quotes a number of examples:
- the fact that system performance improves noticeably only if you add memory in increments of 10 or 100;
- the bandwidth that video telephony requires compared with speech;
- the usability improvements of DOS versus Windows, or of Office 97 versus previous application suites, plotted against the greater CPU, memory, and hard-drive capabilities required to support them;
- Digital Versatile Discs versus video CDs;
- Global System for Mobile communications versus Advanced Mobile Phone Service cellular; and
- on-screen TV menus versus TV Guide.
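The pattern behind these examples can be sketched in a few lines of code. Classen's column doesn't specify the logarithm's base, so this sketch assumes base 10 purely for illustration; the function name `usefulness` is mine, not his.

```python
import math

def usefulness(technology: float) -> float:
    """Classen's Logarithmic Law of Usefulness: U = log(T), base 10 assumed."""
    return math.log10(technology)

# Each tenfold (exponential) jump in technology buys only one more
# equal-sized (linear) step in perceived usefulness.
for tech in (10, 100, 1_000, 10_000):
    print(f"technology = {tech:>6} -> usefulness = {usefulness(tech):.1f}")
```

Running the loop shows usefulness climbing by a constant step while technology multiplies tenfold each time, which is exactly the point of the memory-increment and bandwidth examples above.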
I see similar examples every day in my work: the greater complexity of advanced synchronous DRAMs compared with asynchronous alternatives of a few years ago; the intricate new graphics-chip architectures required to slightly increase frame rate, display resolution, or perceived image quality; and the radically evolving programmable-logic devices and design software now in the planning stages that hope to take system-on-chip integration to the next level.
Classen's Logarithmic Law of Usefulness explains why society's products have so easily consumed the exponential rate of improvement in technology over time. Its inverse explains why you'll continue to need so much technology to make substantial improvements in, for instance, highway congestion, computer-based education, the environment, and health care.
These are trends extending over years or decades. Classen's equations may be of little comfort to those of you currently between jobs or those of you whose stock portfolios have taken a dip due to a momentary "hiccup" in the NASDAQ. But I can't refute the long-term impact of the Logarithmic Law of Usefulness, and it drives home for me the reason that many people refer to the current period as the Electronic Age. It's been an unpredictable era so far, and, if Classen is right, things will get only more exciting. Hang on and enjoy the ride.
Contact me at firstname.lastname@example.org.