30 years of DSP: From a child's toy to 4G and beyond

August 27, 2012

As Texas Instruments principal fellow Gene Frantz tells it, the aha moment for the company’s DSP pioneers came in the late 1970s, shortly after TI’s seminal Speak & Spell learning toy hit retailers’ shelves. Frantz recalls that a customer asked, “If you can [use DSP to] add speech synthesis to a toy, what else can you use it for?”

This year, as TI celebrates its 30th year in the DSP market, that long-ago question has been answered many times over. Without DSP and the advances it has enabled in audio, graphics, and multimedia processing, there would be no “infotainment” content, no smartphones or tablets, no Internet or ecosystem of apps.

TI’s “toy” technology not only moved the company into a new business but also set the stage for developments by TI, its competitors, and tool vendors that have pushed DSP technology into diverse applications and markets. At the same time, traditional DSP devices have faced competition from an array of alternative signal-processing platforms, including CPUs with DSP-oriented features; digital signal controllers, which pair a DSP core with an MCU; FPGAs, which can implement custom signal-processing data paths or even custom programmable processors; and, most recently, massively parallel graphics processors that can tackle data-parallel problems.


Figure 1: TI’s Speak & Spell team—from left, Gene Frantz, Richard Wiggins, Paul Breedlove, and Larry Brantingham, showing off the product at its introduction—went on to push DSP technology into diverse applications and markets.

The roots of DSP technology predate the Speak & Spell by several years. In the early 1970s, scientists began using off-the-shelf TTL discrete logic chips to implement specialized signal-processing “engines.” The early systems were relatively slow and consumed a lot of space. TRW shipped the first practical parallel multiplier design in 1973 and added bit-slice ALUs two years later. But at several hundred dollars just for the multiplier chip, the only customers that could afford such a product were research laboratories, medical-scanning equipment makers, and the military.

In 1978, American Microsystems Inc. (AMI) announced the first single-chip IC designed specifically for DSP: the 12-bit S2811. AMI devised a truly innovative circuit design but implemented the chip in a radical “V-groove” MOS technology that never yielded volume commercial products.

The following year, Intel Corp introduced the Intel 2920 16-bit “analog signal processor,” so called because Intel had designed the chip as a drop-in analog-circuit replacement, complete with on-board A/D and D/A converters. The 2920 processed analog signals digitally, but it lacked a parallel multiplier; what’s more, its 600-nsec cycle time made it too slow to perform useful work in the audio spectrum, where the first high-volume DSP chip market would eventually materialize. At a telephony-grade 8-kHz sample rate, for example, a 600-nsec instruction cycle leaves only about 200 instructions per sample, far too few when every multiply must be synthesized from shifts and adds.

The first “true” single-chip DSPs—which market-analysis firm Forward Concepts defines as having parallel MAC (multiplier-accumulator) circuits—emerged in early 1980 from Bell Labs and NEC. The Bell Labs chip, the DSP-1, was a captive device used in AT&T and Western Electric equipment. NEC’s µPD7720 was the first true single-chip DSP shipped in volume to the merchant market. Although hampered by primitive development tools, the NEC chip offered sufficient speed—a 122-nsec cycle time with a two-cycle MAC—to perform useful work in the audio spectrum.
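
The parallel MAC is treated as the dividing line because virtually every core DSP task (filtering, correlation, transforms) reduces to long runs of multiply-then-add operations. As a rough illustration, the C sketch below shows the inner loop of a FIR filter, the per-sample workload that single-cycle MAC hardware accelerates; the tap count, Q15 fixed-point format, and function name are assumptions of this example, not details of the chips described here.

/* Minimal FIR-filter inner loop, shown only to illustrate the
 * multiply-accumulate (MAC) operation; NTAPS and the Q15 format are
 * arbitrary choices for this sketch. Assumes the coefficients are scaled
 * so the 32-bit sum cannot overflow; real DSPs add accumulator guard
 * bits for the same reason. */
#include <stdint.h>

#define NTAPS 32

/* Compute one output sample of a direct-form FIR filter.
 * coeffs[] holds the filter taps, delay[] the most recent inputs,
 * both in Q15 fixed point. */
int16_t fir_sample(const int16_t coeffs[NTAPS], const int16_t delay[NTAPS])
{
    int32_t acc = 0;                          /* accumulator */
    for (int i = 0; i < NTAPS; i++)
        acc += (int32_t)coeffs[i] * delay[i]; /* the MAC: multiply, then add */
    return (int16_t)(acc >> 15);              /* rescale the Q30 sum to Q15 */
}

A hardware MAC retires each tap in one or two machine cycles instead of the dozens a software multiply needs, which is what put real-time audio filtering within a single chip’s reach.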

In the late 1980s, Hiromitsu Yagi of Ricoh redesigned the original AMI S2811 chip for a conventional NMOS process. Yagi’s work resulted in the Ricoh RD28211 and the AMI S28211.
