All design is analog—some more so than others
By Gabe Moretti, Technical Editor - March 31, 2005
The invention of the transistor made it possible for engineers to use digital logic. This approximation of nature uses binary logic to implement both computational and human-interface functions. The processes vendors use in fabricating ICs make it possible for most designers to achieve success simply by knowing logic design and ignoring the reality of the physical laws governing electrical-circuit behavior. Les Spruiell, director of technical marketing at Xoomsys, contends that the electronics industry owes much of its success to the fact that digital logic is actually an overdriven analog circuit: "Since you can view digital circuits as either on or off, much of the underlying nonlinear complexity is hidden from view into neat little packages of logic. This simple concept has allowed design teams to enthusiastically follow Moore's Law to the letter. Recently, designers have begun to suspect that something is seriously wrong. Their neat little packages of well-behaved logic have become huge unruly monsters fraught with all manner of strange behavior."
Today's designs are increasingly complex, requiring more functions at lower cost and smaller size. Power consumption, signal interactions, and parasitic phenomena all result in integration bottlenecks. With current technology, advanced simulation tools and the ability to reuse data are the bases for working with such complex designs. The complexity of designs continues to increase, but a shorter time to market and less amortization for design costs create a design- and verification-cycle bottleneck. One design option is to run tools interactively to obtain faster feedback. Additionally, more automation during physical verification can alleviate the situation. Tools that work together, an automated verification process, and tools that allow faster mixed-signal verification can also resolve some of these issues.
Analog challenges in IC design
Since the introduction of the 180-nm process, digital design has acquired analog overtones, with behavior that digital designers attributed to "parasitic" effects. The problem lies in the physical effects of truly tiny process geometries. The original digital paradigm usually allowed the design team to ignore the coupling between signal lines or the transition times down a long line. Problems of noise or crosstalk were the exceptions, but a diligent team could find and fix them.
The growth of digital consumer electronics means that the analog circuit has become the golden child of today's semiconductor industry. Analog circuits play a crucial role in a variety of applications, such as wireless communications and networking, wireless computer peripherals, MP3 players, and digital cameras. This trend has not only made analog semiconductors more important and profitable for chip companies, but also made the job of analog designers much tougher. Ten to 15 years ago, analog designers were likely creating circuits using a few to a few dozen transistors. Analog designs did not need to use the most advanced silicon technology; thus, the designers had access to mature device models and design rules and benefited from the experience gained with previous designs implemented on the same process. Basic circuit simulators were adequate to help the designer produce acceptable designs.
If a circuit topology met specifications on the last chip, the designer could be confident that it would work again. After all, it used the same silicon process. If a circuit failed to meet specifications, the designer could base design changes on measurement data from the prototype and respin the silicon. It was feasible to manually debug a circuit with just a few devices and mask sets, because a mature silicon node often cost less than a fancy new one. In this way, the designer used silicon to verify a design. In fact, some designers called this method the silicon simulator. Today, analog designers may face implementing a 2.4-GHz RF transceiver using thousands of transistors in a 65-nm silicon technology that may exhibit wide performance variations across the process, voltage, and temperature ranges. This transceiver shares silicon with millions of gates of digital logic that produce substrate and power-supply noise that the transceiver must tolerate. And the cost of a silicon respin has escalated to more than half a million dollars.
The old method of relying solely on experience from previous designs and using silicon for verification doesn't work in this case. It is difficult or impossible for the designer to intuitively determine which of the thousands of devices in the transceiver is most vulnerable to the power-supply noise generated by the switching of the digital logic (Figure 1).
Ravi Subramanian, PhD, president and chief executive officer of Berkeley Design Automation, believes that designers need to supplement their experience and intuition with verification tools that accurately analyze all the key characteristics of complex analog circuits implemented on advanced silicon: "The tools need to employ advanced circuit-analysis techniques that accurately capture the complex physical behavior of analog circuits like RF transceivers to ensure working designs and minimize or eliminate the need for the silicon simulator."
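Corner analysis is one standard way such tools probe process, voltage, and temperature variation rather than trusting a single nominal simulation. The sketch below is purely illustrative: the first-order gain model and corner values are invented for the example and do not come from any real process.

```python
from itertools import product

# Hypothetical first-order model of amplifier gain (dB) versus PVT;
# the coefficients are illustrative, not from any real process.
def gain_db(process, vdd, temp_c):
    proc_shift = {"slow": -1.5, "typical": 0.0, "fast": 1.0}[process]
    return 20.0 + proc_shift + 4.0 * (vdd - 1.2) - 0.01 * (temp_c - 25)

# Enumerate every process/voltage/temperature corner.
corners = product(["slow", "typical", "fast"],   # process corner
                  [1.08, 1.20, 1.32],            # supply voltage (V)
                  [-40, 25, 125])                # temperature (deg C)

results = {corner: gain_db(*corner) for corner in corners}
worst = min(results, key=results.get)
print(f"worst-case corner {worst}: {results[worst]:.2f} dB")
```

A real flow sweeps thousands of devices and many more parameters, which is why the text argues that intuition alone no longer scales.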
Once a product enters manufacturing, analog circuitry presents another new problem: yield variations. Analog circuitry is more sensitive than digital circuitry to process variations, and small variations in a process or fab environment can impact the number of good die on a wafer. Circuit trimming is one technique analog designers use to improve yield. New techniques that allow the use of programmable logic for circuit trimming have simplified the use of this approach (see sidebar "OTP nonvolatile memory for on-chip analog trimming").
Stephan Filarsky, product-marketing manager at Sagantec, explains that, whereas in the digital world the placement of devices mainly influences the timing of the circuit, the layout work concentrates on areas in which the timing is critical. Circumstances differ in the analog world, in which the placement of devices and their interconnections have a direct influence on the overall function of a design. To control this situation, designers may use analog constraints, he says. Today, circuit and layout designers manually handle these constraints, based upon their experience in designing analog layout. Engineers rarely have a way to keep the constraints information with the design database. Because this information is absent from both the layout and the netlist, designers must constantly reinvent it.
In addition, designers spend most of their time and effort in physical analog design on tedious detail work at the device, wire, and polygon levels. At the same time, many final instances of circuit and layout share a similar topology and differ only by device parameters and second-order geometric details. Examples include repeated manual resizing of geometries to match output loads or migration to other processes while meeting symmetry and matching requirements. Automating these geometry and device modifications can greatly impact overall analog-design productivity.
Pankaj Mayor, group director of business development for manufacturing alliances at Cadence, observes that the electronic design process is now so complicated that no one company can do it alone. As Figure 2 shows, historically, the system houses that excelled in vertical integration were the leaders in electronic design. Today, a number of companies in varied businesses must collaborate by design or by chance to ensure successful product development that meets functional requirements, market timing, and development-cost goals. Companies that have successfully developed products at 130- and 90-nm process nodes have pioneered what Mayor calls virtual reaggregation.
Tom Quan, vice president of marketing at Applied Wave Research, explains, "The separate environments and databases prevent designers from performing signal-integrity analysis early in the design cycle, when it is most critical." Inconsistent environments and poor modeling and extraction capabilities make it difficult to perform effective cosimulation and analysis, in either the time domain or the frequency domain, of the signal trace between the physical-implementation phase at the IC and module/pc-board levels. In addition, a break in the connection between the chip and the package physical-design tools results in more delays and rework due to poor comprehension of the interfaces.
When dealing with hardware design, you can think of design in two general categories: synthesis and custom. For many designers, synthesis tightly couples to digital design because of the low-dimensional complexity of digital logic, and custom closely ties to analog/RF design, because the high-dimensional problems associated with custom designs require manual analysis. In some ways, the resurgence of custom design tightly ties to the recent explosion of consumer products that have significant analog or RF content, such as wireless products. Higher speeds have caused digital to become analog in instances including multiple clock regions, increasingly complex clock-multiplication and -synchronization techniques, noise control, and high-speed I/O. RF-circuit design has seen rapid growth due to the rise in wireless-communications development and demand for such technologies as GSM (Global System for Mobile communications), CDMA (code division multiple access), Bluetooth, Wi-Fi (wireless fidelity), and GPS (global positioning system). This expansion results in a bottleneck in the IP (intellectual-property)-creation sector, because it outpaces the growth in the number of expert, skilled designers.
For example, manufacturers daily produce millions of cell-phone power amplifiers. However, the number of engineers who can design such products grows slowly, and these designers need more expertise than before—in both RF- and IC-design knowledge—to be effective in the industry. Unfortunately, the US education system has concentrated in the last 30 years on developing logic designers who are proficient in digital-logic design but have little or no experience with the physics of electronics and are thus unprepared to deal with analog effects. Other countries, including China, India, Russia, and some European countries, have continued to teach physics and electronics theory to their engineering students and now offer a workforce that is better prepared than US-trained designers to solve today's design problems. If US universities fail to address this problem, they will dispel the notion that outsourcing jobs overseas impacts only low-skilled workers, and the United States will lose its leadership position in IC design.
Today's designs require complex verification and typically have many more variables than digital abstraction can handle. For example, radio-transceiver circuits have a broad range of requirements, including noise figure, linearity, gain, phase noise, and power dissipation. Advanced simulation and verification tools are necessary to verify these complex designs.
In the last few years, a number of EDA companies have offered analog synthesis tools in the hope of diminishing the effort required to design analog circuits. Most of these efforts have achieved only partial success at best, because some issues need further analysis and improvements in both the tools' functions and engineering methods.
Moving from a behavioral description to a schematic and from there to a completed layout requires many more parametric trade-offs than in a digital environment. The analysis space grows exponentially with the size of the circuit being designed and quickly hits practical limits for computation resources and tool capacities. Engineers can do most of today's analog design work only at the transistor or the layout levels because of the complexity of parasitic behavior, thermal and process variation, and noise interactions.
Many more parameters mean many more constraints. Analog-synthesis results turn out well only when designers specify many and greatly detailed constraints. The time required to develop and specify these constraints works against the time gained in the synthesis process, canceling out a significant portion of the gains that might otherwise be possible.
Much of the success in moving to synthesis in the digital world came about due to the development of the standard cell-library concept. With emphasis on a few key variables, such as delay time and output loading, these libraries are parametric for easy use by relatively straightforward synthesis/timing engines. However, analog IP in general does not lend itself well to this approach. To date, no one has been successful at moving the general case of analog IP from a continuous range of parameters to a discrete range similar to digital libraries. Analog IP in general is also more sensitive to process-variation and circuit-interaction effects, impeding the ability to adequately characterize these libraries. When design requires this kind of attention to ensure first-pass success, it seems unlikely that designers will accept "pushbutton" synthesis any time soon except in the most well-controlled circuit applications and environments.
Simulation required to verify designs is also becoming a greater bottleneck than it is in digital design. Despite much work, Spice, developed more than 30 years ago, is still the circuit simulator of choice. In the last few years, a handful of EDA companies have introduced versions of fast Spice. These simulators trade off some accuracy for simulation speed. They allow designers to verify a circuit to within 5 to 10% accuracy, eliminating the most obvious design problems in less time. Once the circuit reaches this level of accuracy, engineers can afford to invest the time a traditional Spice simulator needs to finish verification. Mentor Graphics offers a single-kernel verification environment that spans the design-abstraction levels from architectural to circuit design. It supports a number of languages and modeling levels, such as VHDL, Verilog, VHDL-AMS, Verilog-A(MS), Spice, and C.
A language-neutral environment with multiple simulation algorithms lets designers choose the best combination for their tasks. Combining various modeling technologies in one familiar interface saves time and boosts productivity. Steve Lewis, director of the Virtuoso business unit at Cadence, says that Cadence is continually striving to update its simulation platform to meet the needs of mixed-signal-IC designers. The newest version of the platform, AMS-Ultra, combines Cadence's analog simulator, Spectre, with Spectre RF to support communication circuit design, and UltraSim, a fast Spice simulator.
Chip layout is also presenting new challenges. Designers still favor grid-based, or Manhattan, layout for large designs, because these methods are faster. However, the time the designer takes to reach a finished design that meets all the rules and constraints, including a high yield, matters more than the time a designer takes to run the routing. Designers of commodity devices trade off many revisions of a mask set, costing millions of dollars, for just a one-percentage-point increase in yield.
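The economics behind that trade-off are easy to check with back-of-the-envelope arithmetic. The die count, wafer volume, and selling price below are assumed numbers chosen only to illustrate the scale, not figures from the article.

```python
# Back-of-the-envelope value of a one-point yield gain for a
# high-volume commodity part. All inputs are assumed for illustration.
dies_per_wafer = 2000
wafers_per_month = 5000
price_per_good_die = 1.50  # USD

def monthly_revenue(yield_fraction):
    good_dies = dies_per_wafer * wafers_per_month * yield_fraction
    return good_dies * price_per_good_die

# Going from 80% to 81% yield on the same wafer starts:
gain_per_month = monthly_revenue(0.81) - monthly_revenue(0.80)
print(f"one yield point is worth ${gain_per_month:,.0f} per month")
```

At these assumed volumes a single yield point is worth roughly $150,000 a month, or close to $2 million a year, which is why respinning a multimillion-dollar mask set for one point of yield can still pay off.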
Cadence has been experimenting for years with its X-based layout technology, and, although benchmarks have shown some promising results, the industry has not yet widely accepted this technique. Pulsic is proposing a shape-based approach to routing. Mark Waller, Pulsic's vice president of research and development, explains, "Ensuring that both analog and digital portions of an IC operate correctly together and no area is wasted in unnecessary white space or guardbands is a headache in itself."
Shape-based routing has been around for many years in board design, but its strength in area efficiency, signal integrity, and yield improvement, particularly in analog- and mixed-signal designs, is driving its use for chip design. PC-board-tool vendor Racal-Redac developed shape-based routing in the 1990s. The technique does not use an abstract grid but creates a "flood" in one direction until it reaches an obstruction. It then finds an unobstructed edge in the direction of the target and floods in that direction until it reaches another obstruction, and the process repeats until it reaches the target. The technique assesses the route for its length and evaluates other factors, such as parasitics, giving it significant flexibility. Because it uses real shapes of objects, designers can place tracks as close as possible to obstructions, creating a more compact routing pattern. And, because the net associated with each object is known, designers can apply more powerful rules to control spacing from those objects. For instance, they can use a larger space between analog and digital nets than they might have used between two digital nets.
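The flood-and-backtrack idea behind this process can be sketched compactly. A true shape-based router floods over real geometry rather than a grid, so the coarse-grid search below is a simplification that shows only the core mechanism: expand the flood around obstructions, remember how each point was reached, and walk back from the target to recover the route. The function name and arguments are hypothetical, not from any routing tool.

```python
from collections import deque

def flood_route(start, target, obstacles, width, height):
    """Flood outward from start, recording how each point was reached,
    until the flood touches the target; then backtrack for the route."""
    parent = {start: None}          # point -> point it was flooded from
    frontier = deque([start])
    while frontier:
        point = frontier.popleft()
        if point == target:
            path = []
            while point is not None:        # walk parents back to start
                path.append(point)
                point = parent[point]
            return path[::-1]
        x, y = point
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nxt[0] < width and 0 <= nxt[1] < height
                    and nxt not in obstacles and nxt not in parent):
                parent[nxt] = point
                frontier.append(nxt)
    return None  # flood exhausted: target unreachable

# Route around a partial wall at x = 2 on a 5-by-3 area.
print(flood_route((0, 0), (4, 0), {(2, 0), (2, 1)}, 5, 3))
```

In a production router, each flood step would also carry cost terms for length and parasitics, which is what lets the technique trade route length against electrical quality as the text describes.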