Test trends: Commercial scan compression tools

July 17, 2013

About a dozen years ago, the world of test had reached an economic impasse: most digital designs had become sufficiently complex that standard scan testing techniques were no longer cost-effective. Because scan chains were too long, it was taking too much time to scan data through them when applying manufacturing tests. Moreover, even though at-speed testing had become essential for screening nanometer defects, many low-cost and legacy testers did not have the memory capacity to store all the data.

The emergence of commercial scan compression tools at that juncture changed the economics of test. Instead of connecting flops into a few very long scan chains, these tools created hundreds of short chains connected to an on-chip compressor-decompressor (CODEC). Compression enabled substantial cost savings through test application time reduction (TATR) and test data volume reduction (TDVR). TATR lowered costs for semiconductor manufacturers testing parts in high volume because more parts could be tested in less time. TDVR reduced tester memory requirements, making room for both stuck-at and at-speed tests and thereby improving defect coverage.
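
To see where the savings come from, consider a rough back-of-the-envelope model. The flop, pattern, and chain counts below are illustrative assumptions, not figures from any particular design; the point is only that shift cycles scale with chain length, so splitting the flops into many short CODEC-driven chains cuts test time by roughly the ratio of chain counts:

    # Hypothetical scan-test cost model; all numbers are illustrative
    # assumptions, not measurements from a real design.
    FLOPS = 1_000_000      # scan flops in the design
    PATTERNS = 10_000      # scan test patterns
    CHANNELS = 8           # tester scan channels available

    def shift_cycles(num_chains):
        """Shift cycles per test set ~= longest chain length x patterns."""
        chain_length = -(-FLOPS // num_chains)  # ceiling division
        return chain_length * PATTERNS

    uncompressed = shift_cycles(CHANNELS)   # one long chain per channel
    compressed = shift_cycles(800)          # CODEC fans out to 800 short chains

    print(f"uncompressed: {uncompressed:,} cycles")
    print(f"compressed:   {compressed:,} cycles")
    print(f"TATR:         {uncompressed / compressed:.0f}x")

Because each tester channel supplies one bit per shift cycle, the same factor applies to the data the tester must store, which is the TDVR side of the coin.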

Compression requirements evolve
Since compression’s early successes, designers worldwide have embraced it as a design-for-test (DFT) methodology essential for lowering test costs and enabling high test quality. But compression requirements have evolved in the intervening years, making it necessary to also evolve the compression technology. Four manufacturing test trends have had the most influence on the direction of the technology.

The first of these trends is fewer pins available for testing. Test pin constraints have been driven by an increased focus on packaging costs and tighter form factors, especially for portable applications. Another consideration is that systems-on-chip (SoCs) now comprise dozens of internal and third-party IP blocks, often including multiple processor cores with their own embedded CODECs. Because chip-level test resources must be shared among all the cores in a design, fewer pins are available per core for test access, increasing the need for high compression with few test pins.

The demand for pin-limited compression has been further stimulated by the adoption of multisite testing, which is a cost-saving technique that screens multiple die simultaneously to reduce test application time. Today, up to 30% of cores and designs allocate fewer than seven pins for test. In a few years, more designs will require compression that utilizes only a single scan I/O pair.
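
The pin squeeze is easy to quantify. With hypothetical but representative numbers (all of the values below are assumptions for illustration), a tester's channels divided first across sites and then across cores leave very little per core:

    # Hypothetical multisite pin-budget arithmetic (all values assumed).
    TESTER_CHANNELS = 256   # digital channels on the tester
    SITES = 16              # die tested in parallel
    CORES_PER_DIE = 8       # cores sharing each die's test pins

    pins_per_die = TESTER_CHANNELS // SITES
    pins_per_core = pins_per_die // CORES_PER_DIE

    print(f"pins per die:  {pins_per_die}")    # 16
    print(f"pins per core: {pins_per_core}")   # 2 -> a single scan I/O pair

Two pins per core is exactly one scan-in/scan-out pair, which is why single-channel compression is on the roadmap.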

The second trend is more logic on-chip. Assuming the same number of scan channels, doubling the flop count more than doubles the compression needed to maintain the same test data volume and test cycle count, as the sketch below illustrates. In a few years, SoCs will likely be twice as complex as they are today, so TATR and TDVR must scale accordingly.
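
A hypothetical calculation shows why the relationship is superlinear: the data shifted per pattern grows with the flop count, and the pattern count itself tends to grow with the amount of logic (the 1.2x pattern growth below is an assumption for illustration):

    # Hypothetical scaling sketch; the growth factors are assumptions.
    def data_volume(flops, patterns):
        return flops * patterns   # bits in a full test set, to first order

    base = data_volume(1_000_000, 10_000)
    doubled = data_volume(2_000_000, 12_000)  # 2x flops, ~1.2x patterns

    print(f"data volume grows {doubled / base:.1f}x")  # 2.4x
    # Holding tester data and cycle count flat therefore requires ~2.4x
    # more compression -- more than the 2x growth in flops alone.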

The third trend is increasingly subtle manufacturing defects. To address nanometer test quality requirements, a growing number of semiconductor companies have begun deploying techniques such as slack-based transition testing and bridging testing that complement standard transition delay testing. Also, below 20 nm, on-chip process variations give rise to fault effects that require additional defect testing. All this extra testing increases test time, test data volume, and cost, so even higher compression will be needed to meet these more stringent TATR/TDVR requirements.

The fourth trend is faster scan test operations. Test time and cost can be further reduced by scanning test data in and out of designs faster, using a higher tester clock frequency or internal clocking methods to increase the rate at which internal scan chains are shifted. Scan rates have increased in the last few years to the extent that most SoCs are now scanned at 50-100 MHz. Because these rates will continue to increase steadily in the years ahead, compression tools must have the ability to limit power consumption during scan shifting and ensure timing closure for the test data paths.
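
The payoff from faster shifting, and the reason power becomes the limiter, both fall out of the same first-order relationship: shift time is patterns times chain length divided by shift frequency, while switching activity (and hence power) rises with that frequency. A hypothetical example, with assumed pattern and chain-length values:

    # Hypothetical shift-time calculation (pattern count and chain
    # length are assumed values for illustration).
    PATTERNS = 10_000
    CHAIN_LENGTH = 1_250   # flops in the longest internal chain

    for f_mhz in (10, 50, 100):
        seconds = PATTERNS * CHAIN_LENGTH / (f_mhz * 1e6)
        print(f"{f_mhz:>3} MHz shift clock -> {seconds:.3f} s of shifting per die")

The 10x speedup from 10 MHz to 100 MHz comes with roughly 10x the toggling per unit time, which is why shift-power management has to ride along with faster scan.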

Need for innovation in compression technology
Over the past decade, commercial test compression tools have been enhanced to address some of these requirements. Improvements in fault coverage and compression results, however, have been offset by an increase in DFT implementation complexity amid shrinking design schedules. For example, today’s DFT flows tend to combine elements of bottom-up and top-down integration where CODECs are embedded in each core and at the top level. Insertion of dedicated clock controllers and other test logic at the top level of a design may be needed to meet the test goals of the blocks, complicating the design’s clocking scheme and timing behavior.

Even more challenging is the fact that a large SoC might require variations in DFT architecture—for example, insertion of observation points or special X-tolerant logic to improve test coverage and compression performance—across different blocks. Significant engineering insight is required in these situations to determine which architectural variants to implement. The additional time and effort for these modifications can be costly, and if DFT issues are encountered late in the project, they can impact a design’s tapeout.

Innovation in compression technology is therefore needed to address these DFT closure challenges as well as the test cost and test quality requirements of the next generation of SoCs. The technology must deliver 2-3x higher compression, operate from a single scan channel, and support scan rates above 100 MHz, all while simplifying the DFT implementation process itself.
