What designers need to know about structural test
By Marc Loranger, EEdesign
March 6, 2003 (6:34 p.m. EST)
URL: http://www.eetimes.com/story/OEG20030306S0058
Facing both increasing competitive pressure and rising device complexity, integrated circuit (IC) companies are looking for more effective strategies to speed delivery of higher quality products to market. Until now, manufacturers have relied on engineers to write test programs that identify failures in device function, and have acquired sufficiently powerful automatic test equipment (ATE) to avoid bottlenecks in production test and product shipment. Rising device complexity, however, drives a proportionally faster increase in test complexity that simply outstrips the ability of test engineers and ATE to deliver cost-effective functional test solutions. Now, IC companies are beginning to deploy structural test strategies designed to uncover manufacturing defects within the most complex ICs. At the heart of these new approaches, companies combine sophisticated design-for-test (DFT) software with flexible test platforms that support specialized structural test techniques. Together, this combination of test equipment and DFT software promises to help semiconductor manufacturers cut cost-of-test and time-to-volume, even as device speed and complexity continue to advance.
Need for structural test
Each new generation of semiconductor process technology enables designers to pack more circuitry into a smaller die and deliver ICs with more functional capability. In traditional approaches, increased device capability translates into a greater number of test patterns drawn from larger vector sets. As a result, increased functional capability drives longer test development time, more involved test programs and longer production-test time. Because of the explosion in test complexity associated with traditional approaches, total test costs already account for one-third to one-half of the total cost of complex ICs and are predicted to increase.
The sheer size, speed and functional capability of leading-edge devices will continue to challenge the ability of test engineers and ATE to deliver high-throughput, reliable production test. At the same time, emerging manufacturing capabilities promise to exacerbate test burdens. Combined with gigahertz clock frequencies, today's process technologies of 0.15 microns and below introduce strong circuit-level interactions due to noise and cross-coupling effects, leading to new defect mechanisms that increase test complexity. For example, the use of copper metal layers in advanced process technologies adds another significant class of defect: whereas aluminum, a subtractive process, typically exhibits shorts where insufficient aluminum is etched away, copper, an additive process, tends to create opens or resistive opens, significantly increasing the volume of test data needed to uncover these defects. Open-type defects often manifest themselves only with specific multiple sequential patterns or through AC testing.
The emergence of system-on-chip (SoC) design methods further aggravates test complexity. Advanced process technologies now allow SoC designers to combine digital logic, memory and mixed-signal circuitry on a single die. While the sheer size of these devices alone stretches test resources to their limits, the diversity of circuit types pushes most traditional methods well beyond their capabilities. Companies can no longer rely on traditional functional test methods to control test costs and speed time-to-volume for today's more complex devices.
Even as increasing device complexity drives sharply rising unit test costs, emerging technologies threaten to hasten a total breakdown in the effectiveness of traditional test methods. Faced with this grim reality, semiconductor manufacturers are looking for alternative test strategies able to address both increasing device complexity and continuing pressure for decreased cost-of-test and time-to-volume. Today, as the industry rides the learning curve to more effective non-traditional test methods, users face significant confusion in available methods, leading to a period of chaos during the transition (Figure 1). During this period, confidence will grow slowly with increased experience with new structural test methods. In the meantime, however, engineers will face a growing gap between emerging structural test capabilities and legacy designs, which lack the fundamental internal mechanisms needed to support these new test methods.
If traditional functional test attempts to answer the question "Does it function correctly?", structural test attempts to answer the question "Was it manufactured correctly?" Conceptually, this is a much more modest task in production. Rather than testing for correct operation of complex functions, structural test attempts to uncover any defects in the underlying gates and interconnect within a device. This distinction is particularly important given the rapid evolution of semiconductor process technologies.
Figure 1 - Transition to non-traditional methods causes chaos
At the same time, new device capabilities like mixed-signal and RF will present significant additional test challenges for traditional test systems, fueling even further increases in capital equipment cost for conventional test architectures. Amidst this confusion, structural test will continue to evolve, offering the potential to reduce device test costs by simplifying the role of production test in uncovering manufacturing defects.
One of the attractive aspects of structural test is its ability to fit quickly and easily into the IC development process. Today, DFT software complements production test equipment in development flows that include:
- Design-oriented DFT methods, which help engineers make designs more testable through the use of specific design techniques such as additional test access points, on-chip test structures like scan and built-in self-test (BIST), and path or transition delay test methods. These techniques increase observability and controllability of circuit elements that otherwise require increased test development times to reach - hence increasing cost-of-test and delaying time-to-volume.
- Test assembly and preparation software, which helps engineers assemble test programs and transfer those programs to specific ATE.
- Structural test equipment, which provides specific capabilities needed to support structural test methods like scan and BIST structures.
DFT in design
The earliest forms of DFT focused strictly on the implementation of test structures inserted manually or automatically into the design. Today, however, leading IC companies have come to realize that informed DFT decisions made early in design can result in significant savings later.
As a result, the execution of a DFT strategy begins well before engineers begin designing circuits. Designers enhance testability of their circuits by integrating the most appropriate on-chip test structures - including simple test points, scan circuitry or sophisticated BIST structures, typically combined with a standard IEEE 1149.1 test access port (TAP) for off-chip control and exchange of test data. During test, ATE reads internal signals through the test points or uses the TAP to activate scan or BIST circuits. When activated, on-chip test mechanisms place the device in test mode, perform the test sequence and deliver test results, often as part of more involved test procedures.
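To illustrate the control handshake involved, the sketch below is a minimal Python model of the standard IEEE 1149.1 TAP controller state machine. The state names and TMS-driven transitions follow the standard; the helper function and the example TMS sequence are illustrative additions, not anything prescribed by the article.

```python
# Minimal model of the IEEE 1149.1 TAP controller state machine.
# Each entry maps a state to its (TMS=0, TMS=1) successor states.
TAP_NEXT = {
    "Test-Logic-Reset": ("Run-Test/Idle",  "Test-Logic-Reset"),
    "Run-Test/Idle":    ("Run-Test/Idle",  "Select-DR-Scan"),
    "Select-DR-Scan":   ("Capture-DR",     "Select-IR-Scan"),
    "Capture-DR":       ("Shift-DR",       "Exit1-DR"),
    "Shift-DR":         ("Shift-DR",       "Exit1-DR"),
    "Exit1-DR":         ("Pause-DR",       "Update-DR"),
    "Pause-DR":         ("Pause-DR",       "Exit2-DR"),
    "Exit2-DR":         ("Shift-DR",       "Update-DR"),
    "Update-DR":        ("Run-Test/Idle",  "Select-DR-Scan"),
    "Select-IR-Scan":   ("Capture-IR",     "Test-Logic-Reset"),
    "Capture-IR":       ("Shift-IR",       "Exit1-IR"),
    "Shift-IR":         ("Shift-IR",       "Exit1-IR"),
    "Exit1-IR":         ("Pause-IR",       "Update-IR"),
    "Pause-IR":         ("Pause-IR",       "Exit2-IR"),
    "Exit2-IR":         ("Shift-IR",       "Update-IR"),
    "Update-IR":        ("Run-Test/Idle",  "Select-DR-Scan"),
}

def walk_tap(tms_bits, state="Test-Logic-Reset"):
    """Apply a sequence of TMS values (one per TCK) and return the visited states."""
    visited = [state]
    for tms in tms_bits:
        state = TAP_NEXT[state][tms]
        visited.append(state)
    return visited

# From reset, TMS = 0,1,0,0 reaches Shift-DR, where the tester clocks
# scan or BIST data through the selected data register via TDI/TDO.
print(walk_tap([0, 1, 0, 0]))
```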
Used for digital logic, scan methods are the oldest and most widely used DFT technique. During the design phase, engineers use electronic design automation (EDA) tools to analyze a design and replace flip-flops and latches with scan-enabled registers, or to synthesize test structures directly into the design during logic synthesis. Depending on the nature of the design and its corresponding test requirements, designers can choose full-scan design, where all flip-flops and latches are included in the scan chain, or partial scan, where only some are included.
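As a conceptual picture of what a scan chain does (not any particular tool's output), here is a small Python sketch: a stimulus is shifted into the chain, the device captures one functional clock of its combinational logic, and the response is shifted back out for comparison against the fault-free value predicted by ATPG. The toy_logic function is an invented stand-in for a real netlist.

```python
# Hypothetical model of a single scan chain of n flip-flops.
# comb_logic stands in for the device's combinational next-state function.
def scan_test(stimulus, comb_logic):
    """Shift in a stimulus, capture one functional clock, shift out the response."""
    chain = list(stimulus)          # shift phase: scan-in replaces flop contents
    chain = comb_logic(chain)       # capture phase: flops load combinational outputs
    return chain                    # shift phase: response scanned out for comparison

# Toy "design": each flop captures the XOR of its neighbours
# (an assumption made purely for illustration, not a real netlist).
def toy_logic(flops):
    n = len(flops)
    return [flops[(i - 1) % n] ^ flops[(i + 1) % n] for i in range(n)]

pattern = [1, 0, 1, 1, 0, 0, 1, 0]
response = scan_test(pattern, toy_logic)
expected = toy_logic(pattern)       # fault-free response predicted by ATPG
print("pass" if response == expected else "fail")
```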
Later, during production test, ATE uses the TAP to enable the scan chain(s), shift in test vectors and shift out results. With today's multi-million-gate ICs, however, even partial-scan methods can result in long scan chains that exceed ATE pattern memory. Scan tests also tend to place much higher power demands on a device than normal operation does, because scan chains loaded with random data switch simultaneously and consume significant power. Advanced DFT methods anticipate these operational limitations by adjusting the number and depth of scan chains and the scan shift rate to match the capacity and performance of both the device and the manufacturer's ATE, while maintaining the primary goal of minimizing test time.
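This balancing act can be sketched with simple arithmetic. All of the figures below (flop count, pattern count, shift rate, tester channels and vector memory) are invented for illustration; the point is that spreading the same flip-flops across more chains shortens chain depth and scan time, until tester channel count or pattern memory becomes the limit.

```python
import math

# Illustrative figures only -- not from the article.
total_scan_flops = 2_000_000     # scannable flip-flops in the design
pattern_count    = 1_000         # ATPG patterns to apply
shift_mhz        = 25            # scan shift rate in MHz
ate_channels     = 64            # tester pins available for scan chains
ate_depth_bits   = 64_000_000    # per-channel vector memory in bits

for chains in (1, 8, 32, 64):
    depth = math.ceil(total_scan_flops / chains)          # flops per chain
    shift_cycles = depth * (pattern_count + 1)            # load/unload overlapped
    test_time_s = shift_cycles / (shift_mhz * 1e6)
    fits = chains <= ate_channels and depth * pattern_count <= ate_depth_bits
    print(f"{chains:3d} chains: depth {depth:8d}, "
          f"scan time {test_time_s:7.2f} s, fits ATE: {fits}")
```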
BIST methods
Built-in self-test (BIST) structures bring test mechanisms onto the chip itself, reducing the amount of data that needs to be exchanged with ATE. Applied routinely to regular designs such as memory or datapaths and to an increasing degree to random logic, existing BIST methods enable comprehensive fault coverage with minimal external intervention - a benefit that extends not only to production test but also to ongoing field maintenance. Embedded in the designer's circuits, these digital BIST elements can execute a range of tests needed to assess device health.
During production, ATE can activate the BIST sequence and read the pass or fail result. For device debug, design tools interact with the device-under-test (DUT) to diagnose the failure quickly. During field operation, field test equipment (or even a chip's host system) can similarly determine if the BIST-enabled device is functioning properly.
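The article does not tie logic BIST to a particular implementation, but a common scheme pairs a linear-feedback shift register (LFSR) that generates pseudo-random stimuli with a multiple-input signature register (MISR) that compacts the responses into a single signature, compared at the end against a precomputed "golden" value. The Python sketch below shows that idea; the tap positions and the toy circuit under test are arbitrary choices made for illustration.

```python
# Sketch of an LFSR/MISR style logic BIST run. The 16-bit tap positions and
# the toy circuit-under-test are arbitrary, illustrative choices.
def lfsr_next(state, taps=(15, 13, 12, 10), width=16):
    """Advance a Fibonacci LFSR one cycle and return the new state."""
    fb = 0
    for t in taps:
        fb ^= (state >> t) & 1
    return ((state << 1) | fb) & ((1 << width) - 1)

def misr_step(sig, response, taps=(15, 13, 12, 10), width=16):
    """Fold one response word into the running signature."""
    return lfsr_next(sig, taps, width) ^ (response & ((1 << width) - 1))

def toy_circuit(stimulus):
    # Stand-in for the combinational logic under test.
    return (stimulus * 3 + 1) & 0xFFFF

def run_bist(cycles=1024, seed=0xACE1):
    state, signature = seed, 0
    for _ in range(cycles):
        state = lfsr_next(state)
        signature = misr_step(signature, toy_circuit(state))
    return signature

golden = run_bist()                 # precomputed on a known-good model
print("pass" if run_bist() == golden else "fail")
```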
Today, SoC designs often need to use mixed-signal blocks for peripheral or interface functions, significantly complicating traditional test methods. Mixed-signal BIST offers a means for addressing this new challenge. In fact, mixed-signal BIST fills an existing need for performance testing of today's high-speed digital designs. Performance testing of these circuits is essentially an analog function, and mixed-signal BIST can address current problems in achieving clean signals through conventional ATE methods. By placing mixed-signal BIST instrumentation literally within microns of the circuit under test, engineers can avoid the precision-testing problems that arise with external measurement of high-speed signals despite engineers' best attempts to create suitable DUT load boards.
By confining complex tests or precise measurements to the chip itself, mixed-signal BIST relieves requirements for more expensive ATE configurations. Designers can employ BIST mechanisms that test analog or mixed-signal circuits, but deliver a digital result, permitting the manufacturer in some cases to use existing digital testers even for some mixed-signal ICs. In fact, by handling pre-processing and data reduction through BIST mechanisms, these methods simplify and reduce the amount of processing required by the ATE, permitting companies to use lower cost ATE to test more advanced ICs.
Mixed-signal BIST is emerging in a variety of forms, ranging from general-purpose BIST building blocks to application-specific BIST. Delivered as pre-built intellectual property (IP) blocks, general-purpose BIST enables designers to incorporate self-test functions with little or no customization to the IP block. Placed alongside the designer's circuit, this type of standalone test instrumentation operates independently of the design but provides test capabilities needed to handle complex test requirements.
For example, general-purpose BIST IP can measure time delays between clock edges with picosecond accuracy. At less than 1,000 gates in size, this BIST IP stores its results in a histogram placed in on-chip (or off-chip) memory. The measured results are then sent to the ATE via an IEEE 1149.1 interface.
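As a rough picture of what such a measurement block accumulates, the sketch below bins a stream of edge-to-edge delay samples into a small histogram, much as an on-chip core might before the ATE reads the counts back. The bin resolution and the simulated jitter are invented for illustration.

```python
import random

# Invented numbers: 5 ps bins across a 100 ps measurement window.
BIN_PS, WINDOW_PS = 5, 100
histogram = [0] * (WINDOW_PS // BIN_PS)

def record_delay(delay_ps):
    """Accumulate one edge-to-edge delay measurement into the histogram."""
    idx = min(int(delay_ps // BIN_PS), len(histogram) - 1)
    histogram[idx] += 1

# Simulated measurements: nominal 50 ps delay with a little jitter.
for _ in range(10_000):
    record_delay(random.gauss(50, 3))

# The ATE would read these counts back (e.g. over the 1149.1 port)
# and judge mean delay and jitter against the device specification.
print(histogram)
```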
Application-specific BIST IP provides self-test functionality for particular mixed-signal circuits. Unlike general-purpose BIST, application-specific BIST IP is typically customized by the engineering team to meet a specific design's requirements. As such, this approach results in BIST functions that are tightly coupled to the designer's circuit. For example, data-converter BIST IP lets designers integrate these test functions and tune the BIST to match the converter architecture, number of bits, voltage ranges and sample frequency.
As with any silicon-based solution, BIST also increases a design's area and its power consumption during test, even if only to a modest extent. In practice, BIST experts look to strike a reasonable balance between on-chip test capabilities and ATE capabilities. For example, instead of using separate BIST functions for several subcircuits, designers might use a single BIST engine and control its interaction with those subcircuits through one or more multiplexers - trading increased test time for reduced silicon real estate.
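That trade-off is easy to quantify. The sketch below compares dedicated BIST engines per subcircuit against a single engine shared through a multiplexer, using invented area and run-time figures purely to show the shape of the calculation.

```python
# Invented figures for illustration only.
subcircuits      = 4
bist_area_gates  = 5_000     # one BIST engine
mux_area_gates   = 300       # sharing multiplexer
run_time_ms      = 2.0       # BIST run time per subcircuit

# Dedicated engines: all subcircuits can run in parallel.
dedicated_area = subcircuits * bist_area_gates
dedicated_time = run_time_ms

# One shared engine: subcircuits are tested one after another.
shared_area = bist_area_gates + mux_area_gates
shared_time = subcircuits * run_time_ms

print(f"dedicated: {dedicated_area} gates, {dedicated_time} ms")
print(f"shared   : {shared_area} gates, {shared_time} ms")
```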
Distributed BIST strategies extend this concept by allowing designers to position test instrumentation on-chip or off-chip, selecting the location best suited to overall design goals. In this approach, designers can choose to implement BIST functions on the chip to maximize test speed and precision, or they can minimize silicon area at a slight cost in test time by placing these functions off-chip in field-programmable gate arrays (FPGAs) placed on the DUT load board. Although this off-chip method cannot achieve the same level of performance as pure on-chip BIST methods, it can help IC development teams meet increased test requirements with existing test equipment - a fundamental goal of any effective test methodology.
Test assembly and preparation
In creating an SoC design, designers combine dozens of blocks of pre-built IP cores delivered from different sources. Because each IP core often employs a different test strategy, engineers face a particularly difficult challenge in melding these diverse test requirements into a single, efficient test solution, a process sometimes referred to as test scheduling.
If these diverse test methods were simply connected end-to-end, the result would be long test times or worse. Besides a higher cost of test, such a test might prove impractical if it required multiple insertions or different test setups to accommodate different combinations of scan, BIST and functional test methods for both digital and analog/mixed-signal cores. Sophisticated design-to-test tools handle these potential problems, managing the diverse formats, translations and constraints needed to prepare test programs that lower test times while ensuring high test coverage.
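One simple way to picture test scheduling is as packing per-core tests into parallel sessions under a shared constraint such as peak test power. The greedy Python sketch below does exactly that with hypothetical core data (the names, times, power numbers and budget are invented), then compares the result against purely sequential execution.

```python
# Hypothetical per-core tests: (name, test time in ms, peak test power in mW).
CORE_TESTS = [
    ("cpu_scan",  120, 400),
    ("dsp_scan",   90, 350),
    ("sram_bist",  60, 150),
    ("adc_bist",   40, 100),
    ("usb_scan",   30, 200),
]
POWER_BUDGET_MW = 600   # assumed tester/package limit

def schedule(tests, budget):
    """Greedy: longest tests first, grouped into sessions that respect the power budget."""
    sessions = []
    for name, t, p in sorted(tests, key=lambda x: -x[1]):
        for s in sessions:
            if s["power"] + p <= budget:
                s["cores"].append(name)
                s["power"] += p
                s["time"] = max(s["time"], t)
                break
        else:
            sessions.append({"cores": [name], "power": p, "time": t})
    return sessions

plan = schedule(CORE_TESTS, POWER_BUDGET_MW)
parallel_time   = sum(s["time"] for s in plan)
sequential_time = sum(t for _, t, _ in CORE_TESTS)
for s in plan:
    print(s["cores"], f'{s["power"]} mW, {s["time"]} ms')
print(f"scheduled: {parallel_time} ms vs sequential: {sequential_time} ms")
```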
Despite a development team's best efforts, manufacturers sometimes do not discover problems with test programs until the beginning of production test - the worst possible time, because production resources have been committed and expectations for delivery have been set. However, engineers can verify test programs prior to silicon availability and without access to actual ATE systems. Test verification software provides precise models of a wide variety of ATE test execution environments, including complete models of the memory organization, sequencers, formatters, control and timing. Combined with models of the test fixture and DUT itself, this environment allows test engineers to analyze detailed timing using the actual test system source program code and vector sets.
After design and test assembly, development teams must port resulting test programs to specific test equipment. With traditional methods, this porting process has relied on a combination of manual efforts and automated conversion methods using scripts and system utilities to prepare test procedures for ATE. Currently, design-to-test tools automatically perform the necessary conversions and checks needed to transfer test programs reliably to production ATE environments.
Production testers
For traditional functional test, production ATE must support large numbers of high-speed input/output (IO) pins to drive and compare wide test patterns needed to confirm device function. Larger, faster ICs dictate the need for correspondingly larger pin-count test heads, higher performance instrumentation and more sophisticated test equipment to ensure high-throughput production test.
In contrast, a pure structural test methodology would in principle significantly simplify the requirements levied on production test equipment. While traditional ATE needs to continue to match device pin count, structural test systems can conceptually support substantially fewer pins - in the limit, just the handful needed to work with a device's test access ports.
Besides controlling primary IO ports, these testers need to test the DC parameters of primary IO pins that are not connected to the test access port. Structural testers do need to provide an extremely large data store for scan-based serial data - and will need to keep matching increasing design size with correspondingly larger data stores. Overall, however, structural test places considerably more constrained requirements on testers than functional test does. As a result, testers built solely to handle structural test can conceptually be much less expensive than conventional functional testers.
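A quick calculation shows why the data store, rather than pin count, dominates a structural tester's requirements; all of the figures below are invented for illustration.

```python
# Invented figures: a large scan design on a low-pin-count structural tester.
scan_flops  = 2_000_000          # scannable flip-flops
chains      = 16                 # scan-in/scan-out pin pairs on the tester
patterns    = 10_000             # ATPG patterns

chain_depth  = -(-scan_flops // chains)          # flops per chain (ceiling division)
bits_per_pin = chain_depth * patterns            # stimulus bits one scan-in pin must hold
total_gbits  = bits_per_pin * chains * 2 / 1e9   # stimulus plus expected responses

print(f"chain depth          : {chain_depth:,} bits")
print(f"vector depth per pin : {bits_per_pin/1e6:,.0f} Mbit")
print(f"total scan data      : {total_gbits:.0f} Gbit")
```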
The realities of design and production cloud the vision of pure structural testers (and contribute significantly to the current "period of chaos" described in Figure 1). Traditional test requirements remain a current reality and are likely to present continuing test challenges in the coming years. In particular, the growing prevalence of analog/mixed-signal cores in complex ICs demands increasingly complex mixed-signal test capabilities. According to industry reports, analog content represented 12 percent of circuits in 2002 and is expected to rise to 30 percent in 2003 and 75 percent by 2006.
With analog BIST still in its infancy, the need to address increased analog/mixed-signal circuitry on more complex ICs presents a significant barrier to adoption of dedicated, structural-only testers on the factory floor. On the other hand, traditional production ATE is capable of handling mixed-signal test but offers support for structural test as limited extensions to expensive traditional test instrumentation. Furthermore, these systems are generally ill-suited for dealing with the large amount of data associated with structural test scan chains. On these systems, large data volumes mean increased test time, which becomes very expensive in conventional "big iron" ATE.
A promising approach lies with the growing integration of structural test capabilities in flexible test system architectures. With these test systems, manufacturers can adapt the base platform with the specific test capabilities required for a mixed-signal IC that also employs scan for digital test. In this case, engineers can configure the test platform with the most appropriate combination of analog instrumentation and structural test capabilities.
Furthermore, engineers can use design and test software to maximize the efficiency of structural test methods on specific testers, feeding knowledge about the structural tests to these systems. With this software, engineers can debug silicon and diagnose faults more effectively, because scanned patterns are more than a sequence of bits: the software preserves knowledge of how each test vector relates to the portions of the device being tested. This greatly enhances engineers' ability to interpret the massive bit streams associated with structural test methods and achieve faster device debug and bring-up.
The flexible software architecture of these systems also allows test engineers to take advantage of the growing availability of third-party software solutions from EDA vendors. The ability to integrate important new software capabilities will become increasingly important in tightening the link from design to test, particularly in addressing evolving analog/mixed-signal test challenges.
Conclusions
In today's technological and economic climate, IC companies are looking for more effective means to control rampant test complexity and cost. Traditional IC design methods defer test considerations until late in development, when the cost to accommodate test problems in an IC can skyrocket. In turn, traditional functional test methods require increasing numbers of test patterns to confirm correct operation of increased functional capability in new ICs, resulting in longer test times and increased cost-of-test.
Structural test seeks to uncover manufacturing defects, confirming correct operation of a circuit's underlying gates and interconnect without regard to the circuit's overall function. At the foundation of structural test, DFT methods engage multiple stages of development - including design, test assembly and production test - to achieve a better overall solution to emerging test challenges. Because decisions made in separate phases of development can significantly impact test, a company's ability to smooth the transition between design, test and production engineering can deliver significant savings and foster a fundamental improvement in product quality.
In the face of rapidly rising device complexity, DFT software plays an increasingly vital role in achieving this smooth transition from design to production test. DFT addresses the objectives of reduced test time and test complexity by increasing design testability, simplifying test program development and easing transfer of test programs to ATE in production.
During design, DFT software helps engineers insert test structures such as scan and BIST elements that simplify test requirements and reduce eventual test times. During test program development and assembly, DFT software helps engineers verify those test programs in virtual test environments so that final test programs will function correctly on the factory floor. Finally, DFT software helps speed the conversion and transfer of test data and routines between test development and test systems - avoiding any last-minute snags in production schedules.
Today's test systems provide the kind of highly flexible test platforms needed to address the growing diversity of device test challenges. Using structural test capabilities such as specialized scan data memories, these systems can efficiently handle scan chains and BIST structures as manufacturers move to structural test methods. At the same time, these test platforms can be easily configured to accommodate varying combinations of structural test and functional test methods that may be employed for individual mixed-signal devices. Deployed through sophisticated DFT software and flexible test platforms, structural test helps IC companies achieve significant improvements in cost-of-test, time-to-market and time-to-volume for increasingly complex ICs and SoC devices.
Marc Loranger is a senior director of marketing for Credence Systems Corporation, located in Fremont, California. Credence is a provider of solutions for design-to-production test for the worldwide semiconductor industry.