New approach moves logic BIST into mainstream
By R. Kapur, R. Chandramouli and T. W. Williams, EEdesign
October 14, 2002 (5:56 p.m. EST)
URL: http://www.eetimes.com/story/OEG20021014S0081
Logic BIST (LBIST) technology has been in use for decades. However, it did not enter the mainstream until recently. This article explores the traditional LBIST "Stumps" architecture and examines a new approach, a deterministic LBIST technology, which uses the LBIST architecture as a decompression/compression machine.

Traditional LBIST architecture

The Stumps LBIST architecture shown in Figure 1 below includes an LBIST controller, a pseudo-random pattern generator/linear feedback shift register (PRPG/LFSR), a multiple input signature register (MISR), scan chains (called channels), and XOR circuitry (or other simple combinational circuitry) between the channels, the PRPG, and the MISR.

The Stumps LBIST architecture works on the principle that random patterns generated with a state machine, the LFSR, will expose faults in the circuit under test. Given a starting state (or seed pattern), the LFSR cycles through a predictable but randomized sequence of states. The logic stimulus derived from these states is shifted into the scan channels through a spreader network that is usually constructed of XORs. The values in the scan channels represent the stimulus portion of the test pattern.
The logic BIST architecture most widely used to apply patterns and observe responses on a chip is self-testing using MISR and parallel shift register sequence generator (Stumps). The basic mechanism uses a pseudo-random pattern generator (PRPG) to generate the inputs to the device's internal scan chains, initiates a functional cycle to capture the response of the device, and then compresses the captured response into a multiple input signature register (MISR). The compressed response that comes out of the MISR is called the signature. Any corruption in the output signature indicates a defect in the device.
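To make the PRPG concrete, here is a minimal sketch in Python of a Fibonacci LFSR producing a reproducible pseudo-random sequence from a seed. The 16-bit width, the tap positions, and the way each state would feed the scan channels are illustrative assumptions, not the parameters of any particular LBIST product.

```python
def lfsr_patterns(seed, count=4):
    """Yield `count` successive 16-bit LFSR states starting from `seed`.

    In a Stumps architecture, each state would be spread (for example,
    through an XOR network) across the parallel scan channels to form
    one shift cycle of pseudo-random stimulus.
    """
    state = seed & 0xFFFF
    for _ in range(count):
        yield state
        # Feedback taps at bits 16, 14, 13, 11
        # (polynomial x^16 + x^14 + x^13 + x^11 + 1).
        bit = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
        state = (state >> 1) | (bit << 15)

# The same seed always reproduces the same "random" sequence:
for pattern in lfsr_patterns(seed=0xACE1):
    print(f"{pattern:016b}")
```

The determinism is the essential property: the tester never has to store or transfer these patterns, because the on-chip LFSR regenerates them from the seed every time.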
Figure 1 - Stumps LBIST architecture
During one or more clock pulses, the response of the circuitry to the test pattern stimulus is captured in the scan channels. The captured response is then shifted from the scan channels into another state machine, the MISR. (Many such tests are applied, and the scan-in operation of a test pattern is overlapped with the scan-out operation of the previous test pattern.)
After the tests are applied, and provided the circuitry was free of defects, the MISR that was initialized to a known state can be expected (that is, predicted by simulation) to be at another known state. Any other state reached by the MISR at the end of the test indicates defects in the circuit under test.
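The signature computation can be sketched in a few lines as well. The following Python models a MISR as a Galois-style LFSR that folds one parallel response word into its state per shift cycle; the 16-bit width, the 0xB400 tap mask, and the example response values are all assumptions for illustration.

```python
def misr_compact(responses, init=0x0000):
    """Compact a stream of 16-bit scan-out response words into a signature.

    The register behaves like a (Galois) LFSR that XORs one parallel
    response word into its state every shift cycle.
    """
    state = init
    for word in responses:
        out = state & 1
        state >>= 1
        if out:
            state ^= 0xB400        # feedback taps (illustrative polynomial)
        state ^= word & 0xFFFF     # fold the parallel inputs into the state
    return state

good_responses = [0x1234, 0xBEEF, 0x0F0F]  # simulated fault-free scan-out
chip_responses = [0x1234, 0xBEEF, 0x0F0E]  # one bit flipped by a defect

golden = misr_compact(good_responses)      # signature predicted by simulation
actual = misr_compact(chip_responses)      # signature observed on the tester
print("defective" if actual != golden else "good")
```

Only the final signature, not the full response stream, needs to be compared against the simulated value, which is what makes the output side of the test so cheap.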
Improvements to Logic BIST
The early adoption of LBIST was deterred by its high area overhead, its inability to diagnose problems, and its inability to achieve high coverage. However, LBIST technologists continued to improve upon the technology and have brought it into the mainstream. Key areas of improvement are described below.
Aliasing reduction
Much research and effort has been spent on reducing the aliasing that occurs with LBIST. Data compaction inevitably introduces some aliasing: a failing device can, by chance, produce the same signature as a good device. When this happens, the erroneous response captured from a defect is masked as further responses are folded into the signature, and the defect escapes detection.
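For context, a commonly cited result for signature registers (background added here, not a figure from the article) is that when an n-bit MISR compacts a long stream of effectively random erroneous responses, the probability that the faulty signature aliases to the good one approaches

```latex
P_{\text{alias}} \approx 2^{-n}
```

so a 32-bit MISR would alias on only about one error stream in four billion. This is why modest increases in register length, together with careful feedback-polynomial selection, proved to be effective countermeasures.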
Fault coverage increase
The goal of LBIST is to produce higher quality results with fewer patterns. Solutions and methods that have increased LBIST's fault coverage with a reasonable number of test patterns include test points, cellular automata implementations of the PRPG, weighting the bits produced by the PRPG, polynomial switching methods, and reseeding methods. Of these methods, reseeding has shown the most promise.
Diagnostic improvement
The initial LBIST binary search algorithm has evolved into a much-improved directed-binary search. Diagnostic techniques were developed that converted the design into a regular scan design and performed diagnostics in a deterministic test environment.
Bringing logic BIST into the mainstream
In the past, no matter what improvements were made to the technology, the ability to automate the design for test (that is, to package the tests with the design) never materialized. The revolution that brought LBIST into the mainstream began in reaction to the phenomenal growth in the cost of test relative to the cost of silicon, and focused on the need for DFT methods to reduce the cost of test.
When the cost to test manufactured chips began to rival the cost to create them, the design community was awakened to the need for a low-cost test solution that would not impact the design. Contributing to this changing cost-to-design vs. cost-to-test equation were the following:
- Transistor size was reduced, as predicted by Moore's Law.
- Wafer sizes increased.
- Test costs had reached $2 million to $4 million.
Out of this reassessment, a new breed of LBIST began to surface as the most attractive alternative: an on-chip LBIST decompression/compression structure capable of applying deterministic test patterns.
Figure 2 - Deterministic Logic BIST architecture
Figure 2 shows the basic architecture for deterministic Logic BIST, which leverages the basic LBIST architecture to achieve several objectives:
- Support for a large number of parallel internal scan chains to reduce test application time.
- An encoding of scan test data in terms of external BIST seeds for substantial reduction in test data volume (see the sketch after this list).
- Significant reduction in the number of pins required to test the device.
- Implementation of the new logic BIST architecture within the design flow to enable easier adoption of the methodology and simultaneously minimize the impact to the designer and design flows.
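To illustrate the central idea behind seed encoding, the sketch below (in Python, with an assumed 16-bit LFSR as the decompressor; the parameters and the example test cube are hypothetical) shows why a short seed can reproduce the care bits of a much longer test pattern: every bit the decompressor shifts out is a GF(2) linear combination of the seed bits, so a seed is simply the solution of a small linear system built from the cube's specified positions.

```python
WIDTH = 16  # LFSR/seed length of the assumed on-chip decompressor

def lfsr_step(state):
    """One step of the illustrative 16-bit Fibonacci LFSR."""
    bit = ((state >> 0) ^ (state >> 2) ^ (state >> 3) ^ (state >> 5)) & 1
    return (state >> 1) | (bit << 15)

def output_equations(num_cycles):
    """Row c is a WIDTH-bit mask: which seed bits XOR into the stimulus
    bit of shift cycle c. Running the LFSR once per unit seed exposes
    the linear (GF(2)) dependence of every output bit on every seed bit.
    """
    rows = [0] * num_cycles
    for seed_bit in range(WIDTH):
        state = 1 << seed_bit
        for cycle in range(num_cycles):
            if state & 1:                    # the LSB feeds the scan chain
                rows[cycle] |= 1 << seed_bit
            state = lfsr_step(state)
    return rows

def solve_seed(care_bits):
    """Gaussian elimination over GF(2).

    `care_bits` maps shift-cycle index -> required value (0 or 1); the
    don't-care positions are simply absent, which is why a short seed
    can stand in for a much longer, mostly-unspecified test pattern.
    Returns a seed, or None if the cube is not encodable.
    """
    eqs = output_equations(max(care_bits) + 1)
    rows = [[eqs[c], v] for c, v in sorted(care_bits.items())]
    pivots = []                              # (pivot column, row index)
    for i, row in enumerate(rows):
        for col, j in pivots:                # reduce by earlier pivots
            if (row[0] >> col) & 1:
                row[0] ^= rows[j][0]
                row[1] ^= rows[j][1]
        if row[0] == 0:
            if row[1]:
                return None                  # inconsistent: not encodable
            continue
        pivots.append(((row[0] & -row[0]).bit_length() - 1, i))
    seed = 0                                 # back-substitute one solution
    for col, j in reversed(pivots):
        val = rows[j][1]
        for k in range(WIDTH):
            if k != col and (rows[j][0] >> k) & 1:
                val ^= (seed >> k) & 1
        if val:
            seed |= 1 << col
    return seed

# Hypothetical test cube: only three of the first ten shifted bits are
# specified; everything else is don't-care.
cube = {0: 1, 5: 0, 9: 1}
seed = solve_seed(cube)
print(f"seed = {seed:#06x}" if seed is not None else "not encodable")
```

Because ATPG cubes typically specify only a small fraction of their bits, the system almost always has many solutions, and the seed is far smaller than the pattern it expands into; this is the source of the data-volume reduction.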
Logic BIST tradeoffs and future issues
The new breed of LBIST offers decompression/compression techniques that provide an alternative for applying deterministic test patterns using reseeding technology. Compared with stored-test-pattern testing, this new breed of LBIST offers:
- Significant test-data-volume reduction over the traditional ATPG test patterns.
- Significant test-application-time reduction, because the number of scan chains is no longer limited by the number of inputs and outputs of the design. More scan chains translate into shorter scan chains, so less time is needed to shift test patterns in and out of the registers.
At the same time, several tradeoffs and open issues remain:
- Seeds represent test patterns. Because the PRPG can generate many tests from a single seed, determining the optimal balance between seeds and the test patterns generated from each seed will be an issue for future solutions.
- Although the number of scan channels in the LBIST architecture can be made arbitrarily large, the choice has a major impact on routing. More channels reduce test application time, but at the expense of the routability of the scan channels.
- Encoding test patterns as seeds limits the profile of test patterns that can be successfully represented by a seed. This limit is related to the length of the LFSR used in the LBIST architecture; thus, area overhead can be traded off against the ability to obtain seeds for patterns.
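A commonly cited rule of thumb from the reseeding literature (added here as background, not a figure from the article) quantifies this limit: a test cube with s specified (care) bits fails to map onto a seed of an L-bit LFSR with probability of roughly

```latex
P_{\text{no seed}} \approx 2^{-(L-s)}
```

so sizing the LFSR about 20 bits longer than the largest expected care-bit count makes encoding failures rare, on the order of one in a million. This is the concrete form of the area-versus-encodability tradeoff described above.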
LBIST technology is ready for prime time
LBIST technology has been in use for decades. However, it did not become mainstream until recently due to a number of fundamental issues:
- Most LBIST tools are point tools that are not integrated with a synthesis solution, leading to a broken design flow.
- Typical LBIST methodology does not yield predictable fault coverage, owing to the random nature of the patterns, and can lower the test quality of the product.
- Lower test coverage forces the addition of test points and augmentation with ATPG vectors, a combination that can lead to multiple design iterations for test-point insertion and potentially higher test time.
Rohit Kapur is a principal engineer of Synopsys Test Automation Products division, with research interests in VLSI test. He is chair of the standardization activity (IEEE 1450.6) to create a core test language (CTL).
Mouli Chandramouli is a product line manager for Synopsys Test Automation Products. He has more than 15 years of experience in multiple disciplines of test across many companies (Sun, Intel, Mentor Graphics, LogicVision) in the electronics industry. Mr. Chandramouli is a Senior Member of IEEE and Chairman of the Test Development Working Group in VSIA.
Thomas W. Williams is a chief scientist at Synopsys. His research interests are in VLSI test. He is a Fellow of the IEEE and a member of the IEEE Computer Society and the ACM.