The history and future of scan design
Kirk Brisacher, Rohit Kapur, and Steve Smith (09/19/2005 9:00 AM EDT) URL: http://www.eetimes.com/showArticle.jhtml?articleID=170701653
For more than four decades, scan technology has somehow eluded the radar screen of the IC test industry. As test continues to evolve and make significant, newsworthy changes, scan has maintained a relatively quiet, albeit restless, existence. Recently, however, scan has been making some long-overdue noise. With the emergence of several promising new techniques, such as scan compression, scan technology is suddenly difficult to ignore. These improvements lead some to believe that scan could be poised to become the next focal point of IC test. To understand why scan technology remained in the background of IC test for so many years, and to appreciate its recent advances, you need to peek under the radar screen and examine its genesis, evolution, and possible future.

The genesis of scan design

Since the inception of IC design in the mid-1960s, IC test has been an integral part of the manufacturing process. Initially, tests were either randomly generated or created from verification suites. But as chips grew larger, this process required a more targeted approach, one that could be easily replicated from one design to the next. This led to the invention of scan, which made designs appear combinational for test purposes and simplified test generation. With scan, the state elements of a design are daisy-chained to provide stimulus and observe points internal to the design. Figure 1 shows how scan chains are implemented using multiplexers to create a shift register out of the existing state elements in the design.
Figure 1 — Scan chains implemented in a design for test.
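To make the structure concrete, here is a minimal behavioral sketch of a mux-based scan cell and chain, written in Python. The class names, the four-cell chain, and the sample bit patterns are illustrative assumptions, not details from the article or from any particular library; a real scan cell is simply a library flip-flop with a scan-enable multiplexer in front of its data input.

# Illustrative model only: a D flip-flop whose input mux selects between
# functional data (d) and the previous cell's output (scan_in).
class ScanFlipFlop:
    def __init__(self):
        self.q = 0  # current state

    def clock(self, d, scan_in, scan_enable):
        # On a clock edge, either capture functional data or shift scan data.
        self.q = scan_in if scan_enable else d
        return self.q

# Scan cells daisy-chained into a shift register for test access.
class ScanChain:
    def __init__(self, length):
        self.cells = [ScanFlipFlop() for _ in range(length)]

    def shift_one(self, scan_in_bit):
        # One shift clock: every cell takes its predecessor's old value.
        old = [cell.q for cell in self.cells]
        scan_out_bit = old[-1]
        self.cells[0].clock(d=0, scan_in=scan_in_bit, scan_enable=1)
        for i in range(1, len(self.cells)):
            self.cells[i].clock(d=0, scan_in=old[i - 1], scan_enable=1)
        return scan_out_bit

    def shift(self, bits):
        # Shift a full pattern in; the previous contents appear at scan-out.
        return [self.shift_one(b) for b in bits]

    def capture(self, logic_values):
        # One functional clock: each cell captures its combinational response.
        for cell, value in zip(self.cells, logic_values):
            cell.clock(d=value, scan_in=0, scan_enable=0)

# Hypothetical four-cell example: load a stimulus, capture a response, unload it.
chain = ScanChain(length=4)
chain.shift([1, 1, 0, 1])              # shift stimulus into the chain
chain.capture([0, 1, 1, 0])            # pulse the clock to capture the response
print(chain.shift([0, 0, 0, 0]))       # shift the captured response back out

This load-capture-unload sequence, spelled out step by step below, is what a tester repeats for every scan pattern.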
For the purposes of test, designs were converted into combinational circuitry. While various flavors of scan state elements were adopted, the fundamental underlying technology and advantages of scan remained the same: scan made IC test practical and automatable. A scan test pattern was usually applied in the following manner:
1. Set up the scan chain configuration.
2. Shift values into the active scan chains.
3. Exit the scan configuration.
4. Apply stimulus to the inputs and measure the outputs.
5. Pulse clocks to capture the test circuit response in the flip-flops.
6. Set up the scan chain configuration.
7. Shift values out of the active scan chains.
8. Exit the scan configuration.

The benefits of scan are clearly apparent: it is easy to understand, implement, and use. Multi-dimensional simplicity is its most important feature.

Area and timing impact prevents acceptance

Given scan's inherent simplicity, some industry insiders expected it would eventually be implemented in every design.
Wrong.
Scan's adoption curve was not steep. It had one big problem: it added extra logic, which was considered overhead. Because multiplexers were inserted into a design's functional paths to allow for the scan configuration, they added gate delay and thus timing overhead. This glaring deficiency was compounded by the fact that test played second fiddle in the design orchestra: the needs of test carried less weight than the negative impact test had on the design. Scan R&D efforts eventually matured, but the technology never flourished. Effort at the time focused on mitigating the negative impacts that scan introduced to the design. For naysayers of scan, functional test methods were a popular alternative.

The quest for low-impact scan techniques led to test solutions with increasingly complex flows. One technique, partial scan, scanned the fewest possible flip-flops in a design and left the inconsequential ones out of the chains. This approach required relatively few stimulus and observe points compared to scanning every flip-flop in a design. Although test quality was uncompromised, partial scan techniques gained only limited acceptance. Other variants also developed: internal scan, boundary scan, and built-in self-test. Each approach utilized scan chains in a different manner, and each had its advantages. But because of their complexity, none became widely popular.

Synthesis adopts scan

Scan could not gain acceptance until the test industry figured out how to make it transparent to designers. A turning point finally came when synthesis flows integrated scan technology. By blending into the design flow, scan could be implemented easily, provided certain conditions were met.
Today, some design libraries don't even have an option for non-scan flip-flops. Along with its impact on traditional pass-fail testing, scan continues to be recognized for its benefits in debug and diagnostics.

Scan enters the nanometer era

With the emergence of nanometer technology in the mid-to-late 1990s, scan faced yet another series of challenges. As nanometer-level designs became more complex, the number of flip-flops that required scanning grew at a much faster pace than the available scan inputs and scan outputs. This widening gap quickly stretched scan to its limits. Figure 2 shows how the number of flip-flops increases with design complexity.
Figure 2 — Trends in flip-flop count with design size.
Every scanned flip-flop adds a stimulus and an observe point, so test data volume (TDV) grows with the flip-flop count. Longer scan chains also made the shift operations an issue, since each load or unload takes as many clock periods as the longest scan chain; as a result, the test application time (TAT) required to operate the scan chains increased.

Nanometer technology also created fault coverage issues. Larger designs meant more faults to cover, which meant more test patterns. A moderate increase in test pattern count was tolerable. The real problem was this: existing fault models could not meet nanometer quality requirements, and any enhancement to the fault models would inevitably create an overwhelming number of test patterns. With so many obstacles to overcome, was scan yet again in danger of becoming obsolete?

As it turns out, nanometer technology was a blessing in disguise for scan. Although it created an unwanted increase in tester memory requirements and test cost, it also brought along some unexpected favors.
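To appreciate why those favors mattered, consider the scale of the problem. The back-of-the-envelope calculation below uses purely hypothetical figures (one million scan cells, 16 scan ports, 10,000 patterns), chosen only to show how chain length drives both TAT and TDV:

# Hypothetical numbers chosen only to illustrate the scaling.
scan_cells  = 1_000_000   # flip-flops on scan chains
scan_ports  = 16          # scan-in/scan-out pin pairs at the package
patterns    = 10_000      # ATPG test patterns
capture_cyc = 1           # capture clocks per pattern

chain_length = scan_cells // scan_ports  # cells in the longest chain

# Unloading one pattern overlaps with loading the next, so the shift cost is
# roughly (patterns + 1) chain-loads plus the capture cycles.
tat_cycles = (patterns + 1) * chain_length + patterns * capture_cyc
tdv_bits   = patterns * scan_cells * 2   # stimulus in plus response out

print(f"chain length: {chain_length} cells")
print(f"test time:    {tat_cycles:,} tester cycles")
print(f"data volume:  {tdv_bits:,} bits of tester memory")

Even with these modest assumptions the result is hundreds of millions of tester cycles and billions of bits of tester memory per device, which is exactly the pressure on tester memory and test cost described above.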
Scan compression borrows an insight exploited by partial scan methods: detecting a fault requires only a small percentage of the stimulus and measure points at the inputs and outputs. Previously, the typical practice was to fill all remaining stimulus points of the pattern (the logic X's) with random values. Scan compression, however, introduces combinational logic in the scan path at scan-in and scan-out. TDV and TAT benefits are achieved by converting data from a small scan interface at the design boundary to a wide scan interface internal to the design. This allows for many more scan chains in the design than its signal interface could otherwise handle. Scan compression, and other recently introduced techniques, have swept in a new wave of optimism for scan enthusiasts. Some think that widespread adoption of these approaches could change the face of the test industry.

Next generation scan technology

Figure 3 shows the basic architecture of a modification to existing scan solutions. This approach uses combinational elements that integrate easily with traditional scan solutions. The use of combinational circuitry in the scan path ensures smooth implementation. Sequential elements for scan compression are eliminated, which avoids problems with scan chain extraction, clocking, and integration with existing scan methodologies.
Figure 3 — Adaptive scan architecture
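The data path of Figure 3 can be sketched behaviorally. The sketch below is a simplified illustration under assumed parameters (four scan-in pins, four scan-out pins, 32 internal chains, a round-robin XOR grouping, and an arbitrary multiplexer setting); it is not the actual adaptive scan implementation, only the shape of the idea described in the next paragraphs.

from functools import reduce

SCAN_IN_PINS    = 4    # narrow interface at the design boundary (assumed)
SCAN_OUT_PINS   = 4
INTERNAL_CHAINS = 32   # many short chains inside the design (assumed)

def broadcast_inputs(pin_values, selects):
    # Mux stage at the chain heads: selects[i] names the scan-in pin that feeds
    # internal chain i, so several chains may receive the same scan data.
    return [pin_values[selects[i]] for i in range(INTERNAL_CHAINS)]

def compact_outputs(chain_out_bits):
    # XOR stage at the chain tails: each scan-out pin observes the XOR of a
    # group of internal chains (a simple round-robin grouping for illustration).
    groups = [[] for _ in range(SCAN_OUT_PINS)]
    for i, bit in enumerate(chain_out_bits):
        groups[i % SCAN_OUT_PINS].append(bit)
    return [reduce(lambda a, b: a ^ b, group, 0) for group in groups]

# One shift cycle: 4 pin bits fan out to 32 chain inputs, and 32 chain outputs
# fold back onto 4 pins. (The chain outputs here are stand-ins; in a real
# design they would emerge from the internal chains after shifting.)
selects    = [i % SCAN_IN_PINS for i in range(INTERNAL_CHAINS)]  # one mux setting
chain_bits = broadcast_inputs([1, 0, 1, 1], selects)
print(compact_outputs(chain_bits))

With 32 internal chains in place of the four that the pins could otherwise support, each chain is roughly one-eighth as long, so shift time and scan data shrink by about the same factor, offset only by whatever increase in pattern count the shared scan data causes.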
Multiplexers placed at the front of the scan chains enable adaptive connections between the scan inputs and the scan chains. With this methodology, one setting of the multiplexer control signals directs a scan chain to connect to a particular scan input; in another setting, the same scan chain connects to a different scan input. When multiple scan chains connect to the same scan input, they receive the same scan data. The adaptive scan architecture allows the multiplexer controls to be changed several times within the same test, providing access to a large number of configurations for ATPG, so the required test pattern set can still be applied. The XOR circuitry at the output allows multiple internal scan chains of the design to be connected to fewer scan outputs. The output circuitry has redundant connections for X-tolerance and ensures better diagnosis of the test data. With just a few combinational gates, this compression architecture makes a large number of internal scan chains accessible through a much smaller scan interface, with a direct, positive impact on test data volume and test application time.

Where does scan go from here?

Over the years, scan has quietly adapted to the ever-changing needs of design and manufacturing. Once an afterthought, it has evolved into an efficient and sometimes indispensable test methodology. With promising new technology developments, scan appears poised for even greater advances and usefulness within IC test. The 21st century marks the beginning of a new era for scan. It remains to be seen what the next driver of change will be. Could it be power issues, debug, or design for manufacturability?

Kirk Brisacher is Director of Corporate Applications Engineering for Test Automation at Synopsys. He has over 25 years of experience in IC design and management.

Rohit Kapur is a Synopsys Scientist who guides the development of Synopsys design-for-test (DFT) solutions based on Core Test Language (CTL) and other open standards. He is chair of the IEEE P1450.6 (Core Test Language) standard committee and was named an IEEE Fellow in January 2003 for his outstanding contributions to the field of IC test technology.

Steve Smith is a Senior Technical Writer in the Test Automation group at Synopsys. He has more than 20 years of professional writing experience and has been a technical writer in EDA since 1995.