Linking synthesis with DFT key for network switch ICs
By Pradeep Fernandes, Vice President, Product Engineering, Get2Chip, Inc., San Jose, Calif., Ron Press, Technical Marketing Manager, Mentor Graphics, Inc., Beaverton, Oregon, EE Times
March 4, 2003 (1:56 p.m. EST)
URL: http://www.eetimes.com/story/OEG20030228S0049
As the challenges of network chip design reach new heights, the question of integrating synthesis and design-for-test (DFT) in the development of high-density networking devices merits reassessment.
The challenges that prompt this reassessment come from many directions: product design requirements, semiconductor process technology, and design tools, for instance. The present development wave in networking has a heavy emphasis on features; Voice over Internet Protocol (VoIP) and security are two areas beyond basic quality-of-service functions that are drawing heavy investment. Meanwhile, a new wave of performance-driven products will begin to enter the design stage, so economics dictate that more transistors must be integrated at lower cost.
The special demands these networking chips pose with regard to synthesis and DFT are: high transistor count (complexity), high frequency of operation, high memory content, and the innate connectivity intensiveness of networking designs. Many of these parameters are enabled by new, smaller fabrication process technologies, which pose a major change to test requirements.
Networking designs by their very nature are connectivity intensive. Packets come into the chip, where they are examined, disassembled, analyzed, and prioritized, their payloads routed, and the packets reassembled. Every packet of data has bits that traverse a majority of the typical networking chip's functionality, which implies a heavily interconnected topology of functional elements. For this reason, networking designs pose special, but somewhat predictable, problems for design implementation tools.
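To make that traversal concrete, consider a toy software model of a switch datapath (purely illustrative; the stage names and fields below are hypothetical, not any product's architecture). Every packet's bits pass through every stage, which is exactly why the silicon topology ends up so heavily interconnected:

```python
from dataclasses import dataclass

@dataclass
class Packet:
    header: dict
    payload: bytes
    priority: int = 0
    egress_port: int = -1

def parse(pkt: Packet) -> Packet:
    # Header examination / disassembly stage.
    pkt.header["parsed"] = True
    return pkt

def classify(pkt: Packet) -> Packet:
    # Analysis / prioritization stage (DSCP 46 is expedited forwarding).
    pkt.priority = 7 if pkt.header.get("dscp", 0) >= 46 else 0
    return pkt

def route(pkt: Packet, table: dict) -> Packet:
    # Payload routing stage: look up the egress port.
    pkt.egress_port = table.get(pkt.header.get("dst"), 0)
    return pkt

def pipeline(pkt: Packet, table: dict) -> Packet:
    # Each packet traverses all stages in order.
    return route(classify(parse(pkt)), table)

pkt = pipeline(Packet({"dst": "10.0.0.2", "dscp": 46}, b"data"), {"10.0.0.2": 3})
print(pkt.priority, pkt.egress_port)  # 7 3
```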
Routing congestion is a consequence of this functional interconnectivity. Such designs are difficult for place-and-route tools to handle, and the end result is larger die sizes. Another factor, arising from the wide datapaths found in many networking designs, is the increased "opportunity" for noise and signal integrity problems.
Higher wire content and parallel signal paths create problem-prone topologies that must be mitigated with avoidance structures (for instance, shielding and placement adjustment), at a still further loss of device density. Therefore, it is essential to avoid adding logic beyond the primary function of these devices, to keep from further complicating these design issues.
Test requirements are changing dramatically at 0.13-micron and smaller fabrication processes as a result of a significant increase in timing-related defects. Scan-based at-speed test must be performed to maintain quality levels. Such intense test requirements mean looking at new test methodologies, including design-for-test (DFT) techniques, and incorporating hardware in the devices themselves to speed up the test process.
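A back-of-the-envelope model shows why at-speed patterns matter (hypothetical numbers; a sketch, not a timing model). A small delay defect only misbehaves when the transition is launched and captured one functional clock period apart:

```python
def captured_value(old: int, new: int,
                   path_delay_ns: float, clock_period_ns: float) -> int:
    # If the combinational path settles within one clock period, the capture
    # flop sees the new value; otherwise it still holds the old one.
    return new if path_delay_ns <= clock_period_ns else old

# A path that takes 12 ns on a 10 ns (100 MHz) clock:
print(captured_value(0, 1, path_delay_ns=12, clock_period_ns=10))   # 0 -> defect detected at speed
print(captured_value(0, 1, path_delay_ns=12, clock_period_ns=100))  # 1 -> defect escapes a slow-clock test
```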
Design, test merge
With regard to the relationship between synthesis and DFT, the required epiphany is that DFT solutions actually represent a full-fledged function of the chip. For instance, a built-in self-test (BIST) capability for a cache is a function just as much as a packet CRC (cyclic redundancy check) operation. As such, DFT functions become part of the design process, not part of an adjunct DFT process.
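To underline how ordinary such "test" functions are as logic, here is the packet CRC in plain software form (a sketch for illustration; the hardware equivalent is a shift-and-XOR structure of the same flavor as a BIST signature register):

```python
import zlib

def crc32_bitwise(data: bytes) -> int:
    # Bit-serial CRC-32 (reflected form, polynomial 0xEDB88320): the same
    # arithmetic a hardware CRC block performs one bit at a time.
    crc = 0xFFFFFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ (0xEDB88320 if crc & 1 else 0)
    return crc ^ 0xFFFFFFFF

payload = b"example packet payload"
assert crc32_bitwise(payload) == zlib.crc32(payload)  # matches the standard CRC-32
```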
In this sense, design and test have merged. The implications of this merger represent a conceptual simplification. In this functional view of DFT, functional design methodology is not altered or disturbed. Rather, it proceeds in a more pure form without the intrusiveness of yet another outside influence that needs to be coordinated.
The new burden to bear is the added work of incorporating the DFT functions into the functional design. Fortunately, the functions themselves are well studied and well documented, and often a majority of the DFT logic can be implemented in a non-intrusive manner.
The work in the design process is to decide how to activate the functions during operation, and then to create the corresponding control structures such that they do not interfere with other required functions.
BIST is widely accepted as the best solution for on-chip memory testing. Board-level test uses standard JTAG structures for individual chip isolation and testing. Custom logic is still most often tested using scan-based test techniques; this is where there is a critical link between DFT as a test activity and synthesis.
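For a flavor of what a memory BIST controller actually does, the sketch below runs a simplified March C- sequence against a memory modeled as a Python list; a real BIST controller is a small on-chip state machine performing the same reads and writes:

```python
def march_c_minus(size, read, write):
    """March C-: ^(w0); ^(r0,w1); ^(r1,w0); v(r0,w1); v(r1,w0); ^(r0).
    Returns a list of (address, expected, got) miscompares."""
    failures = []
    def check(addr, expected):
        got = read(addr)
        if got != expected:
            failures.append((addr, expected, got))
    up, down = range(size), range(size - 1, -1, -1)
    for a in up:   write(a, 0)               # initialize all cells to 0
    for a in up:   check(a, 0); write(a, 1)  # ascending: read 0, write 1
    for a in up:   check(a, 1); write(a, 0)  # ascending: read 1, write 0
    for a in down: check(a, 0); write(a, 1)  # descending: read 0, write 1
    for a in down: check(a, 1); write(a, 0)  # descending: read 1, write 0
    for a in up:   check(a, 0)               # final read pass
    return failures

mem = [0] * 16
print(march_c_minus(16, mem.__getitem__, mem.__setitem__))  # [] when fault-free
```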
While design and DFT merge on a functional level, synthesis and DFT are (with apologies to the topology purists out there) reconverging divergent paths. Challenges from 90-nanometer process technology are leading design tools down a path of evolution characterized by more detailed analysis and more time-consuming solution algorithms.
Tool technologies and algorithms are diverging to best serve their respective domains. It has not been practical to merge synthesis and test algorithms, nor has it ever been shown to be beneficial except in the most superficial of ways. So, while a single push-button methodology appears attractive, the best designs will still come from the best engineers using the best tools at every stage in the process.
To understand the difficulty of merging two complex tool problem domains, consider placement and routing. While the merits of completely intertwined placement and routing have been known for more than 20 years, in practice there have been only small, incremental migrations of routing prediction into placement and of placement adjustment into routing.
The reason? The number of objects keeps growing at such a rapid pace and the detail required for accurate analysis is so great that it is simply not practical in terms of memory and run times to contemplate algorithms that perform truly simultaneous placement and routing optimization.
In fact, a recent paper out of UCLA showed that the best-known algorithms for placement were from 146% to 238% worse than optimal. This shows that, even in isolation, a single design implementation step is extremely complicated and still holds much room for improvement. Intertwining other, similarly difficult, problems holds no promise for getting better results.
Still, synthesis and test are reconverging in a new way. Leading technology tools are moving toward a smooth flow between synthesis and DFT but need to avoid forcing major synthesis changes to accommodate DFT. Synthesis makes key contributions to design speed, area and readiness for closure. Design-for-test adds the testability to enable DFT tools to focus on pattern generation with minimal interference to the functional design. The end result is effective tests that make sure that defective parts do not slip through the initial screening process after manufacturing.
Emerging 90-nanometer, 100-million-gate designs will create new demands for synthesis and DFT. For synthesis, creating a global logical structure that meets timing and area constraints as well as the demands of design closure is critical. For DFT, screening new failure mechanisms demands additional pattern generation strategies and complications, resulting in a huge increase in the number of patterns required. To keep rising test costs in check, on-chip pattern compression is a hard requirement: it allows increased testing, but any compression technique needs to be non-intrusive.
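The arithmetic behind that requirement is straightforward. In the hypothetical sketch below (illustrative numbers; the XOR spreader is a toy stand-in, not any vendor's decompressor), a few tester channels feed many short internal chains, cutting shift cycles per pattern by roughly the chain-count ratio:

```python
# Hypothetical figures: one million scan flops, 8 tester channels,
# 256 short internal scan chains behind an on-chip decompressor.
external_channels, internal_chains, scan_flops = 8, 256, 1_000_000

def expand(channel_bits, num_chains):
    # Toy XOR spreader: derive one input bit per internal chain from the
    # few external channel bits (stand-in for real decompressor logic).
    n = len(channel_bits)
    return [channel_bits[i % n] ^ channel_bits[(i // n) % n]
            for i in range(num_chains)]

chain_inputs = expand([1, 0, 1, 1, 0, 0, 1, 0], internal_chains)
print(len(chain_inputs), "chain inputs from", external_channels, "channels")

cells_per_chain = scan_flops // internal_chains        # 3,906 cells per chain
uncompressed_cycles = scan_flops // external_channels  # 125,000 shift cycles
compressed_cycles = cells_per_chain                    # chains shift in parallel
print(f"{uncompressed_cycles:,} -> {compressed_cycles:,} shift cycles per pattern "
      f"(~{uncompressed_cycles // compressed_cycles}x reduction)")
```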
The logical link
Logical boundaries of functionality have to be drawn for a successful 100-million gate design flow. Espousing the complete integration of major algorithmically complex problems such as placement, routing, synthesis and design-for-test makes for interesting marketing foils and disastrous design projects.
Adding more dimensions to an optimization space that is already bursting at the seams of complexity will only ensure that none of the optimization axes are optimal. Test needs to be non-intrusive on design, and design needs to not interfere with test. Quite simply, synthesis needs to do design and DFT needs to do test. Schemes that alter the functional design process and the design itself are headed down the wrong road. Likewise, DFT schemes that insufficiently test the device can create a quality disaster.
Fortunately, the integration points of synthesis and test are well known and very manageable. Definitions of clocks, resets, and initialization sequences are known, derivable elements that synthesis tools can readily generate in file formats consumable by DFT tools, as in the sketch below. The insertion of scan-capable register elements is an operation that can be fulfilled in either domain.
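Here is what that handoff can look like (the file format below is generic and hypothetical; production flows use each DFT tool's own test procedure syntax):

```python
# Clock, reset, and initialization data the synthesis flow already knows,
# serialized for a downstream DFT/ATPG tool. Format is illustrative only.
clocks = [{"name": "clk", "period_ns": 10.0, "off_state": 0},
          {"name": "scan_clk", "period_ns": 40.0, "off_state": 0}]
resets = [{"name": "rst_n", "active_level": 0}]
init_sequence = ["force rst_n 0", "pulse scan_clk", "force rst_n 1"]

with open("dft_setup.proc", "w") as f:
    for c in clocks:
        f.write(f"clock {c['name']} period {c['period_ns']}ns off {c['off_state']}\n")
    for r in resets:
        f.write(f"reset {r['name']} active {r['active_level']}\n")
    f.write("test_setup\n")
    for step in init_sequence:
        f.write(f"  {step}\n")
    f.write("end\n")
```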
Synthesis needs the speed and area characterization of the scan cells, and DFT needs to understand the type of scan clocking and control for the cells. Embedded compression techniques exist that only add logic around the scan chains and avoid interfering with the functional design. The bulk of the DFT work can be performed during pattern generation instead of adding test logic, area, routing, and performance problems to functional circuitry.