Single tool serves IC verification best
By Joseph Sawicki, EEdesign
April 2, 2002 (2:33 p.m. EST)
URL: http://www.eetimes.com/story/OEG20020402S0027
Semiconductor companies have traditionally supported design flows that incorporate a wide range of EDA tools from many different vendors. By choosing best-in-class tools for specific sections of the design flow, CAD teams can create an environment that allows designers to complete projects on schedule. However, because physical verification is the common thread throughout a design, from layout to silicon, employing multiple physical verification tools can create discontinuities that result in errors, delayed tape-out, manufacturing problems, and missed time-to-market windows.

Companies have traditionally supported a dual IC physical verification tool flow -- separate tools for interactive (cell/block) and batch (large block/full chip) verification. Different tools are chosen for each flow based on the type of design component and the way in which users employ the tools. SoC designs require design rule check (DRC) and layout vs. schematic (LVS) physical verification at both the interactive and batch phases of the design.

Interactive users perform verification on smaller cells and blocks at the beginning of the design process, or when creating standard cell libraries or blocks. Designers working at this phase require constant interaction with the layout tool: they launch the verification run, and debug the results, from within the layout environment, with no additional setup requirements. It's a highly interactive cycle of layout, verification, and debugging; leaving the layout environment for any reason disrupts the process.

In contrast, the batch tool is typically employed at the full-chip integration level, when the size of the design is beyond the capacity of the interactive verification tool, or when accuracy is of primary importance. Batch users require more comprehensive commands and complex processing modules than their interactive counterparts. To achieve top performance, the batch tool takes advantage of advanced verification techniques such as hierarchical checking and multi-processing runs to shorten overall verification time. Because of the large number of polygons being processed, verification runs may take hours to complete, so batch users will often start a run, perform other tasks until the run finishes, then return to begin the debugging phase. The batch tool is also used as the designated sign-off tool, validating the chip for tape-out and delivery to the fab or foundry for manufacturing.

Dual verification environments that use separate verification tools require separate rule files. Discontinuity between these separate rule files can have serious ramifications when cells and blocks are integrated at full chip; without continuity, manufacturing problems can result.

Two tools create problems

Having two tools capable of performing similar, finely tuned tasks within their respective flows sounds positive on the surface -- but on the contrary, it creates an environment prone to problems. Fine-tuning means that each tool must be recalibrated to produce the same results every time either verification flow is modified, which takes valuable time and resources and can delay implementation of necessary flow updates.

In addition, a dual flow model requires that the interactive and batch physical verification tools, and their respective rule files, be maintained separately. This separation creates discrepancies between the tools, their rule files, and their verification results. For instance, finding a cell design error during a batch verification run that was missed during the interactive run brings the confidence of the entire physical verification flow into question. When an "errors found, errors missed" discrepancy occurs, a designer must find out why the error was missed and what course of action to take before proceeding with verification. Just fixing the error could adversely affect a part of the design created by a different design team.
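To make the "errors found, errors missed" scenario concrete, here is a minimal Python sketch. The rule names, limit values, and geometry are hypothetical illustrations -- not any foundry's numbers or any vendor's rule-file syntax. The two decks are supposed to implement the same specification, but the interactive deck has drifted out of sync with the golden batch deck, so a cell that verifies clean interactively is flagged at sign-off:

    # Hypothetical rule decks: each maps a check name to a minimum
    # metal1 width in microns. Both are meant to implement the same
    # foundry spec, but the interactive deck holds a stale value.
    interactive_deck = {"M1.WIDTH": 0.22}   # out-of-sync copy
    batch_deck       = {"M1.WIDTH": 0.25}   # golden sign-off deck

    # A "layout" reduced to rectangles: (name, width_um, height_um).
    cell_shapes = [("m1_strap", 0.30, 5.0), ("m1_stub", 0.24, 1.2)]

    def run_min_width(shapes, deck):
        """Return names of shapes that violate the deck's minimum width."""
        min_w = deck["M1.WIDTH"]
        return [name for name, w, h in shapes if min(w, h) < min_w]

    # Interactive run during cell design: reports the cell clean.
    print("interactive:", run_min_width(cell_shapes, interactive_deck))  # []
    # Batch sign-off run at full chip: flags an error in a "clean" cell.
    print("batch:", run_min_width(cell_shapes, batch_deck))  # ['m1_stub']

The cell was accepted as clean at the interactive stage, so the late violation raises exactly the ownership and root-cause questions discussed next.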
Tracking down these discrepancies often involves multiple designers and CAD engineers. Together, they must determine which tool is correct in its error reporting, who should "own" or fix the error, why there was a discrepancy between the tools, and how to eliminate these discrepancies in future designs.

Discrepancy errors suggest one of two situations: either the interactive rule file is not synchronized with the foundry-standard (or golden) batch rule file, or the interactive tool cannot be coded for many of the complex checks required by current deep-submicron processes. Once the discrepancy is understood and the error is "owned," designers must fix the layout and CAD engineers must update the verification flow. However, if the error is contained in library cells or an IP block, designers may not be free to make corrections or changes: revision control makes updating difficult, and purchased IP may not be guaranteed to function correctly if modified. Solving these issues not only uses up valuable CAD resources, but also contributes to delays in the design and tape-out schedule.

Understanding why discrepancy errors occur is key to preventing them in the future. Each verification tool has its own processing engine and rule file syntax. Processing engines may work in similar fashion but differ greatly in performance and capability. Interactive verification tools, which are tuned more for speed, may not include commands that enable them to perform the complex checks of a batch verification tool. Many times, certain rules simply cannot be coded for interactive tools at all; this limitation leads to component integration problems when the batch tool is employed, and it is the main reason that batch runs find errors in cells and blocks that were verified clean by the interactive tool.

Rule files written for both the interactive and batch flows are coded to check for errors based on a rule file specification defined by the foundry or fab. These design rule specs are created to ensure manufacturability and the highest possible yield. Within semiconductor companies, rule files are the implementation of those specifications; depending on the nature of a given design style, rules may be added to further enhance yield and performance. Creating and maintaining these rule files can be difficult and time-consuming. In a dual flow environment, CAD engineers must duplicate their efforts to support two different rule files, making sure both are written and synchronized to meet the rule specification.

Supporting a dual verification flow requires dedicated resources. These CAD teams are responsible for flow creation, qualification, implementation, and day-to-day support. A two-tool verification flow can take many man-months to create and implement -- much longer than a single-tool flow. And, upon completion, additional CAD resources will be needed to respond to flow discrepancy issues.

The power of one

Using a single, powerful physical verification tool for both interactive and batch verification benefits both usage models. Designers performing interactive verification can easily invoke DRC and LVS, select specific checks, verify portions of a design, and graphically debug results without leaving the layout environment.

By providing a simple graphical user interface to the layout tool, the single tool lets interactive designers quickly perform iterative verification tasks while maintaining their unique and productive usage model. Extending full-chip verification performance and capability to the cell/block phase gives interactive users confidence that their cells and blocks are properly designed and will not cause errors or delays during full-chip assembly.
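Part of what makes one sign-off engine practical at the cell/block phase is the hierarchical checking mentioned earlier: each unique cell is verified once, and the result is reused across all of its placements, rather than re-checking every polygon of every instance. The Python sketch below illustrates the arithmetic with hypothetical cell names, instance counts, and polygon counts; real engines must also verify regions where instances abut or interact, so actual savings are smaller than this idealized ratio.

    # Why hierarchical checking shortens run times (idealized model).
    # Cell names, instance counts, and polygon counts are hypothetical.
    design = {  # cell -> (instances placed in the chip, polygons per cell)
        "sram_bitcell": (2_000_000, 40),
        "stdcell_nand": (500_000, 25),
        "top_routing": (1, 3_000_000),
    }

    def flat_work(design):
        """Polygons processed if the layout is flattened before checking."""
        return sum(count * polys for count, polys in design.values())

    def hierarchical_work(design):
        """Polygons processed if each unique cell is checked only once."""
        return sum(polys for _, polys in design.values())

    print(f"flat:         {flat_work(design):>12,} polygons")          # 95,500,000
    print(f"hierarchical: {hierarchical_work(design):>12,} polygons")  # 3,000,065

The same engine, run hierarchically and across multiple processors, can therefore turn a cell-level job around in interactive time while still scaling to full-chip sign-off.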
Engineers performing batch verification retain the speed, capacity, and scalability they need to verify large blocks and full chips with the batch/sign-off tool, while gaining flexibility and insight on large SoCs that contain a wide variety of analog/mixed-signal components.

A faster design cycle that helps meet or beat today's aggressive time-to-market goals is also a major benefit. This enhanced design cycle arises from three sources. First, a single tool flow that is completely compatible and integrated allows designers to make use of their already extensive inventories of full-chip rule files for cell/block verification; in addition, interactive designers can employ the rule files provided by the world's leading foundries. Second, because discrepancies between out-of-sync rule files and tool limitations are eliminated, designers save valuable time during chip assembly. This is important in meeting tape-out deadlines: physical verification errors can be found and corrected early in the design process rather than later, when iteration bottlenecks can delay design schedules. Third, a single tool model saves designers time by allowing them to gain familiarity and a deeper level of expertise and proficiency in one tool. With these experts on hand, new team members can be easily trained, questions resolved more quickly, and design decisions made more swiftly. At the same time, a single tool model reduces the number of support issues that result from multiple verification tools.

Joseph Sawicki is the general manager of the Physical Verification and Analysis division of Mentor Graphics.