Heavy rules hold back 90-nm yield
Ron Wilson, David Lammers (03/28/2005 9:00 AM EST) URL: http://www.eetimes.com/showArticle.jhtml?articleID=159906187
San Jose, Calif. Some silicon engineers want to chuck the rule book. Design rules, the conventional way of communicating yield information to design teams as a list of geometry do's and don'ts, have proved incapable of ensuring a chip that will yield well at the 90-nanometer node, they say, and it's time for a model-based approach.
As design rules multiply like Tribbles, designers are lobbying for a move from rule-based inspection of completed design files to design-for-yield, anchored in process models, throughout the flow. The calls have publicly focused on needs for the 65- or 45-nm nodes. But the absence of enabling tools threatens to stall widespread adoption of the 90-nm node for years, experts say.
Industry insiders are reluctant to discuss yields, especially in quantitative terms. But from a range of conversations and presentations, including a plenary session here at last week's International Symposium on Quality Electronic Design, a discouraging picture of the 90-nm node emerges.
"In IBM's 90-nm process, there are 180 rules just for the Metal-2 layer," Janusz Rajski, chief scientist in the Design Verification and Test Division of Mentor Graphics Corp., said in an ISQED talk. Rajski's data listed a total of more than 650 rules for the process and projected exponential increases as designers move toward 65 nm.
"Eventually, increasing design complexity will no longer be supported by traditional geometric rules," said Mark Mason, reticle enhancement technology (RET) manager at Texas Instruments Inc. "A model-based, design rule checks solution will be required."
The underlying issue is not that the new processes are unstable. In most cases the causes of yield loss are understood at a physical level, and many are modeled accurately. Rather, the problems arise because individual process steps that produce the desired result on one pattern may produce something useless on a similar but subtly different pattern.
The most widely discussed examples are in lithography. Mask makers and lithographers have gone to great lengths to transfer patterns whose sizes are a fraction of the 193-nm wavelength of the light used in the steppers. RET adds features to the masks that trick the light into producing the desired patterns. The process is repeatable and well-modeled. But the formation of a feature on the die is not determined simply by the corresponding features and RET elements on the mask; it may be influenced by everything on the mask within a few wavelengths of the feature in question.
This manifests itself in pattern sensitivities. For instance, the formation of a line segment may depend on whether there are right-angle corners of other objects in the area around the segment. It may also depend on how many other objects surround the segment or, more accurately, on the spatial frequency in the area. More perniciously, it may depend on whether or not there are similar, parallel segments in the neighborhood, regularly spaced at a particular pitch: the problem of forbidden pitches.
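To make the forbidden-pitch idea concrete, the toy sketch below flags pairs of parallel wires whose center-to-center pitch falls inside a hypothetical forbidden band; the band values are invented for illustration, not any foundry's actual numbers.

```python
# Toy forbidden-pitch screen: flag pairs of parallel horizontal wires whose
# center-to-center pitch falls inside a (hypothetical) forbidden band.
# All dimensions in nanometers; the band values are illustrative only.

FORBIDDEN_PITCH_BANDS_NM = [(260, 300), (430, 470)]  # assumed, not real foundry data

def wire_pitch(y1, y2):
    """Center-to-center pitch of two parallel horizontal wires on tracks y1, y2."""
    return abs(y2 - y1)

def find_forbidden_pitch_pairs(wire_tracks_nm):
    """Return (i, j, pitch) for every wire pair whose pitch lands in a forbidden band."""
    hits = []
    for i in range(len(wire_tracks_nm)):
        for j in range(i + 1, len(wire_tracks_nm)):
            p = wire_pitch(wire_tracks_nm[i], wire_tracks_nm[j])
            if any(lo <= p <= hi for lo, hi in FORBIDDEN_PITCH_BANDS_NM):
                hits.append((i, j, p))
    return hits

if __name__ == "__main__":
    tracks = [0, 280, 560, 1000]  # y-coordinates of parallel wire centers
    for i, j, p in find_forbidden_pitch_pairs(tracks):
        print(f"wires {i} and {j}: pitch {p} nm falls in a forbidden band")
```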
To ensure that patterns will be successfully transferred to the wafer, in addition to using rules to insert RET artifacts, the pattern would have to be run through a lithography simulation. That result would have to be compared against the original design, either through polygon comparison or through extraction and comparison of electrical characteristics.
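A drastically simplified version of that check, purely for illustration and nothing like production OPC verification, might rasterize the drawn shapes, blur them with a Gaussian stand-in for the optical point-spread function, threshold the result like a resist and compare the printed contour against the target. The blur radius and threshold below are assumed, uncalibrated values.

```python
# Minimal litho-check sketch: rasterized mask -> Gaussian blur (stand-in for the
# optical point-spread function) -> resist threshold -> compare to drawn target.
# Grid is 1 nm/pixel; blur sigma and threshold are assumed, not calibrated.
import numpy as np
from scipy.ndimage import gaussian_filter

def rasterize(rects, shape):
    """Fill axis-aligned rectangles (x0, y0, x1, y1) into a binary mask array."""
    img = np.zeros(shape, dtype=float)
    for x0, y0, x1, y1 in rects:
        img[y0:y1, x0:x1] = 1.0
    return img

def simulate_print(mask, blur_sigma_px=40.0, resist_threshold=0.35):
    """Very rough aerial-image proxy: blur the mask, then threshold like a resist."""
    aerial = gaussian_filter(mask, blur_sigma_px)
    return aerial > resist_threshold

def print_error_area(target, printed):
    """Number of pixels where the printed contour disagrees with the drawn target."""
    return int(np.count_nonzero(printed != (target > 0.5)))

if __name__ == "__main__":
    shape = (600, 600)  # 600 nm x 600 nm window at 1 nm/pixel
    rects = [(100, 100, 200, 500), (280, 100, 380, 500)]  # two vertical lines
    target = rasterize(rects, shape)
    printed = simulate_print(target)
    print("mismatched area (nm^2):", print_error_area(target, printed))
```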
But those computations are beyond the expertise of most design teams, and in any case the sheer time and data-set sizes involved would be daunting. So teams elect to follow the design rules, more or less rigorously, and see what happens.
What happens is that some patterns fail, get flagged by failure analysis and wind up enshrined in new rules (see story, page 62).

Rules are not enough
"In the past," said TI's Mason, "all that existing knowledge has been encoded in geometric design rules," called DRC decks. "However, the complexity of design-for-manufacturability analysis is increasing, and traditional design rules are not able to detect all problems."
Lithography is the classic case. Litho problems are "frequency space" problems, but design rules work in "real space." At the 90- and 65-nm nodes, particularly, TI has seen "the increasing complexity of lithography translate directly to increased design rule complexity," Mason said.
And lithography is not the only factor. The etch processes used at coarser geometries have proved problematic at 90 nm, in part because they impart their own distortions in transferring the pattern from the photoresist to the underlying wafer. But there is another pressure on the etch step, said Ashok Sinha, senior vice president and general manager of Applied Materials Inc.'s Etch Products Business.
"As the cost of fixing problems in lithography goes through the roof, etch has been asked to take up some of the slack," Sinha said. "This has led to new techniques and essentially all-new materials for the photoresist, anti-reflection layer and hard mask and to a proliferation in process architectures for etch." For designers, that means etch must be included in the model right along with lithography, if RET is to accomplish its purpose.
The story is similar for chemical-mechanical polishing. CMP has become so pattern-dependent that at least one equipment maker has completely revised the design of its CMP stations to minimize such systematic variations.
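The pattern dependence shows up largely as local metal density: dishing and erosion track the fill fraction in each window, which is why density rules and dummy fill exist. The sketch below computes such a per-tile density map; the tile size and allowed density window are assumed values, not any fab's specification.

```python
# Sketch of the tile-density map that CMP models (and metal-density/dummy-fill
# rules) typically start from: local metal density per window, not a single
# chip-wide number. Tile size and density limits are assumed values.
import numpy as np

def tile_density(metal, tile_px=200):
    """metal: 2-D 0/1 array of rasterized metal. Returns per-tile fill fraction."""
    h, w = metal.shape
    ny, nx = h // tile_px, w // tile_px
    trimmed = metal[:ny * tile_px, :nx * tile_px]
    tiles = trimmed.reshape(ny, tile_px, nx, tile_px)
    return tiles.mean(axis=(1, 3))

def density_violations(density, lo=0.20, hi=0.80):
    """Tiles outside an assumed allowed density window (candidates for dummy fill)."""
    return [(int(r), int(c), float(density[r, c]))
            for r, c in zip(*np.where((density < lo) | (density > hi)))]

if __name__ == "__main__":
    layout = np.zeros((1000, 1000), dtype=np.uint8)
    layout[:, 100:180] = 1  # a dense bus on the left, empty area elsewhere
    d = tile_density(layout)
    print(d)
    print(density_violations(d))
```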
Others say CMP is pretty much through and are trying to find a way back to plasma-etch-based planarization. In that case as well, the physics, though more difficult, are understood, and modeling of the CMP step is possible. But it's excruciatingly difficult to capture in design rules.

From rules to models
"Manufacturing capability will need to be encoded into models that can be used to evaluate the manufacturability of the design from lithography to CMP, corrosion, timing analysis, etc.," said TI's Mason. "For these models to be applied at 45 nm, the DFM prototype will need to be deployed to designers soon, and those tools must integrate with the existing design flow and the existing RET tools."
Some believe the solution is to extend technologies already developed by process integration engineers, so-called technology CAD (TCAD), to create process compact models (PCMs) that encapsulate the most important parts of the TCAD model into a rapidly computable form. Physical synthesis and place-and-route tools can then use the models to see how a design will do on yield.
"The solution will be to use TCAD across the design and manufacturing flows," said Srinivas Raghvendra, senior director for DFM solutions at Synopsys Inc. "TCAD provides insight into many of the physical effects that impact manufacturability and yield. Through TCAD simulations, PCMs that capture the relationships between key device parameters and individual process parameters can be generated and used for achieving robust processes and designs."
The models "can also help identify problems that need to be addressed by design technology before a process goes into volume production," Raghvendra said. PCMs can provide the accurate and stable quantitative models needed to create yield-aware structures, allowing accurate yield prediction early in the design process."
Other experts are moving in this direction as well. For example, KLA-Tencor Corp. has a program with Toshiba Corp. and Integrated Systems Engineering to use PCMs to link process variations to electrical variations, so that design teams might estimate the variations in electrical parameters they would see on their design.
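One plausible, purely illustrative way to make that link is to push an assumed process-parameter distribution through a fast compact model by Monte Carlo and read off the resulting electrical spread, as in this sketch; the distribution and the stand-in model are invented.

```python
# Sketch of linking process variation to electrical variation: Monte Carlo
# sampling of an assumed gate-CD distribution through a fast compact model
# (any callable, e.g. a fitted response surface like the one sketched above).
import numpy as np

def electrical_spread(pcm, cd_mean_nm=90.0, cd_sigma_nm=3.0, n=100_000, seed=0):
    """Return (mean, sigma) of the modeled electrical parameter under CD variation."""
    rng = np.random.default_rng(seed)
    cd_samples = rng.normal(cd_mean_nm, cd_sigma_nm, size=n)
    values = pcm(cd_samples)
    return float(values.mean()), float(values.std())

if __name__ == "__main__":
    # Stand-in model so the sketch runs on its own: linear Idsat-vs-CD slope,
    # with invented numbers.
    toy_pcm = lambda cd: 555.0 - 5.6 * (cd - 90.0)
    mu, sigma = electrical_spread(toy_pcm)
    print(f"Idsat ~ {mu:.1f} uA/um, sigma {sigma:.1f} uA/um")
```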
But PDF Solutions Inc. is skeptical that PCMs alone can convey all the necessary information. "Early on, we looked into using TCAD to convey yield information to engineers," said John Kibarian, president and CEO of PDF. "But we found that TCAD by itself didn't represent some things correctly unless it was augmented with lots of measured data. So we shifted our emphasis from making TCAD designer-friendly to creating test patterns and making exhaustive measurements to characterize the parts of the process that engineers absolutely must have quantified."

Where to use it
"We need to anticipate poor interaction between process and design, and make our layouts correct by construction," said TI's Mason. "Often, we find that the designer could have easily made a better choice for manufacturability with no performance penalty, [but] the designer was unaware of some intricate process nuance."
PDF's Kibarian advocates that approach. "You need to have accurate, quantitative comparisons of yield impact between two alternative layouts as early as floor planning, synthesis, placement and routing," he said. "If you really want quality in your design, the place to build it in is during the design, optimizing yield right along with timing and signal integrity."
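A minimal version of such a comparison, using the textbook Poisson limited-yield model with invented critical areas and defect densities, might look like the following sketch.

```python
# Sketch of a quantitative yield comparison between two layout alternatives,
# using the textbook Poisson limited-yield model Y = exp(-A_crit * D0) per
# failure mechanism. Critical areas and defect densities are invented.
import math

def poisson_yield(critical_area_cm2, defect_density_per_cm2):
    """Limited yield for one failure mechanism."""
    return math.exp(-critical_area_cm2 * defect_density_per_cm2)

def layout_yield(mechanisms):
    """mechanisms: list of (critical_area_cm2, D0_per_cm2) pairs; yields multiply."""
    y = 1.0
    for a, d0 in mechanisms:
        y *= poisson_yield(a, d0)
    return y

if __name__ == "__main__":
    # Alternative A: tighter routing -> larger short-circuit critical area.
    layout_a = [(0.030, 0.4), (0.012, 0.25)]   # (shorts, opens), assumed numbers
    # Alternative B: wires spread where timing allows -> smaller shorts area.
    layout_b = [(0.018, 0.4), (0.014, 0.25)]
    print(f"layout A predicted yield: {layout_yield(layout_a):.3f}")
    print(f"layout B predicted yield: {layout_yield(layout_b):.3f}")
```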
But that means the models must be very fast and computable. "We can model all the pertinent aspects of the process using TCAD," said Mark Pinto, chief technology officer and senior vice president at Applied Materials' New Business and New Products Group. "But the question is whether they are tractable. We need to embed yield awareness in the cell designs, but we also need simple models that are usable by synthesis and routing tools."
Getting them, he said, "will require simplifications, in the way that BSIM simplifies the results of Spice. I don't see anyone really being there yet."
Currently there are three approaches to estimating yield for a specific pattern during the design flow. One is to perform pattern matching, usually using two-dimensional Fourier transforms. The tool finds a best match between a local area in the design and a pattern with known yield.
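A bare-bones illustration of that first approach, with invented patterns and yield figures, cross-correlates a rasterized layout window against each library pattern in the frequency domain and picks the best match.

```python
# Sketch of FFT-based pattern matching: cross-correlate a rasterized layout
# window against each library pattern (with known yield) via 2-D FFTs and pick
# the best match. Patterns, scores and yields here are invented.
import numpy as np

def xcorr_peak(window, pattern):
    """Peak of the circular cross-correlation, computed in the frequency domain."""
    w = window - window.mean()
    p = np.zeros_like(w)
    p[:pattern.shape[0], :pattern.shape[1]] = pattern - pattern.mean()
    corr = np.fft.ifft2(np.fft.fft2(w) * np.conj(np.fft.fft2(p))).real
    return float(corr.max())

def best_match(window, library):
    """library: {name: (pattern_array, known_yield)}. Returns (name, score, yield)."""
    scored = [(name, xcorr_peak(window, pat), y) for name, (pat, y) in library.items()]
    return max(scored, key=lambda t: t[1])

if __name__ == "__main__":
    window = np.zeros((64, 64)); window[:, 20:28] = 1.0; window[:, 36:44] = 1.0
    dense_pair = np.zeros((64, 64)); dense_pair[:, 20:28] = 1.0; dense_pair[:, 36:44] = 1.0
    isolated = np.zeros((64, 64)); isolated[:, 30:38] = 1.0
    lib = {"dense_pair": (dense_pair, 0.92), "isolated_line": (isolated, 0.99)}
    print(best_match(window, lib))
```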
The second strategy is more preventive: to look for specific known-bad constructs within the design and quantify the benefit of removing them. The third is to scan the design for problem areas using some fast-executing heuristic and then run a compact process simulation on the suspect areas.
But another alternative, unwelcome to many engineers, may loom. As pattern sensitivities increase in the march toward smaller geometries, achieving a compact, computable model may prove so difficult as to become untenable.
"We may eventually be driven to simply selecting from a library of known-good physical patterns that have been approved by the engineers," warned PDF's Kibarian. "I can see this happening if we end up doing the 45-nm node with 193-nm lithography."
TI is looking at a combination of PCM-based simulation to estimate yield and restrictions on allowable patterns to limit the input range and make the simulation problem computable. "There is a creative genius in the design community when it comes to drawing things we didn't expect," Mason said. "This makes me skeptical that pattern matching could ever cover the range of inputs. Even for simulation, it may be necessary to limit designers' choices somewhat."
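The restricted-pattern idea can be caricatured as an exact-match lookup: tile the layout and reject any tile absent from an approved set, as in this sketch with an assumed tile size and an invented two-entry library.

```python
# Sketch of the "library of known-good patterns" idea: tile a rasterized layout
# window and reject any tile whose bit pattern is not in an approved set.
# The approved tiles and tile size are assumed for illustration.
import numpy as np

TILE = 8  # tile edge in pixels; a real restricted-pattern scheme would be larger

def tile_key(tile):
    """Hashable key for an exact-match lookup of a binary tile."""
    return tile.astype(np.uint8).tobytes()

def unapproved_tiles(layout, approved_keys):
    """Return (row, col) of every TILE x TILE block not in the approved set."""
    bad = []
    for r in range(0, layout.shape[0] - TILE + 1, TILE):
        for c in range(0, layout.shape[1] - TILE + 1, TILE):
            if tile_key(layout[r:r + TILE, c:c + TILE]) not in approved_keys:
                bad.append((r, c))
    return bad

if __name__ == "__main__":
    empty = np.zeros((TILE, TILE), dtype=np.uint8)
    vertical_line = np.zeros((TILE, TILE), dtype=np.uint8); vertical_line[:, 3:5] = 1
    approved = {tile_key(empty), tile_key(vertical_line)}
    layout = np.zeros((16, 16), dtype=np.uint8)
    layout[:, 3:5] = 1          # approved vertical line in the left tiles
    layout[4:6, 10:14] = 1      # a jog the library does not contain
    print(unapproved_tiles(layout, approved))
```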
The battle to predict yield based on design choices is just beginning, and its outcome may be foreordained. Yet it's a battle that must be fought.
"I don't think we can expect to get better yield on this generation than on the previous," HPL's Yeric said. "But we have to solve the systematic yield problems to get even close."
All material on this site Copyright © 2005 CMP Media LLC. All rights reserved.