Proactive approach needed to overcome 90-nm quality challenges, experts say
Ron Wilson (03/22/2005 8:25 PM EST) URL: http://www.eetimes.com/showArticle.jhtml?articleID=159904460
SAN JOSE, Calif. - Speakers at the International Symposium on Quality Electronic Design here Tuesday (March 22) spotlighted challenges faced by design teams and process engineers as they move to 90 nm and beyond.

John Kibarian, president and CEO of PDF Solutions, called for a shift from reactive measures that shore up chip manufacturability after the design process is completed to proactive steps that are integrated into the design flow. Kibarian warned that just following design rules isn't enough. "Process engineers have added more recommended rules for 90 nm," he said. "In addition to the mandatory rules, which you must follow, there are all these others. They tell you that if you don't follow them, you risk poorer yields."

Kibarian said the first step in making the transition to proactive design for manufacturing is accurate characterization of the interactions between design and yield elements. Characterization must be done with measurements of actual silicon, he argued, and it must be very precise, taken over a large number of wafers. "We need to be characterizing the yield impact of individual features to within a fraction of a failure-per-billion," he said. "That quality of data will allow us to predict yield to within a few percentage points." The second step is to use the characterization data to model the process yield across the process window. The third step is to use the models to quantify the yield-management decisions that must be taken during the design process.

Ashok Sinha, senior vice president and general manager of Applied Materials' Etch Products Business Group, offered an equipment maker's perspective on the question of fine-geometry yields. While process and EDA tool designers are trying to find ways to characterize the variations in advanced processes, equipment makers are trying to find ways of reducing them. This struggle has perhaps been most obvious in lithography, where increasingly stern measures have produced 45-nm images on the wafer using 193-nm light. But Sinha said there is more to pattern-transfer technology than just printing: what happens after the photoresist has been exposed can be equally important.

Janusz Rajski, chief scientist in the Design Verification and Test Division of Mentor Graphics, returned to the tool front. He reiterated Kibarian's emphasis on characterization based on actual production wafers. "Failing chips in particular are a gold mine of information. We must feed back data from them into the design process," he said.

Rajski described a chain of circumstances beginning with the increasing use of reticle enhancement technology (RET) at 90 nm. "At 90, without RET there would be no yield," he said. "The 90-nm processes today use some form of RET on about 13 layers. The 65-nm processes that are being developed now will use it on 28 layers."

But along with the benefits of RET come side effects, Rajski said, particularly in the form of pattern-dependent results. What the enhanced masks actually print on the wafer depends not just on the particular feature, but on the features, and their enhancements, that surround it, sometimes for a considerable distance. Process engineers have tried to alert design teams to the issue with added design rules, often recommended or suggested rules that are not mandatory but are invaluable in increasing yield or narrowing variations. "IBM has 180 rules just for metal-2 now," he said. "We need to move from choosing and following rules to doing yield optimization."
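Both Kibarian's prediction step and Rajski's call for yield optimization rest on the same arithmetic: roll up measured per-feature failure rates into a die-level yield estimate. A minimal sketch in Python of the classic Poisson yield model, using hypothetical feature names and failure rates quoted in Kibarian's failures-per-billion terms (the data is illustrative, not from either speaker):

import math

# Hypothetical characterization data: per-feature failure rates in
# failures per billion instances, the precision Kibarian called for.
FAILURES_PER_BILLION = {
    "single_via": 2.4,
    "min_spacing_via": 0.8,
    "narrow_metal2": 0.3,
}

def predicted_yield(feature_counts):
    """Poisson yield model: Y = exp(-sum(n_i * lambda_i)), where n_i is
    the count of feature i on the die and lambda_i its failure rate."""
    total = sum(count * FAILURES_PER_BILLION[f] * 1e-9
                for f, count in feature_counts.items())
    return math.exp(-total)

# Illustrative die with tens of millions of instances of each feature.
die = {"single_via": 15_000_000,
       "min_spacing_via": 40_000_000,
       "narrow_metal2": 90_000_000}
print(f"Predicted die yield: {predicted_yield(die):.1%}")

Under this model, failure rates measured to within a fraction of a failure-per-billion translate directly into yield predictions good to a few percentage points, which is the claim Kibarian made for high-quality characterization data.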
The stakes are high, Rajski said. Without new methodologies, he suggested on the basis of trend data, mature-process yields could level off at 50 percent or worse for 90- and 65-nm processes.

Rajski proposed a methodology that starts with an identified design-rule violation. First, the new approach would use yield sensitivity functions based on measured historical data to predict the yield impact of the particular feature in violation. Then the estimate would be combined with data on timing slack and used to guide development of design-for-test structures and software. In this way, the riskiest areas of the wafer would receive the most thorough testing. The technique would be used in combination with what Rajski called defect-alert test, which employs more robust tests, especially for detecting subtle timing faults. "The point is not just to improve yields," Rajski argued. "It is to reduce escapes from the test process." The idea is to employ massive post-processing of actual test data to assign defect rates to specific rule violations, he said, then use that data to steer both design decisions about correcting or accepting the violation and the subsequent test strategy.
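As a concrete illustration of that flow, here is a minimal sketch that ranks rule violations by combining a historically measured defect rate with timing slack, so the riskiest sites get the extra test patterns. The rule names, weighting function, and numbers are hypothetical assumptions for illustration, not Mentor's actual tooling:

from dataclasses import dataclass

@dataclass
class Violation:
    rule: str               # recommended-rule identifier, e.g. "M2.R.17"
    fail_prob: float        # historically measured defect rate for this site
    timing_slack_ps: float  # slack on the affected path, in picoseconds

def risk_score(v: Violation) -> float:
    """Hypothetical weighting: likely-to-fail features on tight-slack
    paths are the most dangerous test escapes, so they rank highest."""
    slack_penalty = 1.0 / (1.0 + max(v.timing_slack_ps, 0.0) / 100.0)
    return v.fail_prob * slack_penalty

violations = [
    Violation("M2.R.17", fail_prob=3e-6, timing_slack_ps=20.0),
    Violation("VIA1.R.4", fail_prob=8e-6, timing_slack_ps=500.0),
    Violation("M3.R.2", fail_prob=1e-6, timing_slack_ps=10.0),
]

# Riskiest sites first: these would receive the extra defect-alert patterns.
for v in sorted(violations, key=risk_score, reverse=True):
    print(f"{v.rule}: risk score {risk_score(v):.2e}")

The design choice the sketch captures is Rajski's point that the goal is reducing test escapes, not just raising yield: a moderately likely defect on a tight-timing path outranks a more likely defect that ordinary tests would catch anyway.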