SAN JOSE, Calif. — Mentor Graphics Chairman and CEO Walden Rhines provided the DVCon conference here Tuesday (Feb. 15) with a roadmap for achieving progress in the battle for design verification productivity.

Rhines described the need for better verification. "Two-thirds of chip designs require at least two spins," Rhines said. "Three quarters of those cases are at least in part due to logic errors or other functional bugs — 80 percent of the time involving a design error."

So what are design teams doing about it? Just working harder with the existing, increasingly incapable verification methodology, Rhines said. He noted that in another recent study only about a quarter of engineers said their job was verification; those who instead said their job was design reported spending half their time on verification tasks.

"Complexity is breaking the methodology," Rhines said. "In response, verification tools are changing quickly today. But engineers don't like to change. Most engineers — although there are a few advanced teams — will keep using the same methodology until it completely fails, no matter how slow or hard it gets."

Rhines described a two-phase shift away from the existing verification style. First, he predicted, teams would find that old techniques simply fail to cope with the complexity of their next design. "Then the tipping point," he said, "will come with the availability of new standards — SystemVerilog, VHDL, SystemC and PSL — that will reduce the risk of adoption, improve reuse and create some real market competition for a new way of conducting verification."

Alternatives are available today, he said, pointing to three major changes in verification methodology: the use of assertions, reliance on accurate coverage metrics and a higher level of abstraction.

"Experts tell me that you should have an assertion in your code just about everywhere that you would put a comment: an assertion every ten lines is not unrealistic," Rhines said.
"This represents a big increase in front-end work, but it pays off."

But it only pays off, he continued, with accurate coverage metrics. Rhines dismissed code coverage as not even measuring functional coverage in a real design, calling coverage a two-dimensional problem.

Rhines said increasing the level of abstraction in verification was just as important as it was in design. "At the algorithm level we have to move from looking at events to looking at transactions," he said. Similarly, at the code level, there needs to be a shift from examining RTL to working with synthesizable C code or with preverified modules. At the methodology level, there is a need to move from nuts-and-bolts verification tools to what Rhines called verification appliances, and to verification kits and verification IP libraries.

In the future, Rhines said, verification engines will emerge that combine simulation and formal analysis. Tools will also emerge that can examine a design and automatically sequence the verification algorithms applied to it, directed to completion by meaningful design metrics. These measures, Rhines maintained, would launch a new and productive era for design verification.