Verification = IP = Verification = IP… - Part 1: Current Industry Situation and Drivers
Recently, Aart de Geus, Chairman and CEO of Synopsys, gave the keynote at DesignCon 2004. In his keynote, he argued that verification and IP are really two sides of the same coin. Addressing both these key technologies is central to raising design productivity and verification effectiveness. In the first part of a two-part article based on that keynote, Dr. de Geus outlines the new technologies and describes the drivers behind current and future changes in the way we use verification and design IP. Part two will look specifically at the implications for verification and IP solutions.
In the past, mentioning IP and verification together in the same sentence was something of an oxymoron. This is something the industry must fix.
Design reuse by means of semiconductor intellectual property (IP) is becoming the center of gravity for design productivity and the key to being able to produce chips that really work. Clearly, methodology and tools are critical in bringing IP to play in the SoC design environment. But before addressing the technology, it is helpful to consider what has happened to the semiconductor and electronics industries during the past few years.
Design Driven by Differentiation, Cost and Time to Market
In many ways, technology and the economic environment are inseparable. The semiconductor industry is driving growth in many new markets. Consumer demand for electronics in those new markets, in turn, dictates the health of the global semiconductor ecosystem.
In order to be successful in these markets, designers must focus on three elements: differentiation, cost and time to market. The primary concern for designers has to be differentiation. Differentiation in chip design can occur in a variety of areas: functionality, performance, power consumption, form factor and even price. Without differentiation, new chips have no chance in the market.
Cost is a factor that has grown dramatically in importance over the past few years. Cost is multi-dimensional and depends on the parameters of the design, the impact of manufacturing and the support required. Cost is also affected by factors within the business environment, most recently the economic downturn and the push toward globalization. Both employment and market growth in new global regions will have an increasing impact on design cost.
Finally, we are seeing a return of time-to-market pressures. When there was no market, time to market was a moot issue. Now, it’s back with a vengeance. In fact, time to market has become a function of time to adoption, time to volume and time to yield. The renewed focus on time to market is creating tremendous pressure on design teams, which were already under pressure from the significant downsizing that has occurred throughout the electronics industry.
Recasting the Semiconductor Industry
During the past three years, there has been a major economic squeeze on the semiconductor market. Many veterans of the semiconductor industry would argue that this is nothing new; the semiconductor market has always been cyclical, and it’s entirely a function of supply and demand. However, the last few years have given us a glimpse of a somewhat different picture. Figure 1 shows the troughs, or percentage off the peak, experienced in the semiconductor industry for the downturns that occurred in 1985, 1989, 1996 and 1998. The green line represents the 2001-2003 downturn, which was twice as long and twice as deep as any other downturn in the history of high technology.
FIGURE 1: Semiconductor Downturns 1985 – 2001
In 2001 semiconductor revenues were down 46 percent; in 2002 they were down 32 percent from the peak. These magnitudes are sufficient to suggest that this was not just an ordinary cycle.
Instead, we should think of this latest downturn as triggering a recasting of the semiconductor industry. If we take a look at the fundamental productivity and efficiency of the entire ecosystem, we will find that there is tremendous pressure on the three fundamental drivers of design (differentiation, cost and time to market) and, as a consequence, on the individual designer’s day-to-day job.
The good news is that, looking at the past six to eight quarters, we are seeing a very gradual upward trend. But despite this positive direction, the semiconductor industry still emanates an air of nervousness.
Simultaneous Technology Pressure
Within the context of this economic pressure, we also have a set of dramatic changes to the technology.
The industry has progressed to finer geometry processes on a schedule that has, so far, run almost like clockwork. We saw the move from 0.65-micron to 0.5, then 0.35 and 0.25, to 0.18. Now the geometry of choice for new designs has reached 0.13-micron, where things have gotten much more difficult. Along the way, copper technology was also successfully introduced.
Today, fabrication companies face major costs in moving to 300mm fabrication plants, requiring investment to the tune of approximately $2.5 billion. These kinds of costs have inevitably brought about major shifts in the industry. While the rise in mask costs is not as extreme as some people believe, we are nevertheless rapidly approaching the $1 million mark for a mask set. In addition, manufacturing yields at 0.13-micron have a long way to go to match previous yield levels. Putting all these factors together, one can begin to understand why design is becoming so expensive.
Add to this increasing cost the fact that design is getting much more difficult. One of the key factors in this is increasing complexity, which drives a major set of challenges in the verification process. And yet, complexity is the very thing that enables us to put more and different kinds of functionality on the same chip.
Timing closure has not gone away as an issue, either, and it is still very hard to predict. Although many steps have been taken to simplify it, timing closure now has to be considered together with the physics of smaller geometries and the effects of signal integrity on the fundamental digital paradigm.
On top of these issues, an old problem that we have dealt with reasonably well in the past is boiling up again: power consumption. The pressure comes from the demand for longer battery life and for lower power dissipation. At the same time, we have reached the end of our ability to keep reducing supply voltages, and we have entered the age of leaky transistors. As a result, power consumption needs to be managed very diligently.
Another re-emerging problem is the cost of test. Test requires that we bridge all phases from design to manufacturing. The age of design as independent from manufacturing is truly over, and the two are increasingly interconnected.
All of these interrelated problems are putting a collective strain on design teams. And yet, there is still an opportunity to differentiate and shine if these problems can be managed.
Driving the State of the Art
To revitalize the semiconductor and electronics industry, it is necessary to drive forward the state of the art. While we seem to be heading in the right direction, there are still some question marks about growth.
One of the fundamental growth questions has always been determining the percentage of semiconductor content in electronic products. During the past 25 to 30 years, that percentage has significantly increased. However, it is also clear that this percentage has flattened out in the last three to four years.
If we look at the applications and products that have driven semiconductor growth over the years, it was first the PC wave, then the wireless wave, and subsequently the networking wave. Now the question is: what’s the next ‘killer’ application?
The challenge with killer applications is always the same – they are far easier to recognize once they’ve “killed” something. Unfortunately, if you wait to identify the killer application, you are too late to participate in its market window. So it’s useful to understand what’s behind all killer applications, and what drivers within the technology we can isolate and identify.
In the first 25 to 30 years of design, there was one driver – computation – the ability to execute functions as fast as possible. Speed was absolutely the key issue.
Then in the 1990s, we saw communications and connectivity become the new drivers. At the end of the ‘90s, many people were predicting that the convergence of those two would be the new driver. There is no question that some very exciting products have brought computation and connectivity together, and these products are increasingly aimed at a consumer market.
The advantage of a consumer market is that, when the wave happens, it happens very quickly and at a very high volume. The disadvantage is that it typically has to be delivered at very low cost. It is apparent that if we continue on the present path, the integration of the PDA, phone, email, Internet access, camera, MP3 player, toaster oven – you name it – is all going to be on a single chip. Consumer demand will drive all of this, and it will require further technology integration and higher performance.
Managing Complexity Growth
We know from collecting customer data on gate counts that in 2002 the highest percentage of designs targeted fewer than a million gates. The next survey in 2003 revealed a big jump in complexity to the five-million gate mark. This jump corresponds directly to the question we see coming from many of our customers today: How do we successfully design much higher complexity chips?
FIGURE 2: Exploding Average Gate Count
Moore’s Law is far from dead. In fact, it almost feels as though it is accelerating, given the increasing number of design issues faced at each successive technology node. Among the many issues, the overriding complexity-management concerns our customers express relate to verification and IP, and neither can be solved without the other.
Chairman and Chief Executive Officer Dr. Aart de Geus, co-founder of Synopsys in 1986, is considered one of the world's leading experts on logic simulation and logic synthesis, with more than 25 published papers on the subject. Dr. de Geus is a Fellow of the IEEE and the third-ever recipient of IEEE's Circuits and Systems (CAS) Society's prestigious Industrial Pioneer Award. He holds an MSEE from the Swiss Federal Institute of Technology and a Ph.D. in electrical engineering from Southern Methodist University. Dr. de Geus currently serves as the Chairman of the Board for the Silicon Valley Manufacturers Group, and he was named the 2002 CEO of the Year by Electronic Business.
©2004 Synopsys, Inc. Synopsys and the Synopsys logo are registered trademarks of Synopsys, Inc. All other company and product names mentioned herein may be trademarks or registered trademarks of their respective owners and should be treated as such.