Dense wires snarl verification plans

Deep-submicron silicon technology makes it possible to implement increasingly complex system-on-chip designs, but it also introduces new design and verification challenges. Unlike in coarser process technologies, the interconnecting wires at 0.18 micron and below have significant parasitic components that affect timing, crosstalk noise, power consumption, electromigration and voltage drop throughout the chip. The interdependencies of coupling capacitance, wire delays, inductance effects, current densities, lower supply voltages and noise tolerance must all be taken into account. Unless these electrical effects are properly managed and verified, chip performance and functionality can be compromised, lengthening development cycles and causing missed market opportunities.

Conventional approaches consist of point tools that import design data, perform a certain function and then export the design data for use by the next tool in the flow. These disconnected design, implementation and verification flows cannot account for the interdependencies of timing and signal-integrity effects, nor can they identify when additional problems have been introduced. Time-consuming iterations and needless overdesign are often required in such flows. To fully leverage deep-submicron process technologies and efficiently implement system-on-chip (SoC) devices, an integrated design, implementation and verification methodology must emerge.

System-level chips are characterized by high gate counts, high performance, low power and short time-to-market requirements. The architecture may involve microprocessor cores, memory controllers, bus arbiters and other functions that are realized in hardware as well as software to meet the system specs. Such a chip may contain tens of millions of gates, with operating frequencies ranging from hundreds of megahertz to 1 GHz.
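To make the wire-parasitic point concrete, here is a minimal, back-of-the-envelope sketch using the classical Elmore-delay approximation of a distributed RC wire. All parameter values are hypothetical, chosen only to illustrate the scaling; they do not come from any specific process.

```python
# Illustrative Elmore-delay estimate for a distributed RC interconnect.
# Parameter values below are hypothetical, not real process data.

def elmore_delay(r_per_mm, c_per_mm, length_mm, segments=100):
    """Model a distributed RC wire as `segments` lumped RC sections
    and return the Elmore delay (seconds) at the far end."""
    r_seg = r_per_mm * length_mm / segments
    c_seg = c_per_mm * length_mm / segments
    delay = 0.0
    for i in range(1, segments + 1):
        # Each segment capacitance sees all the resistance upstream of it.
        delay += (r_seg * i) * c_seg
    return delay

# Hypothetical deep-submicron wire: 75 ohm/mm resistance, 0.2 pF/mm capacitance.
d1 = elmore_delay(75, 0.2e-12, 1.0)   # 1-mm wire
d2 = elmore_delay(75, 0.2e-12, 2.0)   # 2-mm wire
print(f"1 mm: {d1 * 1e12:.1f} ps, 2 mm: {d2 * 1e12:.1f} ps")
# Doubling the wire length roughly quadruples the delay, because both
# total resistance and total capacitance double.
```

The quadratic growth of delay with wire length is one reason why, at 0.18 micron and below, interconnect rather than gate delay comes to dominate timing, and why it cannot be verified as an afterthought.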
The design and implementation of an SoC involves system integration: combining partitioned hardware and software to produce the final product. The hardware itself contains analog and digital circuits, along with several intellectual-property (IP) blocks that must be integrated into the design.
In traditional methodologies, verification occurs after implementation. During this stage, designers must ensure that the design satisfies system specifications such as data rate, bus protocols, timing constraints, clock frequency and power consumption. The physical implementation must also meet the electrical and layout constraints that are specific to the design and the manufacturing process.

A possible approach to addressing these challenges is a single, unified data model containing all design data. With the verification functions embedded within the logic-synthesis and physical-implementation environment, and with access to complete and identical design data, the system could work concurrently to analyze the data and take the necessary actions. Today, designers must either overdesign, sacrificing performance, or waste time iterating back through the entire flow to solve problems. An integrated system can take these interdependencies into account and make incremental changes. With a single, unified data model, an integrated system not only can share design data between tools but also can share assertion data among the hierarchical levels. Throughout the flow, the system can analyze a unit in the context of its surroundings.
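The unified-data-model idea can be sketched in a few lines. In this hypothetical example (class and method names are illustrative, not any vendor's API), analyses register against one in-memory design database, so an incremental edit to a single net is re-analyzed in place instead of being exported through a chain of point tools.

```python
# Hypothetical sketch of a unified design data model. Timing and
# signal-integrity checks read the same in-memory netlist, so an
# incremental change triggers re-analysis of only the touched net.
# All names and threshold values are illustrative.

class UnifiedDesign:
    def __init__(self):
        self.nets = {}          # net name -> (R in ohms, C in farads)
        self.analyses = []      # checks to re-run on each change

    def register(self, analysis):
        self.analyses.append(analysis)

    def update_net(self, name, r, c):
        """Incremental edit: store the new parasitics and re-run every
        registered analysis on just this net."""
        self.nets[name] = (r, c)
        return [check(name, r, c) for check in self.analyses]

def timing_check(name, r, c, limit=50e-12):
    # Crude RC-product screen standing in for a real timing engine.
    return f"{name}: timing {'ok' if r * c < limit else 'VIOLATION'}"

design = UnifiedDesign()
design.register(timing_check)
print(design.update_net("bus_a[3]", 200.0, 0.1e-12))
```

The point of the sketch is architectural: because the edit and the analysis operate on one shared data model, there is no export/import step between "tools," and interdependent effects can be checked concurrently rather than in disconnected passes.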
All material on this site Copyright © 2017 Design And Reuse S.A. All rights reserved.