The best way to find out how real design teams are selecting and evaluating intellectual property is to ask them. That is precisely what EE Times intends to do in the IP Selection Industry Challenges program. The first step in the research process has already been taken: a series of two focus groups gathering data about how engineers view the IP selection issue and what they are doing about it. Data from these focus groups will be used to create a Web research questionnaire, from which we will gather global data on a larger number of design teams. This data will be reported in early December in another special section in EE Times and in a panel discussion at the IP/SoC Conference in Grenoble, France.

Early results from the focus groups are interesting in their own right. They confirm the intuitively obvious, underline the importance that engineers place on the selection problem and open up some surprising avenues for further exploration. We present a summary of the results here.

IP: Why bother?

A reasonable place to start a study of IP selection is with the question of why engineers seek IP in the first place. The answer is supposed to be obvious: time and labor savings. But the obvious isn't always exactly how designers see the issue.

Participants in the focus groups were, for the most part, senior designers or chip architects with significant experience in the selection and integration of IP. The majority developed system-level ICs, using either an ASIC or customer-owned tooling (COT) flow, but a significant number targeted FPGAs for their design work. Interestingly, some of these engineers were ASIC designers by trade, but deteriorating market conditions had forced their organizations to target FPGAs instead.
The amount of IP used in designs ranged widely: Some respondents reported that they used only a few peripheral interface blocks that made up about 10 percent of the design, while others said the reused content in many of their chips was upward of three-quarters of the die area. The type of IP used also varied widely, from interfaces, I/O controllers and serializer/deserializer blocks to embedded SRAM, CPUs and specialized processing elements. One respondent commented on seeing an increase in the availability of general-purpose IP that could be targeted to any of a variety of applications.

There was general agreement on why a design team would decide to search for reusable IP. The most-cited reasons were meeting schedule with the available design team, focusing internal resources on the blocks that were critical to the market success of the product, and reducing risk. This latter point caused some discussion, as one participant used the phrase "known-good IP" and others questioned whether there was such a thing. That debate would reappear throughout the focus groups.

Anticipating the problems

As a group of experienced IP users, the participants universally went into the IP selection process looking for problems: a surprisingly long list of problems. Roughly, the list could be split into business-related and technical issues.

On the business side, many engineers cited contract negotiations as a potential problem that needed to be checked out early in the selection process. Several reported having been involved in designs where negotiations over IP licenses or royalties were still going on right up to tapeout of the chip or shipment of the FPGA. "If the IP is seen as critical, or if there are royalties involved (anything that would encumber the design in the future), top management and the legal department will get involved," one engineer said. "And then things can really drag on."
Another commented, "Sometimes it's not the negotiation (getting to the point where we can sign the papers). Sometimes just getting the papers signed can take a long time."

Another significant business issue was the stability of the IP vendor. Obviously, if the IP was coming from within the organization (as often happens in large organizations, according to these engineers), stability wasn't an issue. But as one designer said, if you want leading-edge ideas in your IP, you will be talking to small IP companies. They may not be around through your whole design cycle to support you, and they may not last long enough to follow up on their road map for your next design.

If questions arose about the vendor, participants reported, management usually became involved in the decision process. And this could be an issue for engineers. One suggested that if top management got involved, they might do 30 percent of the work in investigating the decision but exercise a veto over the outcome.

The other key business issue was, predictably, price. Engineers tended to take a very nuanced approach to pricing models, trying to stay within both design budget and unit cost targets while being perfectly willing to pay for quality and support. However, at least one engineering manager among the group took a dim view of the nuance, observing that "when the engineers get involved in price discussions, bad things tend to happen."

The technical issues

Functionality was an issue. It might seem that the first thing an IP vendor would do would be to ensure that the IP behaved as advertised. But that turns out, according to the participants, to be a very elusive goal. Several engineers explained the issue. When you are initially searching for IP, one said, often all you have to go on is the functional description on the Web site. It may be nothing more than a blurb. Even if the description refers to an industry-standard interface, it may not be saying much.
There are different implementation levels for many standard interfaces, and wide ranges of parameters. And with new so-called "emerging standards" (a good candidate for contradictory phrase of the year, by the way), everything is still open to interpretation. Every IP vendor may have a different view of what the standards documents mean. So claims of compliance don't mean interoperability.

Even when you dig deeper, another designer mentioned, you don't get the full story. "Maybe you get a four- or five-page data sheet. You can't really document what a complex IP core does in that space." Another added that buying an IP core is not like buying an off-the-shelf IC. There are many more variables. "The core looked great on the Web site," one engineer commented. "But once we got into it, there were some major differences between how it worked inside and what we were expecting. We ended up paying the vendor to modify the core."

But more serious than basic functionality, according to several participants, was the problem of performance. The block may in fact function generally as advertised, but it may not come close to its rated clock frequency. Or it may turn out to be much larger than suggested when synthesized for its full speed. Or it may have an inappropriate appetite for milliamps. Power in particular was cited by one engineer as a growing issue. "In the past, I never even looked at the power dissipation of a core," he said. "But now, we are watching power very closely, and have to understand all the modes the core uses for power savings."

A particular problem in this area was mentioned several times: trouble with parameterizable or configurable cores. One engineer mentioned a parameterizable core that blew up to several times its original size when set for a particular number of I/O pins. Another mentioned that power consumption could increase, or speed decrease, just as nonlinearly as you turned the knobs.
This seemed to be of special concern to FPGA users, who are resource- and speed-constrained to begin with and seemed to observe the most radical surprises from their IP. None of these issues are obvious from the literature on the Web site, or sometimes even from the deliverables with an evaluation license. You don't find out until you are synthesizing the specific configuration you need. But designers must become skilled at spotting these issues on the front end, before the team is committed.

The verification issue

That raises the whole issue of verification. According to the participants, the level of verification support IP vendors make available with their deliverables ranges from out-of-date to lame to downright wrong. Nor were outdated deliverables the only problem on the vendor side: It appeared that almost no IP vendor uses current verification techniques. No participant had ever seen a core that made correct use of embedded assertions, for instance, or came supported by a formal verification tool. Even relatively standard modern techniques like input generators and output checkers were far from universal.

Coverage metrics were another area in which vendors disappointed. Few provided figures for verification coverage, or even tools for measuring it. And needless to say, the coverage often left a great deal to be explored. Worse, there were serious questions about the candor of the vendor's supplied verification tools. "You'd have to be stupid to use the vendor's data checkers," one engineer said flatly. Another observed, "You always get test vectors with the core. They represent what I'd call a sunny-day scenario. The vendor is not going to point you at the corner cases or the problems he might know about in the design."

In other cases, the models provided by the vendor turned out to be just wrong. One participant told of a design in which there was a serious error in the simulation model for a RAM compiler, causing a fault in the silicon.
Hard macros and memories came in for particular concern in this regard. "Take a multiport memory, for instance," an engineer said. "You can't determine if the model is correct until you have the silicon." This would appear not to be a problem with soft IP. After all, you can just feed the RTL into a simulator. But even here there are issues. If the RTL happens to be in VHDL (not uncommon for IP) and you are using a Verilog flow, "there are still real problems with mixed-mode simulators," as one participant put it. Another mentioned that if there are asynchronous paths within the RTL, it may simulate incorrectly even if the RTL is correct.

But if the vendors aren't bending over backward to help with the verification process, often schedule and resource constraints prevent the customer from doing much better. Several designers reported that they never verified an IP core in isolation before integrating it into the design. Their attitude was that they didn't really care if it worked in isolation; they cared only that the chip worked. So the block went untested until full-chip verification was started. This level of trust was, unsurprisingly, more common among FPGA users than ASIC or COT designers.

Another issue cited by the majority of participants was compatibility with their design flows. This, a number of engineers pointed out, was far more than a matter of convenience. If IP has been developed with one synthesis tool, several people said, it can turn into a major separate project just to get the RTL to synthesize properly with another tool. Simulation could turn out to be even more difficult, particularly if there were a language problem.

Coping strategies

Given all these potential problems, it might seem that the risks of IP selection just aren't worth the benefits of using IP. Yet experienced design teams, according to these participants, found ways of at least reducing the risks during the selection process.
The most prevalent and, some suggested, the most reliable strategy was to use a known vendor. Several participants said that a vendor they dealt with regularly was always the first place they looked for new IP. The best predictor of downstream success or trouble, many of the participants believed, was the past performance of the vendor. This led some engineers to place a high value on one-stop shopping. If they could get all their IP from one vendor, not only would they be dealing with a relatively known quantity, but they would also have leverage if anything did go wrong. An engineer described one project in which the design team relied upon an ASIC company for all its IP, and was able to negotiate a free mask spin.

But several participants said that a longstanding relationship and a great reputation were only necessary, not sufficient. "Even with a name company, you have to ask all the right questions," one summarized. In some cases, asking the right questions could become a serious ordeal for the vendor. An architect from a major SoC company described subjecting his IP vendors to formal design reviews and methodology audits. The vendor got exactly the same treatment as an internal design team would.

While some designers scrutinized the vendor, others, whether through preference or lack of political clout, scrutinized the IP itself. They described building testbenches for the RTL that came with the evaluation license, simulating the block together with the blocks that would connect to it, and exploring the RTL itself. But many were unsatisfied with simulation; they wanted proof. Several designers said that they wanted (and some insisted upon) references to other design teams that had already taken the core into production. Failing that, they wanted the vendor to deliver a test chip adequately provided with hooks for verification, or an FPGA implementing the IP.
In some cases, there is no substitute for battering the design with huge numbers of test vectors, especially using pseudorandom techniques, they said.

The relationship card

Another strategy also emerged from the focus groups. Many participants said that the last resort, should problems arise during the design, was to work directly with the engineers at the IP vendor organization. Hence evaluation of the vendor's engineering resources, and of their willingness to share them, became an important strategy for risk reduction. The phrase "we talked to them" came up in a number of instances.

This evaluation cannot be quantitative, nor can it be conclusive. You don't really know if the vendor's management will be as accommodating after the license is signed as they were during the sales cycle. But for many designers, it was an important step in the final phase of the evaluation process to get to know their counterparts in the vendor's design team. In this regard, the evaluation process took on more of the nature of a hiring interview than a component selection. The questions were about skill levels, understanding and openness. And the bottom line was, as one participant phrased it, "Are you going to be there?"

Strategies, then, ranged across a wide spectrum. Some FPGA designers admitted to a purely ad hoc selection process, lacking both the time and resources to do anything more complex. Others had elaborate procedures for IP selection and dedicated engineers to the task. One designer said that his company had an entire department in the home office responsible for IP selection and qualification.

The thing all these processes had in common was the desire to use as little engineering resource as possible to buy down the risks (very real risks) of using IP in the design. Whether this meant spending a little more money, spending some vital engineering time or dedicating a department to the task, it was an exercise in risk management.
The trade-off was based, for the most part, on intuition, but for most design teams it was a deliberate decision.
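For readers who want a concrete picture of the pseudorandom "battering" strategy the participants described, here is a minimal sketch in Python. It uses a software stand-in for the block under test and a golden reference model; the names (golden_crc8, dut_crc8, batter) and the CRC-8 example are illustrative assumptions, not anything from the focus groups, and a real flow would drive HDL simulation rather than pure software.

```python
# A sketch of pseudorandom stimulus plus an independent output checker.
# All names here are hypothetical stand-ins for illustration only.
import random

def golden_crc8(data: bytes, poly: int = 0x07) -> int:
    """Reference model: bit-serial CRC-8, the independently trusted behavior."""
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = ((crc << 1) ^ poly) & 0xFF if crc & 0x80 else (crc << 1) & 0xFF
    return crc

def dut_crc8(data: bytes) -> int:
    """Stand-in for the block under test; here it simply calls the model."""
    return golden_crc8(data)

def batter(n_vectors: int, seed: int = 1234) -> int:
    """Drive n_vectors random frames through DUT and model; count mismatches."""
    rng = random.Random(seed)  # fixed seed makes any failure reproducible
    mismatches = 0
    for _ in range(n_vectors):
        frame = bytes(rng.randrange(256) for _ in range(rng.randrange(1, 64)))
        if dut_crc8(frame) != golden_crc8(frame):
            mismatches += 1
    return mismatches

print(batter(10_000))  # prints 0 while the DUT matches the model
```

The two details that make the approach practical are the seeded generator, so a corner-case failure can be replayed exactly, and the checker's independence from the vendor's own test vectors, which, as one participant put it, cover only the sunny-day scenario.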