Electronic Design

Smart Grid Conference should be on your calendar


The Smart Power Grid Technology Conference (May 12 at the Santa Clara Biltmore) should be on a lot of calendars this coming May. Smart grid tech is a major new industry that is receiving a lot of bad press born of a lack of knowledge, and the only way to overcome that is by educating ourselves.

In particular, the session on "The Transformation of Ratepayers into Customers" promises to be significant. Tom Tamarkin, president of the Utility Services Customer Link, will set out a definition for the term "smart meters" that takes the public's concerns into account.

Additional topics include "Enabling the Smart Connected Home" and "The Role of Smart Lighting in the Smart Grid."

Accellera announces new standards submissions

At the Design Automation Conference last month, Dennis Brophy, vice chairman of the 10-year-old standards organization, let slip that several standards resulting from last year's merger with SPIRIT would be submitted to the IEEE. (Yes, we know this is a month old, but it hasn't been formally announced yet, and we just fixed a major tech glitch in this site's database following a major upgrade.) Follow this link for the interview, or watch the video live on Vpype Live Broadcaster.

Nabto offers small-footprint IP for portable webserver

On my annual visit to the Embedded Systems Conference searching for interesting companies, I found these guys: Nabto.  This Danish company was, in essence, the only interesting company on the exhibit floor, and not just for their technology but for their lack of hyperbole about who they were and what they did.  Bravo.

This is an unsponsored report courtesy of New Tech Press

IMG giving fits to Nvidia

A few weeks ago I went to ARM TechCon3, formerly the ARM Developer Conference (I still can’t figure out why they changed the name; now it makes no sense). I went because I was getting a flood of requests for interviews. As I plowed through the information, none of it really caught my attention, but the PR guy for Imagination Technologies was particularly persistent. When I asked for some details about what we might talk about, he directed me to a couple of web pages with typical marketing blather: lots of unsubstantiated hyperbole, or as the Bard put it, “sound and fury, signifying nothing.”

I decided to go to the conference armed with the marketing crap and start asking questions based on the material to see what would come out of it. I was pleasantly surprised that Peter McGuinness of Imagination Technologies was actually able to answer my questions intelligently and convincingly. He was the only one I met who could. Well done, Peter. I actually learned something. Imagination Technologies supplies IP to large processor companies like TI and Freescale to help them eat away at Nvidia's market share in the handheld market, and they seem to be doing well at it. Here's the interview.

New CEO at Tanner EDA Brings SAAS perspective

Greg Lebsack took the reins as CEO of Tanner EDA less than four months ago, and still had time to talk to us at New Tech Press. He came over from the same position at ASP Global Services (SaaS supply chain software) and says EDA is an industry with “potential for growth,” which either means he’s smoking something really interesting or he sees something a lot of people are missing. At least it’s good to hear from someone with a fresh perspective.

THIS IS AN UNSPONSORED PODCAST COURTESY OF NEW TECH PRESS

Verigy takes test a step forward

Finding something new that might actually help the semiconductor industry become profitable is like looking for three wise men and a virgin in Las Vegas, especially when you are going through Semicon. But last week, on the second day of the conference, I had three people tell me I should go look at what Verigy was showing. I’ve always been used to seeing testers that took up entire rooms and were hot enough to cook soup in (which I have done, but that’s another story). What I found was fascinating and yet left me wanting more. That’s not a bad thing. It was a step in the right direction, that’s for sure. Here's the link. THIS IS AN UNSPONSORED PODCAST from New Tech Press

Verification IP: Solace for the Common Integration Nightmare?

Language barriers have been problematic since the dawn of civilization. Entire countries have split along spoken language lines, and wars have been fought largely based upon different cultures that have built up around various languages with entirely different concepts.

The culture within semiconductor design and development is no different, except the battles are being fought in the market rather than with physical weapons. Just as in political wars, certain languages will dominate and efficiencies will be achieved through standards, whether created or de facto.

This is particularly true in the verification world. With verification taking roughly 70 percent of chip development time, chip designers and developers must use every tool available to cut costs, reduce complexity, and deliver chips to market faster. Making sure all of those tools can communicate is critical.

Much has been written about compatibility of intellectual property (IP) blocks, and the occasional nightmares of getting them to work in a system on chip or embedded design. Far less is known about the interoperability of verification IP (VIP), which is used to verify everything from specific IP blocks to entire systems. If VIP doesn't work according to plan, it is often because of language incompatibilities. Here, "language" means not just the verification language in which the IP was written; it often includes an understanding of the chip developer's methodology, which together create a language environment.

Done right, VIP delivers tangible results. It can help verify a piece of IP or a portion of a design, and it can help facilitate system-level verification, which is becoming increasingly important as complexity increases in systems on chip. Many developers buy VIP to verify a portion of the design, then continue to use it for system-level verification. In those cases, it is critical for the VIP to be flexible enough for reuse in multiple instances, especially at the bleeding edge of chip development, where creators can't anticipate all of the permutations or uses of their IP.

Problems are compounded when VIP comes from multiple sources, just as with IP from multiple sources. If the different sources use incompatible methodologies or languages, chaos erupts, further slowing the chip verification process and adding big cost overruns into the equation. Incompatibilities may stem not only from language differences, but also from how the language is applied and from the overall architecture or approach.

Verification Goals

There are two basic challenges in verification. The first is checking what the chip designers created against what the architects intended. The second is making sure that the overall design actually works. Both challenges have become much more difficult as the complexity of chips has gone up. At advanced process nodes, power and timing are intertwined like a Gordian knot, where multiple power domains intersect with multiple cores being powered on and off at rapid and sometimes unpredictable intervals.
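A common way to attack the first challenge is a scoreboard: drive the same stimulus into a reference model that captures the architects' intent and into the implemented design, then compare the results. Here is a minimal Python sketch of the idea; every name, and the deliberately planted bug, is invented for illustration.

```python
def reference_alu(op, a, b):
    """Golden model: the behavior the architects intended (8-bit ALU)."""
    return (a + b) & 0xFF if op == "add" else (a - b) & 0xFF

def dut_alu(op, a, b):
    """Stand-in for the implemented design; 'sub' has a planted off-by-one bug."""
    return (a + b) & 0xFF if op == "add" else (a - b + 1) & 0xFF

def run_scoreboard(stimulus):
    """Drive the same stimulus into both models and record any mismatches."""
    mismatches = []
    for op, a, b in stimulus:
        expected = reference_alu(op, a, b)
        actual = dut_alu(op, a, b)
        if expected != actual:
            mismatches.append((op, a, b, expected, actual))
    return mismatches

# The 'sub' transaction exposes the planted bug.
print(run_scoreboard([("add", 3, 4), ("sub", 10, 2), ("add", 250, 10)]))
```

The second challenge, system-level correctness, is what makes real scoreboards hard: there the "reference model" spans many interacting blocks rather than one small function.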

Many of the languages created to solve these problems are far from complete, which drives the mixing of tools, IP and VIP based on multiple languages and methodologies, as well as the creation of new languages. For example, real-world progress of the SystemVerilog standard continues to lag behind the marketing.

"SystemVerilog is on a strong path to acceptance, even as further extensions and enhancements are being developed, specifically in the methodology and VIP interoperability areas," says Mark Gogolewski, chief technology officer and chief financial officer at Denali Software, which makes memory and protocol VIP. That is particularly true in the United States, but he says the trend is less clear-cut in other parts of the world, where usage is split across SystemVerilog, e, SystemC, Perl, Verilog and VHDL. SystemVerilog was developed under Accellera as an extension to Verilog with donations from member companies and user-driven enhancements. It was then transferred to the IEEE and quickly ratified under its Corporate Standards Program.

The Big Picture

The real challenge is getting VIP blocks to interoperate at the system level, which requires a significant amount of integration. Testing the IP in systems to ensure it performs as intended, as well as working with other components, can be extremely difficult because of the huge amount of data involved in creating chips, and vague parameters provided by many IP vendors. VIP can be provided with IP blocks or developed by independent vendors or the chip developer. In all cases, however, the biggest challenges involve integration at the system level.

Verification languages are used to create the test benches for these chips and set up the test cases so that a chip’s functionality is predictable. That job has become so complicated, however, that shortcuts are necessary.  This is where VIP comes into play. VIP can be used for everything from qualifying IP for standard protocols such as PCI Express or USB to creating an entire test environment for much larger portions of a chip.
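The protocol-qualification role can be pictured with a toy monitor. The following Python sketch is in no way a real PCI Express or USB checker; it enforces a single invented handshake rule, namely that every 'req' must be acknowledged before the next 'req' is issued.

```python
def check_handshake(trace):
    """Walk a recorded event stream and flag violations of one toy rule:
    a 'req' must receive an 'ack' before another 'req' may be issued."""
    violations = []
    outstanding = False  # is a req currently awaiting its ack?
    for cycle, event in enumerate(trace):
        if event == "req":
            if outstanding:
                violations.append((cycle, "req issued while previous req unacknowledged"))
            outstanding = True
        elif event == "ack":
            if not outstanding:
                violations.append((cycle, "ack with no pending req"))
            outstanding = False
    return violations

# Cycles 3 and 5 break the rule in this recorded trace.
trace = ["req", "ack", "req", "req", "ack", "ack"]
for cycle, msg in check_handshake(trace):
    print(f"cycle {cycle}: {msg}")
```

A real piece of protocol VIP does this across hundreds of interlocking rules, which is exactly why buying it is usually cheaper than writing it.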

But VIP brings its own headaches. Unexpected problems can arise if those VIP blocks do not fit into the same methodology, making it more difficult to determine whether the IP they test actually behaves as intended when it is put into a complex system on chip.

Pinpointing Problems

“The main problem with verification IP is not only to make sure it’s compliant with a protocol, but also that the functionality of the block is what you expected,” says Cyril Spasevski, chief technology officer at Magillem Design Services, based in Paris, France.

“The rules from the IP provider are often vague,” Spasevski says. “That means you have to build complex test benches.”

That also means the IP and the VIP not only have to be flexible enough in design to work together, but they must work as planned across a whole system. The more functions that are added into a system on chip, the harder this becomes. And when IP is mixed from multiple companies, levels of compliance with standards or protocols are reduced to relative terms.

“If you have a subsystem that’s ARM-based, don’t even try to mix it with something else,” says Spasevski. “It takes too much time to validate.”

"I’m not sure that it’s really a problem with VIP as a category, but it’s a problem that the right VIP can help solve," said Gogolewski.   "The idea here is that if an engineer did not design a block to be 100% compliant to a protocol, the VIP needs to be flexible enough to accommodate those situations, otherwise the engineer would have to write a VIP from scratch to deal with his deliberately non-compliant block.  So, in effect, the VIP should attempt to enforce the protocol strictly, but allow end-user to dictate exceptions.  Of course, this makes the creation of such VIP much more complicated."

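The enforce-strictly-but-allow-exceptions behavior Gogolewski describes can be sketched in a few lines of Python. The rules and rule names below are hypothetical, not drawn from any real protocol.

```python
# Invented rules standing in for a protocol rulebook: each maps a name
# to a predicate that a packet must satisfy.
RULES = {
    "max_payload": lambda pkt: pkt["payload_len"] <= 256,
    "even_length": lambda pkt: pkt["payload_len"] % 2 == 0,
}

def check_packet(pkt, waivers=frozenset()):
    """Enforce every rule strictly, except those the user has waived
    for a deliberately non-compliant block."""
    return [name for name, rule in RULES.items()
            if name not in waivers and not rule(pkt)]

pkt = {"payload_len": 300}
print(check_packet(pkt))                            # strict mode flags the oversize payload
print(check_packet(pkt, waivers={"max_payload"}))   # user-dictated exception silences it
```

The waiver mechanism is what keeps the engineer from having to write new VIP from scratch just because one block intentionally bends the protocol.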
Spasevski contends it is easier to work with IP and VIP from a single vendor, but there are mixed opinions on that front. At the very least, VIP must be built on a platform that is flexible enough to be mapped onto different system methodologies. That sounds simple enough on paper. But many times the creators of IP blocks do not give much detail about the way those blocks can be used, and the more pieces of IP being purchased, the less well understood the interactions become. The real value of VIP is its adaptability in these situations.

VIP can be flawed, however. JL Gray, a verification consultant based in Texas, said if an engineer has to verify the VIP, he probably should have written it from the outset because he’s going to have to debug it, anyway.  But he said the reality is that it’s still simpler to do that than have to create everything from scratch.

“A lot more people need to buy VIP than realize it,” Gray said, noting that’s what starts all the integration and protocol issues. “The question everyone has to ask is, ‘Which environment is the base environment?’”

New Definitions for Pain

While tools and standards are being developed to reduce the pain of doing verification — everything from functional verification platforms to verification IP — there will always be a high pain threshold. Functionality is added to chips at each new process node, because there is a lot of space to do that and because the development cost of new chips is so high that it’s more economical to put everything on one chip, even if it isn’t all enabled for every device. Adding new functionality increases chip complexity and greatly increases the volume of data that has to be verified.

The best that can be done is to try to add some order into this gargantuan task, and that’s exactly what standards groups such as Accellera and OCP-IP are trying to do. In particular, they are trying to slash the amount of work that needs to be done by allowing previously verified blocks to be re-used in new chips—both IP and verification IP (VIP).

Accellera, which recently completed a standard for a property specification language for the formal specification of hardware, is currently working on a standard to make VIP interoperable, a project started in May with more than 80 companies participating.

“What we’re seeing is that designs are collaborative projects,” said Shrenik Mehta, chairman of Accellera. “Sometimes you want to mix and match components.”

The first challenge in that arena is defining the problem and then figuring out a solution that is flexible enough to work across many different environments and methodologies. “What we’re trying to determine is how to take a test bench and make it interoperable,” Mehta said. “But if you give a choice to every engineer you’re going to get different answers from each of them. It’s like one story being written by ten different reporters. None of them is exactly the same.”

But at least part of the difference in the verification world is rethinking of the entire verification process. “This isn’t about the process node,” he said. “It has to do with building a design differently. You need a different discipline.”

OCP-IP Chairman Ian Mackintosh believes that the discipline should focus on interfaces that can ensure re-usability across a wider range of architectures and methodologies.

“The number of things that can be standardized is huge,” said Mackintosh. “But the number of things that are standardized is tiny. Getting broad-based collaboration in this space is not possible at this time, and I don’t see that changing anytime in the near future.”

As a result, OCP-IP is focused on the interfaces rather than what’s behind them. Mackintosh said that if there is fundamental compatibility on language, content and functionality, then what is really needed from standards groups is broad-based verification technology that will support any language or interface they define.

“Essentially, everything you’re doing is verification, whether it’s hardware or software,” Mackintosh said. “If the ideal in verification is reducing the time, then maybe we’re thinking about the problem the wrong way. Verification should always be a large portion of the design cycle. It’s the tools to implement it that have failed. All you’re doing with verification is verifying that what has been done is correct.”

Verification engineers take a different slant on that idea. They contend that the real problem isn’t in the verification itself. The problem arises because they have to wade through masses of data to find what needs to be debugged. The debugging is well understood once the problem is located. That’s one of the reasons the large EDA vendors are busy creating functional verification platforms, which raise the level of abstraction up a notch, and it’s why verification IP is so attractive for debugging specific IP blocks.

Both Accellera and OCP-IP agree that a different approach is needed for building chips in the first place. OCP-IP believes the future work is in software, and that’s why interfaces are so important. “Hardware development has to be more sophisticated, so you need more to re-use,” Mackintosh said. “The bulk of the work will be in software. If you’re product driven, that’s going to be a hard problem to solve. If you’re market driven, it’s easier because you can figure out three years from now that you’re going to need this function at this cost.”  Accellera, meanwhile, is focused on creating the foundation for IP and VIP.

“Today, a lot of IP is standards-based,” said Mehta. “That’s true for the processor, I/O and controllers. A lot of IP is based on industry standards, and you embed that with proprietary IP in a language like C or SystemVerilog. We see the emergence of both of those, which should provide enough interoperability between different VIP blocks. But if what you want to do is add in IP and VIP, you should know whether it is fully verified and tested. That has nothing to do with standards.”

He said the standards will simply make it easier to connect standardized blocks of IP and VIP together. “Time will tell if this is successful,” he said. “ But at least it should ease the pain.”

Getting Over the Verification Hurdle

Verification is likely to remain the single most time-consuming part of developing chips for the foreseeable future, for three reasons.  First, it is getting more expensive to create complicated chips, because more functionality is being added.

For example, the immensely popular Apple iPhone™ utilizes this kind of built-in future programmability. Inside Apple retail stores are signs warning customers that iPhones may be permanently damaged by software upgrades if they are unlocked using non-Apple software downloaded from the Internet. Hackers often ignore that warning, but doing so means they miss the "gee-whiz" functions Apple will introduce later. Apple and other vendors build in features ahead of time because it’s cheaper to do it once, verify the functionality and debug it than to build and verify successive iterations of chips with new functions added in each release of the product. Under the right circumstances, adding those dormant functions is very feasible from a real-estate perspective at advanced process nodes.

But verifying more functionality can also greatly increase the time it takes to develop a new chip. Coverage models have to be developed to make sure the chip works and that the intent of the design is carried through the development cycle.

That leads to the second reason why it takes so long to verify a chip. More capabilities mean a more complex design. Prioritization of buses must be set up, for example, to ensure that the phone function on a multifunction cell phone takes precedence over an MP3 player. In addition, when the phone is not in use, it needs to power down to conserve battery power.

Getting these seemingly simple tasks to work in sync, and still have enough battery power left over at the end of the day, is no simple task. Add to that such concerns as signal integrity and maintaining signal strength and it gets even more complicated. Verifying each one of these functions has to be done independently, and as part of a system-wide verification process.
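The bus-prioritization requirement described above can be reduced to a toy fixed-priority arbiter. The client names and priority order here are purely illustrative.

```python
# Highest-priority client first: the phone must always beat the MP3 player.
PRIORITY = ["phone", "mp3", "camera"]

def arbitrate(requests):
    """Grant the bus to the highest-priority active requester, or None
    if nobody is requesting (letting the bus power down)."""
    for client in PRIORITY:
        if client in requests:
            return client
    return None

print(arbitrate({"mp3", "phone"}))  # 'phone' preempts 'mp3'
print(arbitrate({"camera"}))        # 'camera' wins by default
print(arbitrate(set()))             # None: idle, eligible to power down
```

Verifying even this trivial policy in silicon means checking every combination of simultaneous requests, which hints at how quickly the state space grows for real arbiters.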

All of this creates data that has to be sifted through to find problems, which leads to reason number three. There simply is too much data to easily identify bugs and debug only that portion of the data that is causing problems. While most of the big EDA companies are working on higher-level languages for functional verification, the tools are still in their infancy.

What's in the Toolbox?

Verification engineers use everything at their disposal to speed up the process: simulation of hardware and software, formal verification, and VIP. VIP is a relative newcomer to the process, and its tools are outgrowths of technology developed over the past 10 to 15 years.

What’s unusual about VIP is that it can be used to check a specific piece of IP, or to help with overall system-level verification. It’s also incredibly hard to create, because writing VIP requires the VIP creator to understand the IP, and the overall process, at least as well as, if not better than, the people creating the IP in the first place. It’s almost like adding a complementary version of the IP in which every parameter that can be imagined is configurable and can be tested.

“For the PCI Express specification, there are 1,000 pages of documentation and it’s all nastily complex,” said Gogolewski. “We have a defined space that includes 500 rules to make sure it’s a valid configuration, then thousands of configurable assertions. For our customer base, building your own VIP is only viable if they’re the very first to market. And, you can’t just be a good programmer or a verification engineer anymore. Now you have to be an expert in both protocols and verification. The number of interfaces is increasing and the complexity of the interfaces is increasing.”

Setting Coverage Models

One of the biggest concerns among verification engineers is the need to establish complete coverage metrics so they know everything that needs to be tested actually gets tested. Shankar Hemmady, a verification engineer at Synopsys, wrote in his recent book on verification that metrics must address code, functional and assertion coverage.  For most designs, this has the complexity and variability of a matrix.
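Of those three metrics, functional coverage is the easiest to picture: stimuli are sorted into bins, and empty bins show what has never been exercised. The bins and samples in this Python sketch are invented for illustration.

```python
# Invented coverage bins: each maps a bin name to a membership predicate.
BINS = {
    "zero":  lambda v: v == 0,
    "small": lambda v: 1 <= v <= 15,
    "large": lambda v: v > 15,
}

def collect_coverage(samples):
    """Count how many samples landed in each bin."""
    hits = {name: 0 for name in BINS}
    for v in samples:
        for name, bin_fn in BINS.items():
            if bin_fn(v):
                hits[name] += 1
    return hits

hits = collect_coverage([0, 3, 7, 12])
uncovered = [name for name, count in hits.items() if count == 0]
print(hits, "still uncovered:", uncovered)  # 'large' was never exercised
```

Real coverage models cross hundreds of such bins against each other, which is where the "matrix" complexity mentioned above comes from.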

Joe Sawicki, vice president and general manager of Mentor Graphics’ design-to-silicon division, said there are three business contexts for chip designers: low power, a shorter opportunity window and a shrinking average selling price. At the same time, designs are increasing in complexity, manufacturing variability must be dealt with throughout the design cycle, and the cost of testing a chip is increasing.

In this context, condensing the time it takes to verify a design, the chip, the software and everything associated with the chip is no longer optional. It is a requirement, and an extremely difficult one. That's probably why, even though there is little insight into the actual market size of VIP, a few companies are making money in the business.

KC Rajkumar, EDA and IP analyst for Royal Bank of Canada Capital Markets, said the market for verification IP is one of the few niches in electronics that has reached maturity and consolidated to a point of equilibrium. But while there are only a few players (Denali and Synopsys are all that really remain), it is an important technology because as designs become more complex, they become increasingly hard to visualize.

VIP, done right, can help verify a piece of IP or a portion of a design, and it can help facilitate system-level verification. So whether you buy it or grow it in-house, it is critical for the VIP to be flexible and robust.