
Verification at DAC17: Shall We Go to the Pantry or the Cookbook?

Tuesday, June 20, 2017, Austin Convention Center. Noon. I’m wilting in the heat. The temperature outside will get up to 95°F today, with 71% humidity. I’m such a delicate California flower, durn it; never have I been so thankful for Texas air conditioning. I’m also over-caffeinated and underslept (having finally arrived at my hotel in Austin at 12:30am), but happy to be here to experience my first DAC. My first CDNLive was the subject of my first blog post, and I’ve now been here for six months. It’s been quite a trip so far.

I am sitting in the cool, dark ballroom at the convention center, eating sandwiches and chips with about 400 others at today’s Cadence-sponsored luncheon; the theme is verification[1]. (Frank Schirrmeister wanted to call the panel “Making Verification Smart Again”, but was overruled, so it was officially called “Towards Smarter Verification”.) Smoothly moderated by Ann Mutschler of SemiEngineering, the panel consisted of:

- Christopher Lawless from Intel. Throughout the panel, Chris spoke about optimization being a system-wide challenge, made more difficult by complexity on top of complexity. He said that we can validate IP, but for the system as a whole, we have to take a different approach, focusing on use cases instead of working bottom-up.
- David Lacey from Hewlett Packard Enterprise. David started off by highlighting the importance of finding the “point of optimization”: when you find the right method, trust your experience and have the data back you up. According to him, the three biggest challenges in verification are:
  - Debugging takes time.
  - Organizing all the data is complicated.
  - Analyzing (and mapping) all the data in a cost-effective way is hard.
- Jim Hogan from Vista Ventures, LLC. Jim is so excited about machine learning; I heard him talking enthusiastically about it in several venues. Here, he talked about using machine learning to find the “hidden jewels” along the verification continuum, using methods and tools (he called it a “verification bot”!) we haven’t even thought of yet.
- Mike Stellfox from Cadence. While addressing the queries of the moderator, panelists, and audience members, Mike highlighted the three keys to smart verification:
  - Methodology: you must use the right tool for the right system.
  - Data: you must leverage lots of data to inform your decisions about verification methodologies.
  - Machine learning: with great data comes great possibilities.

In a very small nutshell, each of the panelists highlighted the importance of the evolving approaches they’re taking to their verification methodologies, and all of them were excited about “machinelearningdeeplearning” as the way to achieve their long-term goals. All of the panelists came back to a couple of recurring themes throughout the discussion:

- Our business is evolving, especially in terms of complexity; the trend in this evolution is towards top-down and shift-left[2] design and processing flows.
- There is so much dang data out there that we don’t know what to do with it all; machine learning is the most compelling approach to the problem.

If you haven’t noticed, I tend to think and write in metaphor. I have talked about coding and knitting, what A2 will look like to the average person, and what AI and language translation have in common. And here at DAC17, I feel another metaphor coming on… Bear with me here.
In cooking reality shows, the challenges given to the chef contestants seem to fall into one of two basic categories: either they are given a limited number of possibly disparate ingredients and they create a new masterpiece in their wee competition kitchens, or they are given perhaps unusual circumstances with an unlimited number of ingredients, and they come up with some new pièce de résistance.

In my kitchen, I also have two different approaches to cooking: a.) Scrounging: I look in the fridge and pantry and see what I can create from what is already there; or b.) Planning: I want to make a specific dish, so I plan ahead to get everything I need to make that thing in the time frame I need it. The first approach is generally used for weeknight dinners or when a [chocolate] craving hits; the second, generally, for when we have guests. Both ways work. Either way is valid. They both require a certain amount of skill and planning. They both need fresh ingredients to create an adequate product. Each has its own positives and negatives. But if I’m making something complicated? I get much better results using the latter method.

This perfectly illustrates the “bottom-up” and “top-down” approaches to designing and testing complicated systems. The bottom-up approach considers all of the little things that must be designed and tested and debugged, rummaging around in the back of the pantry and behind the condiments. This method allows me to use yogurt if I don’t have enough milk. If I don’t have dark chocolate, cocoa powder and a little oil will do. And depending on my skill and experience (and oh, is it vast), the result can be kind of tasty, nourish my kids, and do away with the craving. Or I might have to do without, should the milk be sour or should I gamble on some taste combination that… um. Doesn’t work. [Don’t ask my kids about the time I used grape jelly instead of sugar.]

The top-down design approach starts with a holistic view of the complicated system, identifying the functionality of the complete project (say, a large family gathering in November). As the process continues, the system is broken into progressively smaller pieces (turkey, stuffing, gravy, pumpkin pie, cranberry sauce…). Keep breaking them down until you can make a shopping list: you’ve identified the smallest pieces that will make up the whole. Finally, when the pieces are completely designed, debugged, washed, mixed, tasted, sauced, roasted, whipped, and verified, they’re hooked up together again to create the complete system as it was originally scoped: Thanksgiving dinner.

Cadence offers tools for both methods. That said, considering the complexity and scope of the EDA industry, a top-down approach will make the most sense moving forward, at least in terms of smart verification. Verify that the sour cream is fresh before you mix it into the chiffon pie, not afterward. Try not to discover that you needed to have brined the turkey for 12 hours before you open the cookbook on Thanksgiving morning. Don’t call your mother because you can’t find the recipe for cranberry relish. And please help the chef if the kitchen is a mess after the apple pie.

[1] Just to orient those of us at the entry level of the EDA world: the verification stage is used to test that the RTL is correct. These days, the engineers who run the tests are separate from those who write the RTL.
In practice, verification is never finished, and so it takes place in parallel with the rest of the chip design, with any errors discovered being incorporated into the design at regular intervals.

[2] In this case, “shift left” means finding as many bugs as possible early in the design cycle (the “left” end of the timeline), since they are much cheaper to fix before a lot of downstream work has been done using the old RTL that still contains the bugs.
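For fellow EDA newcomers, here is a deliberately toy sketch of what footnote [1]’s “running the tests” looks like in spirit. It is plain Python rather than real RTL or any actual verification tool, and buggy_design_adder and reference_adder are made-up stand-ins: one plays the implementation under test, the other plays the golden model derived from the spec, and random stimulus is used to find where they disagree.

```python
import random

def buggy_design_adder(a, b):
    # Stand-in for the implementation under test: it "forgets" that an
    # 8-bit adder wraps around, so any sum above 255 comes out wrong.
    return a + b

def reference_adder(a, b):
    # Stand-in for the golden model taken from the spec: 8-bit wraparound.
    return (a + b) & 0xFF

def run_random_tests(num_tests=1000, seed=2017):
    """Throw random operands at both models and report any disagreements."""
    random.seed(seed)
    mismatches = []
    for _ in range(num_tests):
        a, b = random.randrange(256), random.randrange(256)
        if buggy_design_adder(a, b) != reference_adder(a, b):
            mismatches.append((a, b))
    print(f"{num_tests} tests run, {len(mismatches)} mismatches found")
    for a, b in mismatches[:3]:  # show a few failing cases for debugging
        print(f"  a={a}, b={b}: design={buggy_design_adder(a, b)}, "
              f"reference={reference_adder(a, b)}")

if __name__ == "__main__":
    run_random_tests()
```

The real thing involves hardware description languages, constrained-random stimulus, coverage metrics, and vastly more data than this, which is exactly why the panelists kept circling back to machine learning; but the basic idea of checking the thing being built against a model of what it should do is the same.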
