In my most recent blog post, I summarized some of the key points from an October presentation at DVClub Silicon Valley by Dave Brownell from Analog Devices. As part of his introductory section on the motivation for the industry to move to portable stimulus, Dave summarized the history of design verification into six phases:

1. Verification? – little or no verification
2. Directed Testing – minimal hand-written tests by the designers
3. Those Who Can, Design; Those Who Can't, Verify – the rise of verification as a distinct discipline, albeit one sometimes disparaged by designers
4. SystemVerilog/Coverage-Driven Verification (CDV)/Metrics-Driven Verification (MDV) – recognition of the importance of verification and investment in automation
5. Standard Methodology/UVM/VMM/OVM – wide adoption of verification automation and improved verification reuse
6. The Future – with portable stimulus as the "next big thing"

I thought that this summary was both accurate and clever, so today I'd like to expand on the theme a bit with some of my own experiences and observations over the years.

When I started my career as a hardware designer, there was indeed not much thought given to verification. We designed gate arrays, minuscule by today's standards, but still fabricated silicon that could not be changed without a chip turn. You might think that this demanded some level of simulation, but the focus was on team design reviews to check correctness. The architecture team did system-level modeling and testing, but I don't recall any link from the gate-level schematics to the models they used. Despite this, most of our gate arrays went into production on first silicon. The chip turns that did occur were usually due to higher-level problems found by the architecture team rather than elementary errors in the designs themselves. Having eight or ten of one's peers study every gate turned out to be a very effective method of verification.

My next job moved into the next phase, directed testing: we designers simulated our gate-array schematics using mostly manual test vectors with some very limited automation. This was at a startup with aggressive schedules where peer design reviews were rare, so it was mostly up to us to get our own designs right. Again, there was a higher-level model of the system (RTL written in C), but no link from our schematics or chip-level simulations. We had a good track record on this project as well, with almost no chip turns.

My third major project took a huge leap forward, using Verilog RTL both for simulation and for design via early commercial logic synthesis. Expressing the design in RTL rather than in gate-level schematic form not only increased efficiency but also reduced the chances for simple errors. The system-level simulation included the RTL of two large (for their time) ASICs, so finally we were verifying the design itself rather than a separate model. Both chips went into production on first silicon, so the value of the RTL-based flow was clear.

This era also saw the rise of the verification engineer as a distinct role. The system-level simulations, with significant randomization and even some coverage metrics, were not performed by the designers. I saw this trend continue at my customers as I moved first to a silicon IP company and then to my first EDA role at 0-In Design Automation.
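For readers who did not live through this shift, the sketch below shows a minimal, hypothetical flavor of what that randomization-and-coverage style eventually became in SystemVerilog, the era discussed next. Every class, field, and bin name here is invented purely for illustration and is not taken from any real testbench:

```systemverilog
// Hypothetical example: a constrained-random transaction with
// embedded functional coverage. All names are invented.
class bus_txn;
  rand bit [15:0] addr;
  rand bit [7:0]  data;
  rand bit        is_write;

  // Keep addresses in a legal window and bias toward writes.
  constraint c_addr  { addr inside {[16'h0000 : 16'h7FFF]}; }
  constraint c_write { is_write dist { 1 := 3, 0 := 1 }; }

  covergroup cg;
    cp_kind : coverpoint is_write;
    cp_addr : coverpoint addr {
      bins lo = {[16'h0000 : 16'h3FFF]};
      bins hi = {[16'h4000 : 16'h7FFF]};
    }
    x_kind_addr : cross cp_kind, cp_addr;
  endgroup

  function new();
    cg = new();  // an embedded covergroup is constructed in the class constructor
  endfunction
endclass

module tb;
  initial begin
    bus_txn t = new();
    repeat (100) begin
      if (!t.randomize()) $fatal(1, "randomization failed");
      t.cg.sample();  // record functional coverage for this transaction
      // ...drive t onto the design under test here...
    end
    $display("functional coverage: %0.2f%%", t.cg.get_coverage());
  end
endmodule
```

The key shift is visible even in this toy example: the stimulus is a class object shaped by declarative constraints rather than a list of hand-written vectors, and progress is measured by coverage instead of test count.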
I observed that many leading chip and system companies had dedicated formal experts as well as simulation teams adopting automated testbench techniques from Verisity and other pioneers. Constrained-random stimulus generation and coverage-driven verification became more widely adopted with the introduction of SystemVerilog, which included standardized assertions for formal analysis as well as simulation. This was not a trivial transition; verification engineers had to learn object-oriented programming and move to a significantly higher level of abstraction. Most of those designers who used to regard verification as a less lofty profession developed a newfound respect for their colleagues and their specialized knowledge.

I was at the forefront of this revolution in roles at both Synopsys and Cadence. I believe that I am the only person to have contributed to the Verification Methodology Manual (VMM) plus the first books on both the Open Verification Methodology (OVM) and the Universal Verification Methodology (UVM). The UVM represented the convergence of several approaches as well as a true standard, and it drove advanced testbench techniques fully into the mainstream of functional verification.

So that's where we are today. The UVM has done what it was intended to do: standardize constrained-random and coverage-driven simulation while improving verification reuse. As I explained in a previous post, we are hitting the wall with UVM in several respects, and this is driving the industry's momentum toward portable stimulus. I agree with Dave Brownell that this is the next big thing, and I look forward to helping make it happen in my roles at Cadence and within the Accellera Portable Stimulus Working Group (PSWG). I welcome your support.

Tom A.

The truth is out there...sometimes it's in a blog.
