Evolution is a tinkerer

I was recently quoted in an EDA DesignLine blog as saying that “it is a myth that ABV is a mainstream technology.” Actually, the original quote comes from an extended abstract I wrote for an invited tutorial at Computer Aided Verification (CAV) in 2008 titled Assertion-Based Verification: Industry Myths to Realities. My claim is based on the 2007 Far West Research study (commissioned and sponsored by Mentor Graphics), which found that approximately 37 percent of the industry had adopted simulation-based ABV techniques and 19 percent had adopted formal ABV techniques. Now, those of you who know me know that I am an optimist—and therefore the statistics from these industry studies reveal a wonderful opportunity for design projects to improve themselves. ;-) However, the problem of adopting advanced functional verification techniques is not limited to ABV. For example, the study revealed that only 48 percent of the industry performs code coverage. Let me repeat that: code coverage, not even something as exotic as functional coverage (which came in at about 40 percent of the industry)! Furthermore, only about 41 percent of the industry has adopted constrained random verification.

[Figure: Advanced Functional Verification Adoption]

Now, we could argue about which is the best approach to measuring coverage or achieving functional closure, but that is not the point. The question is, how do we as an industry evolve our verification capabilities beyond 1990s best practices?

In my mind, one of the first steps in the evolutionary process is to define a model for assessing an organization’s existing verification capabilities. For a number of years I’ve studied variants of the Capability Maturity Model (that is, CMM and its successor, CMMI) as possible tools for assessment. After numerous discussions with many industry thought leaders and experts, I’ve concluded that the CMM is really not an ideal model for assessing hardware organizations. Nonetheless, there is certainly a lot we can learn from observing the CMM applied to actual software projects.

For those of you unfamiliar with the CMM, its origins date back to the early 1980s. During this period, the United States Department of Defense established the Software Engineering Institute at Carnegie Mellon University in response to a perceived software development crisis related to escalating development costs and quality problems. One of the key contributions resulting from this effort was the published work titled The Capability Maturity Model: Guidelines for Improving the Software Process. The CMM is a framework for assessing the effectiveness of an organization’s software process, and it provides an evolutionary path for improving that process from ad hoc and immature to disciplined and mature.

Fundamental to maturing an organization’s process capabilities is an investment in developing skills within the organization. To assist in this effort, we have launched an ambitious project to evolve an organization’s advanced functional verification skills through the Verification Academy. In fact, we have an introductory module titled Evolving Capabilities that provides my first attempt at a simple assessment model. I anticipate that this model will itself evolve over time as I receive valuable feedback on refining and improving it. Nonetheless, the simple Evolving Capabilities model as it exists today provides a wonderful framework for organizing multiple modules focused on evolving an organization’s advanced functional verification capabilities.

I realize that evolving technical skills is obviously only part of the solution to successfully advancing the industry’s functional verification capabilities. Yet, education is an important step. I’d be interested in hearing your thoughts on the subject. Why do you think the industry as a whole has been slow in its adoption of advanced functional verification techniques? What can be done to improve this situation?

Posted December 14th, 2009

Comments

5 comments on this post

Commented on December 22, 2009 at 4:46 am
By Mohamed A. Salem

Thanks, Harry, for the excellent post. I see one of the main obstacles to adopting advanced verification techniques as the HW/SW boundary, with designers and verification engineers looking at each other through a barrier. Designers consider verification a SW job, and verification engineers consider design a HW job. Practically speaking, to get the best out of the process there should be no barriers, and there should be a seamless, smooth blend of design and verification, HW and SW. If both camps believed that the wall between HW and SW has vanished, we would see smooth and fast-paced adoption. It is amazing to see that the winners are the ones working on breaking down that wall!

Commented on January 6, 2010 at 5:20 pm
By Thomas Bollaert

This is not only an EDA issue! In 1968, Dick Fosbury masterfully demonstrated his new high-jump technique by winning the Olympic gold. 12 years later, in 1980, nearly 1 out of 5 Olympic finalists still used the old straddle technique – and of course lost… Quite an inspiring story: http://bit.ly/4ZWl5E

Commented on January 7, 2010 at 2:35 pm
By Suresh Rajgopal

Adding to Mohamed’s comments – educating and changing the mindset of design and verification engineers is crucial. The complexity of the verification task in today’s designs requires that it be taken seriously, which means verification engineers with the right skill sets, not designers writing tests after they’re done with RTL. But this does not absolve the verification engineer from understanding the hardware and the design. Likewise, the design engineer now needs to respect the constraints imposed by the (more complex) verification methodology – a minor design change on an interface protocol may cause a lot more headache for the verification engineer.

Finally, one aspect that constrained random verification methodologies need to work on is the ability to quickly bring up a verification infrastructure for early debug that can later be extended for regressions. It just takes way too much time today to be of any use to the designer at the beginning in helping to flush out bugs.

Commented on January 11, 2010 at 2:57 am
By Donald ‘Paddy’ McCarthy

Remembering back to the old Daisy days, one of their simulator iterations was dynamic in nature, allowing quick edits to a design before continuing the simulation. I don’t think we should have a dynamic DUT in a simulation, but I do think we might gain by having a dynamically interpreted verification language and a REPL (http://en.wikipedia.org/wiki/Read-eval-print_loop) or interactive shell, together with something like doctest (http://en.wikipedia.org/wiki/Doctest), to lower the barrier to people doing any systematic verification at all.
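To make the doctest idea concrete, here is a minimal sketch in Python; the parity() helper and its expected values are purely hypothetical, but they show how expected behaviour written in a docstring doubles as a self-checking regression when the file is run:

    # Minimal doctest sketch: the docstring holds the expected behaviour,
    # and running this file re-checks it automatically.

    def parity(bits):
        """Return the even-parity bit for a list of 0/1 values.

        >>> parity([1, 0, 1])
        0
        >>> parity([1, 0, 0])
        1
        """
        return sum(bits) % 2


    if __name__ == "__main__":
        import doctest
        doctest.testmod()  # reports nothing unless an example fails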

People for which programming is not their speciality seem to find it more productive to use dynamic scripting environments rather than statically compiled language environments (c.f. Matlab, SAGE, biopython, bioperl, …).

Just as Ruby-on-Rails and Django made it much easier to do most web sites; we should look to dynamic languages to provide similar gains in allowing designers an easier environment for verification.

Many verification techniques rely on regressions made up of many runs of very similar simulations or properties. Whilst it is easy to buy and configure a compute farm to run multiple jobs, vendors and customers need to work together to make licensing such compute farms realistic. If you verify, you will need a compute farm!

– Paddy.

Commented on January 11, 2010 at 7:55 am
By Geoff Barrett

What if coverage closure is an evolutionary dead end? What proportion of failed projects have carefully measured coverage (and still implemented the wrong thing at the wrong time)? It seems the majority of the survivors have made it through the last 20 years without adopting the “new” techniques.

Do you remember what we did before Specman turned up? I and many others used to spend time identifying the “resources” in the architecture and implementation of a chip, then mapping out the transactions that targeted each of those resources, and finally constructing tests that stressed each resource by generating lots of transactions targeting it individually. It was a lot less effort than constructing complicated metrics and found just as many, if not more, bugs. It wasn’t so good for the EDA companies, because they couldn’t sell tools and courses to support the methodology.

So is this evolution benefitting the design industry or the EDA industry? And what data can you bring to bear to support your answer?

Geoff
