Epilogue: The 2012 Wilson Research Group Functional Verification Study

Wow! I’ve been on the road since August, and I finally found a spare moment to get back to this blog. I started this blog series with a prologue that gave a little background on the 2012 Wilson Research Group study, and with this epilogue I will draw the series to a conclusion with some insight into the study process. Many people are cynical about industry studies in general (and particularly about ones based on surveys) and believe that they are inaccurate, unreliable, and biased. However, I believe that the benefit of an industry study is not necessarily the quantitative values that the answers reveal, but the new questions they raise.

With that said, it is important to understand that the 2007 FarWest Research study and the 2010 and 2012 Wilson Research Group studies followed the format of the original 2002 and 2004 Ron Collett International studies, which have certain limitations. For example, the Collett studies were very block-, IP-, and RTL-focused. The data these studies revealed is certainly valuable and important, but it doesn’t represent some of the challenges in SoC integration verification and system validation that have recently emerged. In fact, many of the techniques used for block and subsystem verification that these surveys studied (such as constrained-random stimulus, functional coverage, and general formal property checking) do not scale well to the SoC integration and system-level validation space. I believe that future studies should be expanded to include these emerging challenges.

Another criticism of all the previous studies is that the presented data is aggregated across all market segments—mil/aero, mobile, consumer, networking, computers, and so forth. Hence, it can be difficult for those in a particular market segment to benchmark themselves against, or relate to, the study data. This is a valid argument. With our two recent Wilson Research Group studies, it is possible to filter the data down for presentation to a specific market segment; this is not possible with the previous studies.

Nonetheless, there is still value in observing general industry trends across the multiple studies as long as the format of the studies is consistent. Consistency is something we strove for across the studies. For example, whenever possible, we tried to maintain the exact wording of the questions originally used in the Collett studies.

Minimizing Study Biases

When architecting a study, there are three main concerns that must be addressed to ensure valid results: (1) sample validity, (2) non-response bias, and (3) stakeholder bias. I’ll briefly review each of these concerns in the following paragraphs and discuss the steps we took to try to minimize them.

(1)    Sample validity or undercoverage bias: To ensure that a study is unbiased, it’s critical that every member of the studied population have an equal chance of participating. An example of a biased study would be a technical conference surveying its own attendees. The data might raise some interesting questions, but unfortunately it doesn’t represent the members of the population who were unable to participate in the conference. The same bias can occur if a journal or online publication simply surveys its subscribers.

A classic example of this problem is the famous Literary Digest poll in the 1936 presidential election, in which the magazine surveyed over two million people, a huge study for that period. The pool (or make-up) of the study was drawn from the magazine’s subscriber list, phone books, and car registrations. The problem with this approach was that the sample did not represent the actual voter population, since a magazine subscription, a phone, or a car was a luxury during the Great Depression. As a result of this biased sample, the poll inaccurately predicted that the Republican Alf Landon would defeat the Democrat Franklin Roosevelt in the 1936 presidential election.

For the 2012 Wilson Research Group study, we carefully chose a broad set of lists that, when combined, represented all regions of the world and all electronic design market segments. We reviewed the participant results by market segment to ensure that no segment or region was inadvertently excluded or under-represented.

(2)    Non-response bias: Non-response bias occurs when a randomly sampled individual cannot be contacted or refuses to participate in a survey. For example, spam and unsolicited mail filters can prevent an individual from ever receiving an invitation to participate, which can bias results. It is important to validate that sufficient responses occurred across all lists that make up the study pool. Hence, we reviewed the final results to ensure that no single list in the participant pool dominated them.
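As a simple illustration of the kind of balance check described above, here is a minimal sketch in Python. It is not the actual analysis used in the study; the list names, data, and the 40% threshold are hypothetical choices made for the example. It flags any single source list that contributes more than the chosen share of completed responses:

    from collections import Counter

    def flag_dominant_lists(responses, max_share=0.40):
        """Return any source list whose share of completed responses
        exceeds max_share (the threshold here is a hypothetical choice)."""
        counts = Counter(r["source_list"] for r in responses)
        total = sum(counts.values())
        return {name: count / total
                for name, count in counts.items()
                if count / total > max_share}

    # Hypothetical usage: each completed response records which
    # invitation list it came from.
    responses = [
        {"source_list": "list_a"}, {"source_list": "list_a"},
        {"source_list": "list_a"}, {"source_list": "list_b"},
        {"source_list": "list_c"}, {"source_list": "list_d"},
    ]
    print(flag_dominant_lists(responses))   # {'list_a': 0.5}

A result like the one above would prompt exactly the kind of review described in the text: either rebalancing the invitation lists or weighting the analysis so one list does not dominate.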

Another potential non-response bias is due to a lack of language translation. In fact, we learned this during the 2010 Wilson Research Group study. The study generally had good representation from all regions of the world, with the exception of an initially very poor level of participation from Japan. To solve this problem, we took two actions: (1) we translated both the invitation and the survey into Japanese, and (2) we acquired additional engineering lists directly from Japan to augment our existing survey invitation list. This resulted in a balanced representation from Japan. Based on that experience, we took the same approach to solve the language problem for the 2012 study.

(3)    Stakeholder bias: Stakeholder bias occurs when someone who has a vested interest in the survey results completes an online survey multiple times, or urges others to complete it, in order to influence the results. To address this problem, a special code was generated for each study participation invitation that was sent out. The code could only be used once to fill out the survey, preventing someone from taking the survey multiple times or sharing the invitation with someone else.
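To illustrate the single-use invitation code mechanism described above, here is a minimal sketch, assuming a simple in-memory store. The study does not describe the survey platform's actual implementation, and the class and method names here are hypothetical:

    import secrets

    class InvitationCodes:
        """Issue single-use survey codes; each code admits exactly one response."""

        def __init__(self):
            self._unused = set()

        def issue(self):
            # Generate an unguessable code for one invitation.
            code = secrets.token_urlsafe(8)
            self._unused.add(code)
            return code

        def redeem(self, code):
            # Accept the code only the first time it is presented.
            if code in self._unused:
                self._unused.remove(code)
                return True
            return False

    codes = InvitationCodes()
    c = codes.issue()
    print(codes.redeem(c))   # True  -- first submission accepted
    print(codes.redeem(c))   # False -- repeated or shared use rejected

The key design point is that the code is tied to the invitation rather than the respondent, so both repeat submissions and forwarded invitations are rejected after the first use.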

2010 Study Bias

After analyzing the results from the 2012 study, we are confident that the study was balanced across market segments and regions of the world, which was our goal. However, while architecting the 2012 study, we did discover a non-response bias associated with the 2010 study. Although multiple lists across multiple market segments and multiple regions of the world were used during the 2010 study, we discovered that a single list dominated the responses, and it consisted of participants who worked on more advanced projects and whose functional verification processes tended to be more mature. For example, as a result of this bias, if you look at Figure 3 concerning languages used to create testbenches, the actual industry-wide adoption of SystemVerilog is likely lower than what was shown for 2010. This means that industry adoption between 2010 and 2012 would have increased slightly more than indicated, and the growth between 2007 and 2010 would have been slightly less.

The 2007 study, like the 2012 study, was well balanced and did not exhibit the non-response bias previously described for the 2010 data. Hence, we have confidence in talking about general industry trends between 2007 and 2012. The 2010 data can still be useful as a reference point during discussion if you keep in mind that it represents a more process-mature segment of the population.

Our plan is to commission a new study in 2014. At that point, we should have sufficient data to start showing trends in the FPGA space (something we have not been able to do yet) and a clearer picture of emerging trends in the non-FPGA space. However, as previously stated, the emerging challenges today are occurring in the SoC integration verification and system-level validation space. We will either reduce the scope of our existing IP/RTL-focused study to make room for new questions in these spaces, or conduct a separate, rigorous study focused on these emerging challenges.

Final word—I hope the data presented in this set of blogs has provided some insight on general trends. But more importantly, I hope it has inspired you with questions about your own processes that you might want to investigate.

