Verification Horizons BLOG

This blog is an online forum offering weekly updates on concepts, values, standards, methodologies and examples that explain what advanced functional verification technologies can do and how to apply them most effectively. We look forward to your comments and suggestions on the posts to make this a useful tool.

10 April, 2014

It’s always fun to take the wraps off solutions we have been hard at work developing.  The global team of Mentor Graphics engineers has spent considerable time and energy to bring the next level of SoC design and verification productivity to what seems a never-ending response to Moore’s Law.  As silicon feature sizes get smaller, design sizes get larger and the verification problem mushrooms.  But you know that.  These changes are the constants that drive the need for continued innovation.  Our next level of innovation for design verification is embodied in the recently announced Mentor Enterprise Verification Platform (EVP).

Gary Smith recently published Keeping Up with the Emulation Market, which lays out the fact that verification platforms are unifying, with emulation now a pivotal element not just for microprocessor design success, but for Multi-Platform Based SoC design success as well.  Bringing software debug into the loop with early hardware concepts is a verification challenge that must be supported too.  Pradeep Chakraborty reported on a point made by Anil Gupta of Applied Micro at the UVM 1.2 Day in Bangalore, where Anil implored, “Think about the block, the subsystem and the top.”  His point was that software is often overlooked or under-tested before committing to hardware implementation: a focus on UVM lets us verify no higher than where UVM takes us, and that is not the “top” of the SoC, which demands that software be part of the verification plan.

Path to Success

With the Mentor EVP, we address these issues.  We bring simulation and emulation together in a unified platform.  Software debug on conceptual hardware is supported to address verification at the “top.”  And Gary’s report concludes by wondering how easy access to emulation will be provided for the masses.  That too is solved in the Mentor EVP: VirtuaLAB can be hosted in data centers along with the emulator, in place of complex, one-off lab setups that lock an emulator to a design and lock your global team of software developers out of collaborating.  The Mentor EVP brings emulation to the masses in a 24×7 world.

With big designs come big data and complex debug tasks.  These debug tasks are all handled by the new Mentor Visualizer Debug Environment, which offers native UVM and SystemVerilog class-based debug and low-power UPF debug support to easily pinpoint design errors. All of this works in both interactive and post-simulation modes, for simulation and emulation alike.  To keep the software team productive and reach SoC signoff sooner, the new Veloce OS3 global emulation resourcing technology moves software debug think-time offline to Mentor’s Codelink software debug tool.

And there’s more!  But I’ll leave that for you to discover.  When you have time, visit us here to learn more about the Mentor Enterprise Verification Platform.

Path to Standards

As the move to support Multi-Platform Based SoC design evolves, so do the standards that underpin it.  Given the comments of others I have reported on in this blog, and our own experience that UVM can only go so far in Multi-Platform Based SoC verification, we concluded the time is right for the industry to explore the need for new standards.

We announced at DVCon 2014 an offer to take our graph-based test specification into an Accellera committee to help move beyond the limitations of today’s standards.  As our investment in tools, technology and platforms continues, we are keenly aware that users want their design and verification data to be as portable as possible.  Accellera user community members echoed the need to discuss portable stimulus that can take you up and down the design hierarchy from block, to subsystem, to system (“top”) and support the concurrent design of hardware and software.

In support of this, Accellera approved the formation of a Portable Stimulus Specification Proposed Working Group (PWG) to study the validity of and need for a portable stimulus specification.  To that end, join me at the kickoff meeting for this activity on Wednesday, May 7, 2014 from 10:00am to 4:00pm Pacific time at the offices of Mentor Graphics in Fremont, CA USA.  If you would like to attend, or would like time on the agenda to discuss technology that would advance the development of a Portable Stimulus Specification or to discuss your objectives and requirements for this group, contact me and I will put you in touch with the meeting organizer.  Accellera PWG meetings are open to all; Accellera membership is not required to attend.


17 March, 2014

As some of you might be aware, the Verification Academy has a video course dedicated to the topic of Assertion-Based Verification (ABV). In fact, this was the first video course we released for the academy almost six years ago, and to date it has remained one of our most popular courses. Yet one of the most common requests for improving this course has been to add content that focuses more on the verification process and provides guidelines on how to effectively integrate ABV into an existing flow. At the Verification Academy, we always welcome your feedback on content improvement. But before I talk about our new, updated ABV course, let me explain why I’m such a big believer in this technology.

Ensuring functional correctness continues to pose one of the greatest challenges for today’s SoC design teams. Indeed, recent industry studies have found that more time is spent in functional verification than in any other project task. [1] But if you dig a little deeper into the trends revealed by these studies, you will find that debugging is the fastest-growing component of the overall verification process, and that on average it consumes 34 percent of the verification engineer’s total effort. To make matters worse, the industry is witnessing increasing pressure to shorten the overall development cycle. Clearly, new design and verification techniques that improve productivity, combined with a focus on maturing an organization’s functional verification process capabilities, are required.


Assertion-based verification (ABV), although certainly not a cure-all for the verification challenge, directly addresses the debugging problem. For example, teams that have effectively adopted an ABV methodology have seen a significant reduction in simulation debugging time (as much as 50 percent [2][3]) due to improved observability. In addition, organizations that have embraced ABV are able to take advantage of more advanced verification techniques, such as formal property checking, improving their overall verification quality and results.

While the process of writing assertions is fairly well understood by those skilled in the art (and the skill is easily acquired through a wealth of published papers and books on language syntax and semantics), the process of creating a repeatable ABV methodology that integrates into an existing verification flow is not. And this is where our new ABV course will help. We’ve added a new session titled “Maturing a Project’s ABV Process Capabilities” that provides a set of actionable, process-focused guidelines and recommendations. These recommendations are based on years of ABV experience across multiple projects in various market segments, and they should help you adopt ABV effectively on your project.
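To make this concrete, here is a minimal sketch of the kind of interface assertion an ABV flow is built from; the clock, reset and req/ack handshake signals are hypothetical names invented for illustration, not taken from the course itself:

    // Hypothetical req/ack handshake check: once req rises, it must
    // hold until ack arrives, and ack must arrive within 1 to 8 cycles.
    property p_req_ack;
      @(posedge clk) disable iff (!rst_n)
        $rose(req) |-> (req throughout (##[1:8] ack));
    endproperty

    a_req_ack: assert property (p_req_ack)
      else $error("req/ack handshake violated at %0t", $time);

    c_req_ack: cover property (p_req_ack);

Embedded in the design’s interfaces, a handful of checks like this turn a failure deep inside the device into an immediate, localized error message, which is exactly where the debugging savings cited above come from.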

To learn more about the new ABV course, visit the Verification Academy (www.verificationacademy.com).

References

[1]     Prologue: The 2012 Wilson Research Group Functional Verification Study (here)

[2]     Y. Abarbanel, I. Beer, L. Gluhovsky, S. Keidar, Y. Wolfsthal, “FoCs—Automatic Generation of Simulation Checkers from Formal Specifications,” Proc. 12th International Conference on Computer Aided Verification, pp. 414-427 (2000)

[3]     B. Turumella, M. Sharma, “Assertion-based verification of a 32-thread SPARC™ CMT microprocessor,” Proc. 45th Design Automation Conference (DAC 2008), pp. 256-261 (2008)

3 March, 2014

DVCon is always one of my favorite events in our industry, and I am proud to let you know that the latest issue of Verification Horizons is available “hot off the presses” at the Verification Academy to mark the occasion. For those of you attending the conference, please consider this issue as an addendum to the great technical program being offered (especially paper 8.1, “Of Camels and Committees: Standards Should Enable Innovation, Not Strangle It” by Dave Rich and yours truly). For those of you not able to join us at DVCon this year, consider this your consolation prize.

Although fewer in number, the articles in this issue of Verification Horizons are, I’m sure, as informative and useful as any you’ll see at DVCon. In particular, I’d like to make sure you check out these articles by our partners:

  • “Don’t Forget the Little Things That Can Make Verification Easier” by our friend Stu Sutherland of Sutherland HDL
  • “Taming Power-Aware Bugs with Questa Ultra” by SmartPlay Technologies
  • “Using Mentor Questa for pre-silicon validation of IEEE 1149.1-2013 based Silicon Instruments” by Intellitech
  • “Dealing With UVM and OVM Sequences” by eInfochips

If you’re at DVCon, please make sure to stop by the Mentor Graphics booth (#501) to say hi. Please join us on Wednesday for our luncheon presentation at noon, right after Session 8, in which I’ll present my paper mentioned above (that’s right: I’m not above shameless self-promotion). And we’ll wrap up the week with two Mentor-sponsored tutorials on Thursday.

Both of these tutorials feature a mix of Mentor presenters and customers to offer some practical examples that will give you some new ideas for improving your verification process. I hope to see you at DVCon.


27 February, 2014

Psst!  I’ll let you in on some news…

While DVCon calls the free portion of the conference “Exhibits Only,” let me share a little secret with you: you also gain access to the conference panels and the keynote presentation.

For those in Silicon Valley and local to DVCon, I invite you to register for the FREE side of the conference, not just for the conference exhibition, which will have drinks and appetizers in the evening hours, but for the industry conversation offered via the panels and the CEO keynote.  The two panels also feature Mentor Graphics speakers, so you can learn our opinions on the topics as well.

How do you secure your FREE pass?  That’s the simple part!  Go here and start the registration process by clicking the “REGISTER NOW” button in the upper right.  After entering your contact information and completing a brief survey, you will be asked to select the part of the conference you wish to attend.  Select “Exhibit Only” for no charge.  Then “checkout” to complete your registration and you are done!  Of course, you can also just show up and do this onsite.  But why waste time in line when you can do it from your computer or mobile device?

See you there!  You can find us at the Mentor Graphics booth, #501.  (P.S. If you cannot spare the time to attend but would like a running commentary on the sessions, panels and other happenings, follow me on Twitter: @dennisbrophy or look for the conference hashtag #DVCon.)

Now here is what you can get for free:

Panels

Is Software the Missing Piece In Verification?

Moderator: Ed Sperling – Semiconductor Engineering
Panelists: Tom Anderson – Breker
           Kenneth Knowlson – Intel
           Steve Chappell – Synopsys
           Sandeep Pendharkar – Vayavya Labs
           Frank Schirrmeister – Cadence Design Systems
           Mark Olen – Mentor Graphics
Location: Oak Ballroom
Date & Time: Wednesday, 5 March 2014, 8:30am – 9:45am

Did We Create the Verification Gap?

Moderator: John Blyler – Extension Media
Panelists: Janick Bergeron – Synopsys
           Jim Caravella – NXP
           Harry Foster – Mentor Graphics
           John Goodenough – ARM
           Bill Grundmann – Xilinx
           Mike Stellfox – Cadence Design Systems
Location: Oak Ballroom
Date & Time: Wednesday, 5 March 2014, 1:30pm – 3:00pm

Keynote

An Executive View of Trends and Technologies in Electronics
Speaker: Lip-Bu Tan, President & CEO, Cadence Design Systems
Location: Oak Ballroom
Date & Time: Tuesday, 4 March 2014, 2:00pm – 2:30pm

Exhibition

Monday: 5:00pm – 7:00pm (Booth Crawl included; attendees are eligible to win a $500 gift card!)
Tuesday: 2:30pm – 6:00pm (Reception 5:00pm – 6:00pm)
Wednesday: 2:00pm – 6:00pm (Reception 5:00pm – 6:00pm)


25 February, 2014

As DVCon expands, we at Mentor Graphics have grown our sponsored sessions as well.  Would you expect anything less?

In DVCon’s recent past, it was a tradition for the North American SystemC User Group (NASCUG) to sponsor a day of activity before the official start of the conference.  When OSCI merged with Accellera, the day before the official conference start grew to become Accellera Day, with a broader set of meetings and activities covering many of Accellera’s standards.  This has all grown into a more official part of the DVCon program.  On Monday at DVCon, or Accellera Day as many still call it, the tradeshow now opens as well.  I covered this in detail in an earlier blog, so I won’t repeat myself now.

The pre-conference education and meet-up to discuss the latest in standards development is joined by an end-of-conference tutorial series that has expanded from three parallel sessions to four.  Instead of the one tutorial we at Mentor Graphics would otherwise sponsor at DVCon, we will offer two in this expanded series. Given the impact verification has on design, it seems right that more time be devoted to topics that address it; one half-day tutorial is just too short to give the subject its due.

The two Mentor Graphics sponsored tutorials at DVCon, run back to back, will devote a day to exploring the application of current verification technology, by us and by users like you.  If you are already attending DVCon, you are making your tutorial selections now.  And for those interested only in the tutorials, DVCon offers a tutorials-only package ($145/tutorial).  Mentor’s two tutorials are described below.

The first tutorial references “smooth sailing,” not because it will be a “no-pirate zone,” although since International Talk Like a Pirate Day falls in late September, one won’t have to worry about a morning of pirate talk! [Interesting fun fact: Mentor Graphics’ headquarters in Wilsonville, OR USA is a short 50 miles (~80 km) north of the creators of this parodic holiday.]  The smooth sailing comes from the ability to easily use multiple engines, from simulation, formal, emulation and FPGA prototyping, to address your block- to system-level verification needs.

The second tutorial is all about formal.  Or, to put it more colloquially, we will answer the question: Whatsup with formal?  No, I doubt the tutorial will coin more slang terms for formal technology.  But it will certainly look at more focused applications of formal technology.  As a pioneer in focused formal applications (like clock domain crossing), we have seen these applications greatly simplify use and expand access to the technology for verification teams, with RTL design checks, X-state verification, and more joining the list.  Maybe we should ask WhatsApp with formal!  But wait, that slang question is already taken, and Facebook affirmed ownership with its recent $19B purchase.  Oh well, I lament.  Join me at this tutorial and we can explore a suitable replacement that is not yet taken.  I can’t think of a better way to close DVCon than to see if we can invent another $19B term (or app).


23 February, 2014

UVM 1.2 Release is Imminent

As vice chair of DVCon 2014, I can share with you that the Universal Verification Methodology (UVM) remains a topic of great interest.  It sets the pace for the tutorials, and given the pending release by Accellera, learning what is new in UVM 1.2 is a compelling reason to attend DVCon.

The Accellera Day tutorial series on Monday at DVCon is popular, with the UVM session drawing particular interest.  Aside from the “verification crisis” driving the need to explore this industry standard, the first major update to UVM is another reason for this interest.  The UVM tutorial is meant for the novice and expert alike; UVM experts can expect to walk away with more information on the new UVM 1.2 features and how they might plan to deploy them.
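To give a flavor of what is new, one of the more talked-about UVM 1.2 additions lets a sequence manage its own phase objection rather than the test raising and dropping it by hand.  A minimal sketch, using a hypothetical sequence class (take the tutorial and the release notes as the authoritative feature list):

    class my_init_seq extends uvm_sequence #(uvm_sequence_item);
      `uvm_object_utils(my_init_seq)

      function new(string name = "my_init_seq");
        super.new(name);
        // New in UVM 1.2: the sequence raises/drops the starting
        // phase's objection automatically around body().
        set_automatic_phase_objection(1);
      endfunction

      task body();
        // ... generate initialization items here ...
      endtask
    endclass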

Naturally, I suggest you consider registering for the conference to attend this tutorial.  (There are still a few seats left; but you will need to hurry!)

UVM Working Group Discussions

As a member of the Accellera UVM Working Group, I have asked the team to consider adopting the SystemC development practice of an open public review of a pending release of open source code.  While the merger of OSCI and Accellera to form Accellera Systems Initiative brought with it the OSCI style of public review, Accellera has not fully embraced it for all its projects.

To disclose a bit of insider conversation I had with the UVM WG this last week: I asked the group to confirm that we were going to bypass the “official” public review option and run only an internal 30-day review cycle before releasing to the public.  While the conclusion was to stay on the 30-day internal review path, the group also noted that anyone familiar with Git could locate the source code (and many have) and do testing.

Since bleeding-edge users know they can access the code as it is being developed, why not share the Git commands so everyone can gain access?  The group has done just that.  When last-minute changes for Release Candidate 4 were put in place, the Git script offering access for early review was shared publicly.  You can find this public message here, thanks to UVM WG member Adiel Khan (from Synopsys).
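If you want to follow along, the general shape of the commands is the standard Git workflow sketched below; the repository URL and tag are placeholders, so take the exact values from Adiel’s public message:

    # Placeholders: substitute the repository URL and RC tag from the
    # UVM WG announcement before running.
    git clone <uvm-repository-url> uvm-1.2-rc
    cd uvm-1.2-rc
    git checkout <release-candidate-tag>
    # then point your testbench at this checkout and run your regressions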

If you are a seasoned UVM user and are attending DVCon the week of March 3rd, I would encourage you to do some testing now so you can connect with the developers first hand.  And even if you are not attending DVCon but want to migrate to UVM 1.2, you might want to get an early start to determine what you might need to do to adopt this release.

If you are not going to attend the DVCon UVM tutorial and want a short update on what this version will offer, the UVM WG secretary, Adam Sherer (from Cadence), has put together a brief slide set that he presented at the TVS DVClub event in September 2013, available for download.  You may find it a useful companion to the download of the open source code.

Even if you are not attending DVCon, UVM adoption is globally substantial, and it is worth reflecting on the need for broader testing.  For the first releases of UVM this may not have mattered much, since few were using it and testing was limited mainly to the developers themselves.  As its popularity and adoption have grown, however, it is a good idea for the Accellera UVM Working Group to consider the impact of a new release on teams actively using UVM now.  While the UVM WG drives its release candidate and the UVM 1.2 standard to closure, you have the opportunity to give us feedback.  For those who have time, please do!

Mentor Commentary on Standards Development

Lastly, for those attending DVCon, check out our own Tom Fitzpatrick’s Wednesday morning paper, Of Camels and Committees: Standards Should Enable Innovation, Not Strangle It. His commentary on the development process may shed some additional light on how technology additions, changes and enhancements are judged for inclusion in updates to standards like UVM.

Resources:
- UVM 1.2 New Feature Presentation (Sept 2013): Download Here (Free)
- UVM 1.2 Public Review Instructions (Feb 2014): Download Here (Free)
- Mentor Commentary at DVCon: Register Here ($)


11 February, 2014

One of the nice things about DVCon is the update one can get from the developers of IEEE and Accellera standards.  And this year’s DVCon is no exception.  The four days of DVCon begin and end with tutorials that cover updates to popular standards like UVM, UPF, SystemC and more.  For our part, Mentor Graphics is participating in the development and delivery of these updates with our peers.

I have written in the past about the productivity challenges before us to address the verification crisis, and about the emergence of machine-to-machine communication and the Internet of Things driving power aware design and verification.  To advance the demands on improved verification and help address the verification crisis, the next round of the Universal Verification Methodology (UVM) standard is being readied for industry adoption.  UVM 1.2, the emerging update, will be covered in some detail in a Monday morning tutorial to help you learn “What’s Now and What’s Next.”  Mentor Graphics’ Tom Fitzpatrick, an Accellera Working Group representative, will present in this tutorial.

UVM 1.2 is an active development project of Accellera and has not yet been released so there is no official standard available for download and use yet.  I’ll share standardization details as they happen.

At the same time on Monday, those concerned with power aware design and verification can attend the tutorial on the Unified Power Format (UPF), or as it is officially called, IEEE 1801™-2013.  The tutorial will cover the full spectrum of UPF capabilities and methodology, from basic to advanced applications.  So if you are new to UPF and want to learn, this is a great tutorial to attend.  And if you are already an expert, the advanced applications of UPF highlighted by companies who have adopted it make this valuable for you as well.  Mentor Graphics’ Erich Marschner, vice-chair of the IEEE 1801 Working Group, will participate in this tutorial.

UPF is an official IEEE standard.  Have you downloaded your copy yet?  Accellera has worked with the IEEE to provide no-charge access to the official standard.  You can find the UPF standard here.
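If you have never looked inside a UPF file, the sketch below shows the flavor of power intent it captures: a power domain whose outputs are isolated when the domain is shut off.  The design and signal names are hypothetical, invented for illustration:

    # Hypothetical power intent for a switchable block u_core
    create_power_domain PD_core -elements {u_core}

    # Clamp u_core's outputs to 0 while the domain is powered down
    set_isolation iso_core -domain PD_core \
        -isolation_power_net VDD -isolation_ground_net VSS \
        -clamp_value 0 -applies_to outputs
    set_isolation_control iso_core -domain PD_core \
        -isolation_signal iso_en -isolation_sense high \
        -location parent

The point of the format is that this intent lives alongside the RTL, so simulation, emulation and implementation tools all verify the same power behavior.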

In the afternoon, there will be a session on case studies in SystemC, with user and vendor presentations exploring use of this standard.  SystemC offers much in the verification space, not just in technology but in learning how to bridge the RTL world with the transaction-level modeling world.  Mentor Graphics’ John Stickley will review what we have learned and how you can apply it to your most pressing verification needs.

SystemC is an official IEEE standard.  Have you downloaded your copy yet?  Under the Accellera agreement with the IEEE, you can download the SystemC standard here.

There is a lot more to DVCon than just the use of current standards and planning adoption of emerging standards.  I encourage you to check out the whole agenda and join me at DVCon 2014 March 3-6.

Mentor Graphics presentations during the conference include:

  • Tuesday Paper Sessions
    • Amit Srivastava – Stepping Into UPF 2.1 World: Easy Solution to Complex Power Estimation
    • Kenneth Bakalar – Interpreting UPF For A Mixed-Signal Design Under Test
    • Gordon Allan – Tried and Tested Speedups for Software-Driven SoC Simulation
  • Tuesday Poster Sessions
    • Rich Edelman – Debugging Communicating Systems: The Blame Game – Blurring the Line Between Performance Analysis and Debug
    • Matthew Ballance – Tackling Random Blind Spots with Strategy-Driven Stimulus Generation
    • Gaurav K. Verma – Supercharge Your Verification Using Rapid Expression Coverage as the Basis of a MC/DC-Compliant Coverage Methodology
    • Andreas Meyer – So You Think You Have Good Stimulus: System-Level Distributed Metrics Analysis and Results
    • Rich Edelman – UVM SchmooVM – I Want My C Tests!
    • Thom Ellis – Are You Really Confident That You Are Getting the Very Best From Your Verification Resources?
    • Jitesh Bansal – Is Your Power Aware Design Really X-Aware?
  • Wednesday Paper Sessions
    • Avidan Efody – Wiretap Your SoC: Why Scattering Verification IPs Throughout Your Design Is A Smart Thing To Do
    • Tom Fitzpatrick – Of Camels and Committees: Standards Should Enable Innovation, Not Strangle It

Mentor Graphics will host its traditional lunch at DVCon on Wednesday, on the theme of Accelerating Verification.  We have lively panel participants for the Tuesday and Wednesday panels.  And, as always, the Exhibit, CEO Keynote and Panels are open to all at no charge – you just have to REGISTER!

I look forward to seeing you there!


4 February, 2014

Marketing teams at FPGA vendors have been busy as the silicon nanometer geometry race escalates. Altera is “delivering the unimaginable” while Xilinx is offering “all programmable SoCs” to design centers. It’s clear that the SoC has become more accessible to a broader market today and that FPGA vendors have staked out a solid technology roadmap for the near future. But do the marketing messages surrounding the geometry race affect the day-to-day life of engineers, and if so, how, especially when it comes to verification?

An excellent whitepaper from Altera, “The Breakthrough Advantage for FPGAs with Tri-Gate Technology,” covers Altera’s Stratix 10 FPGAs and SoCs. The paper describes the verification challenges in this newly expanded market this way: “Although current generation FPGAs require a rigorous simulation verification methodology rivaling ASICs, the additional lab testing and ability to reprogram FPGAs save substantial manpower investment. The overall cost of ownership must be considered when comparing an FPGA whose component price is higher than an ASIC of similar complexity.” I believe you can use this statement to engage your management in a discussion about better verification processes.

Xilinx also has excellent published technical resources. Its recent UltraScale backgrounder describes how they are solving the challenges in implementing a design with their reprogrammable silicon. Clearly Xilinx has made an impressive investment to make it easier to implement a design with its FPGA UltraScale products. Improvements include ASIC-like clocking and annealing dataflow bottlenecks without compromising performance. Xilinx also describes improvements when using its Vivado design suite, particularly when it comes to in-lab design bring up.

For other FPGA insights, it’s also worth checking out Electronics Engineering Journal’s recent article “Proliferating Programmability in 2014,” which argues that the long-term future of FPGAs hinges on tool flows, even though, as Kevin Morris sees it, EDA seems to have abandoned the market. (Kevin, I’m here to tell you you’re wrong.)

Do you think it’s inevitable that your FPGA team will first struggle to make it across the verification finish line before adopting a more process-oriented verification flow like the ASIC market demands? It’s not. I base this conclusion on the many conversations I’ve had over the years with FPGA designers, their managers, sales engineers and many other talented people in this market. Yes, there are significant challenges in FPGA design, but not all of them are technology related. With some emotion, one engineer remarked that debugging the same type of issue over and over in the hardware lab and expecting a different outcome was insane. (He’s right.) Others say they need specific ROI information before their management will even accept the need for change. Still others state that had they known about the solutions I presented in my seminar a year ago, they would not have spent months and months bringing up their design in the lab.

With my peers here at Mentor Graphics, I have developed a three-step verification flow that includes coverage, assertions and improved throughput. I’ll write about this flow and related issues in the weeks ahead here on this blog. The flow is built on fundamental verification technologies that benefit the broad FPGA market. The goal, in developing the technology and writing about it here, has been to provide practical solutions and help more FPGA teams cross the verification gap.
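As a preview of the coverage step, the covergroup sketch below shows the sort of functional coverage an FPGA team can start with before ever touching the lab; the response and burst-length signals are hypothetical:

    // Hypothetical response-monitor coverage: are we exercising both
    // good and error responses across short and long bursts?
    covergroup cg_resp @(posedge clk);
      cp_resp  : coverpoint resp {
        bins okay  = {0};
        bins error = {2, 3};
      }
      cp_burst : coverpoint burst_len {
        bins single  = {1};
        bins short_b = {[2:4]};
        bins long_b  = {[5:16]};
      }
      resp_x_burst : cross cp_resp, cp_burst;
    endgroup

    cg_resp cov = new();  // instantiate inside the bus monitor

Holes in a report like this tell you what the testbench never tried, which is far cheaper to learn in simulation than during lab bring-up.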

In the meantime, what are your stories? Are you able to influence your management into adopting advanced technology to aid lab bring-up? Is your management’s bias towards lower cost and faster implementation (at the expense of verification)? Let me know in the comments or, if you prefer, by e-mail: joe_rodriguez@mentor.com.


6 January, 2014

The UCIS Story

It is no secret that as design sizes grow, the burden on verification grows doubly.  Two factors that are easy to measure are the time it takes to simulate a design and the size of the dataset that contains the results of the verification runs. Simulation times are growing and the datasets are getting larger.  While time and attention are given to accelerating verification through emulation or alternate verification methods to reduce run times, less explored is the impact of larger datasets on verification closure.  How does one find bugs within datasets that are so large?  How can verification results from simulation, emulation, formal and more be brought together to help drive verification closure?  How can one link failures in verification back to requirements?

The Accellera standards organization took a multi-year journey to address these issues, arriving at the creation of the Unified Coverage Interoperability Standard (UCIS).  You can get your free copy here if you would like to read and use it.  Mentor Graphics contributed a significant starting point to the standard and collaborated with major competitors and users to add to and extend it from there.  But now that the standard is done, what does one do with it?

While that was a rhetorical question when the standard was completed in 2012, today it begs an answer.

From my perspective there are two classes of UCIS users.  The more immediate users are those building verification tools that must contend with design and verification complexity now.  With UCIS they have the initial underpinnings to add product features that allow a level of data portability that was not present before the standard.  The second class of users are those who will use the UCIS Application Programming Interface (API) to build functions that perform simple and complex tasks on these large datasets.  This last class of user, who might exchange UCIS API code with one another, has yet to materialize.  But the stage is set for them.
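To give a feel for that second class of use, here is a rough C sketch that opens a coverage database and walks its top-level scopes.  The call and property names are paraphrased from my reading of the UCIS specification, so treat the exact signatures as assumptions to verify against your copy of the standard:

    #include <stdio.h>
    #include "ucis.h"  /* API header defined by the UCIS standard */

    int main(int argc, char* argv[])
    {
        /* Open the database named on the command line */
        ucisT db = ucis_Open(argv[1]);

        /* Iterate the scopes beneath the root (NULL) */
        ucisIteratorT it = ucis_ScopeIterate(db, NULL, -1 /* all types */);
        ucisScopeT scope;
        while ((scope = ucis_ScopeScan(db, it)) != NULL) {
            printf("%s\n",
                   ucis_GetStringProperty(db, scope, -1,
                                          UCIS_STR_SCOPE_NAME));
        }

        ucis_FreeIterator(db, it);
        ucis_Close(db);
        return 0;
    }

Utility functions like this, written against the standard API rather than a vendor database, are exactly the kind of code that second class of users could one day exchange.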

To highlight what the first class of UCIS adopters has been doing, DVClub in Europe will tackle the question of what one can do with UCIS on Monday, 13 January 2014.  Darron May, Product Manager at Mentor Graphics, will speak for us on our application of the standard.  His session is titled Blending Metrics from Multiple Verification Engines to Improve Productivity.  You can find more details about the DVClub event (speakers and presentation abstracts) and register here to attend in person or via remote access.  The event will be held 12:00-14:00 GMT and is free.


28 November, 2013

Wow! I’ve been on the road since August, and finally found a spare moment to get back to this blog. I started this blog series with a prologue that gave a little bit of background on the 2012 Wilson Research Group study. And with this epilogue, I will draw the series to a conclusion with some insight on the study process. Many people are cynics about industry studies in general (and particularly ones based on surveys) and believe that they are inaccurate, unreliable, and biased. However, I believe that the benefit from an industry study is not necessarily the quantitative values that the answers reveal, but the new questions they raise.

With that said, it is important to understand that the 2007 FarWest Research study and the 2010 and 2012 Wilson Research Group studies followed the format of the original 2002 and 2004 Ron Collett International studies, which have certain limitations. For example, the Collett studies were very block-, IP-, and RTL-focused studies. The data that these studies revealed certainly is of value and important, but it doesn’t represent some of the challenges in SoC integration verification and system validation that have recently emerged. In fact, many of the techniques used for block and subsystem verification that these surveys studied (such as constrained-random, functional coverage, general formal property checking) do not scale well to the SoC integration and system-level validation space. I believe that future studies should be expanded to include these emerging challenges.

Another criticism of all the previous studies is that the presented data is aggregated across all market segments, ranging from mil/aero, mobile, consumer, networking, computers, and so forth. Hence, it can be difficult for those in a particular market segment to benchmark themselves against or relate to the study data. This is a valid argument. With our two recent Wilson Research studies, it is possible to filter the data down for presentation to a specific market segment; this was not possible with the previous studies.

Nonetheless, there is still value in observing general industry trends between the multiple studies as long as the format of the studies is consistent. Consistency is something we strived for across the studies. For example, whenever possible, we tried to maintain the exact wording of the questions originally used in the Collett studies.

Minimizing Study Biases

When architecting a study, there are three main concerns that must be addressed to ensure valid results: (1) sample validity, (2) non-response bias, and (3) stakeholder bias. I’ll briefly review each of these concerns in the following paragraphs and discuss the steps we took to minimize them.

(1)    Sample validity or undercoverage bias:  To ensure that a study is unbiased, it’s critical that every member of a studied population have an equal chance of participating. An example of a biased study would be a technical conference surveying its own participants. The data might raise some interesting questions, but unfortunately it doesn’t represent the members of the population who were unable to participate in the conference. The same bias can occur when a journal or online publication simply surveys its subscribers.

A classic example of this problem is the famous Literary Digest poll in the 1936 presidential election, for which the magazine surveyed over two million people, a huge study for this period in time. The pool (or make-up) of the study was drawn from the magazine’s subscriber list, phone books, and car registrations. The problem with this approach was that the study did not represent the actual voter population, since a magazine subscription, a phone, or a car was a luxury during the Great Depression. As a result of this biased sample, the poll inaccurately predicted that Republican Alf Landon would defeat Democrat Franklin Roosevelt in the 1936 presidential election.

For the 2012 Wilson Research Group study, we carefully chose a broad set of lists that, when combined, represented all regions of the world and all electronic design market segments. We reviewed the participant results in terms of market segments to ensure no segment or region representation was inadvertently excluded or under-represented.

(2)    Non-response bias: Non-response bias occurs when a randomly sampled individual cannot be contacted or refuses to participate in a survey. For example, spam and unsolicited mail filters can remove an individual from the possibility of receiving an invitation to participate, which can bias results. It is important to validate that sufficient responses occurred across all lists that make up the study pool. Hence, we reviewed the final results to ensure that no single list of respondents in the participant pool dominated the final results.

Another potential non-response bias is due to lack of language translation. In fact, we learned this during the 2010 Wilson Research Group study. The study generally had good representation from all regions of the world, with the exception of an initially very poor level of participation from Japan. To solve this problem, we took two actions: (1) we translated both the invitation and the survey into Japanese, (2) we acquired additional engineering lists directly from Japan to augment our existing survey invitation list. This resulted in a balanced representation from Japan. Based on that experience, we took the same approach to solve the language problem for the 2012 study.

(3)    Stakeholder bias: Stakeholder bias occurs when someone who has a vested interest in the survey results completes the online survey multiple times, or urges others to complete it, in order to influence the results. To address this problem, a special code was generated for each study invitation that was sent out. The code could only be used once to fill out the survey, preventing someone from taking the study multiple times or sharing the invitation with someone else.

2010 Study Bias

After analyzing the results from the 2012 study, we are confident that it was balanced across market segments and regions of the world, which was our goal. However, while architecting the 2012 study, we discovered a non-response bias in the 2010 study. Although multiple lists across multiple market segments and regions of the world were used during the 2010 study, a single list dominated the responses, one consisting of participants who worked on more advanced projects and whose functional verification processes tended to be mature. As a result of this bias, if you look at Figure 3 concerning languages used to create testbenches, the industry-wide adoption of SystemVerilog is likely lower than what was shown for 2010. This means that industry adoption between 2010 and 2012 would have increased slightly more than indicated, and the growth between 2007 and 2010 would have been slightly less.

The 2007 study, like the 2012 study, was well balanced and did not exhibit the non-response bias described above for the 2010 data. Hence, we have confidence in talking about general industry trends between 2007 and 2012. The 2010 data can still be useful as a reference point during discussion, as long as you keep in mind that it represents a more process-mature segment of the population.

Our plan is to commission a new study in 2014. At that point, we should have sufficient data to start showing trends in the FPGA space (something we have not been able to do yet) and have a clearer picture of emerging trends in the non-FPGA space. However, as previously stated, the emerging challenges today are occurring in the SoC integration verification and system-level validation space. We will either reduce the scope of our existing IP/RTL-focused study to make room for new questions in these spaces, or conduct a separate, rigorous study for these new emerging challenges.

Final word: I hope the data presented in this set of blogs has provided some insight into general trends. But more importantly, I hope it has inspired you with questions about your own processes that you might want to investigate.
