Posts Tagged ‘Assertion-Based Verification’

26 August, 2013

Verification Techniques & Technologies Adoption Trends (Continued)

This blog is a continuation of a series of blogs that present the highlights from the 2012 Wilson Research Group Functional Verification Study (for a background on the study, click here).

In my previous blog (Part 10 click here), I presented verification techniques and technologies adoption trends, as identified by the 2012 Wilson Research Group study. In this blog, I continue those discussions and focus on formal verification, acceleration/emulation, and FPGA prototyping.

For years, the term “formal verification” has bugged me since it is quite often misunderstood in the industry. The problem originated back in the mid-1990s with the emergence of formal equivalence checking tools from various EDA vendors, such as Chrysalis Symbolic Design. These tools were introduced to the market as formal verification, which is technically a true statement. However, a range of tools falls under the formal verification category, such as formal property checkers and equivalence checkers.

So, what’s the problem? The question related to formal property checking in prior studies could have been misinterpreted by some participants to mean equivalence checking, which reduces the confidence in the results. To prevent this misinterpretation, we decided to change the question in 2012 to clarify that we were talking about the formal verification of assertions and clearly state “not equivalence checking” in the question.

One other thing we wanted to learn in the formal verification space during this study was what percentage of the market was using auto-formal analysis tools (such as X safety checks, deadlock detection, reset analysis, etc.) versus formal property checking tools. The previous studies never made this distinction.

The fact that we changed the question related to formal property checking while adding in auto-formal in the 2012 study means that there is no meaningful way to compare this study’s formal verification results to the formal verification results from prior studies.

Formal Technology Adoption Trends

Figure 1 shows the adoption percentages for formal property checking and auto-formal techniques.

Figure 1. Formal Technology Adoption

We found that about five percent of the participants who are applying auto-formal techniques are not doing formal property checking. This means that the combined adoption of formal property checking and auto-formal techniques is about 32 percent. As a point of reference, the 2007 FarWest Research study found 19 percent adoption for formal verification—and the 2010 study found the adoption at 29 percent. Both the 2007 and 2010 studies included the potential erroneous responses associated with formal equivalence checking, as well as auto-formal usage.

Figure 2 provides a different analysis of the formal property adoption data by partitioning the results by design sizes. The design size partitions are represented as: less than 5M gates, 5M to 20M gates, and greater than 20M gates.

Figure 2. Formal property checking adoption by design size

Acceleration/Emulation & FPGA Prototyping Adoption Trends

The amount of time spent in a simulation regression is an increasing concern for many projects. Intuitively, we tend to think that the design size influences simulation performance. However, there are two equally important factors that must be considered: number of tests in the simulation regression suite and the length of each test in terms of clock cycles.

For example, a project might have a small or moderate-sized design, yet verification of this design requires a long running test (e.g., a video input stream). Hence, in this example, the simulation regression time is influenced by the number of clock cycles required for the test and not necessarily the design size itself.
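To make the point concrete with a purely hypothetical example (these numbers are illustrative, not from the study): a regression of 500 directed tests averaging 2 million clock cycles each amounts to 1 billion simulated cycles. At an assumed simulator throughput of 1,000 cycles per second, that is roughly 1 million CPU-seconds, or about 11.5 days of serial simulation, and still more than five hours even when spread across 50 machines. A design ten times larger with a handful of short tests could easily regress faster than that.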

Figure 3 shows the number of directed tests created to verify a design in simulation (i.e., the regression suite). The findings obviously varied dramatically from a handful of tests to thousands of tests in a regression suite, depending on the design.

Figure 3. Number of directed tests created to verify a design

The increase in tests in the range of 1-100 is interesting to note. Is this due to the increase in adoption of constrained-random verification techniques in the past few years? Or is something else going on here? This line of questioning illustrates the value of reviewing various industry studies. That is, the value lies not so much in the absolute numbers a study presents as in the questions the new data raises.

Next, let’s look at regression times, as shown in Figure 4. As you can see, regression time also varies dramatically, from short regression times for some projects to multiple days for others. The median simulation regression time is about 16-24 hours. Here, we also see an increase in shorter regression times. Again, this data raises some interesting questions that are worth exploring.

Figure 4. Simulation regression time trends

One technique that is often used to speed up simulation regressions (whether the problem is very long tests, a large number of tests, or both) is hardware-assisted acceleration or emulation. In addition, FPGA prototyping, while historically used as a platform for software development, has recently served a role in SoC integration validation.

Figure 5 shows the adoption trend for both HW-assisted acceleration/emulation and FPGA prototyping by comparing the 2007 Far West Research study (in gray), the 2010 Wilson Research Group study (in blue), and the 2012 Wilson Research Group study (in green). We see a continual rise in HW acceleration and emulation. This is due not only to the need to verify larger designs, or designs with long test times. HW acceleration and emulation have also become the key platform for SoC integration verification, where both hardware and software are integrated into a system for the first time. In addition, emulation is being used increasingly as a software development platform.

Figure 5. HW-assisted acceleration/emulation and FPGA Prototyping trends

Note that the adoption of FPGA prototyping has remained flat (or decreased slightly as the 2012 data suggest). This might seem counter-intuitive since we previously saw a trend in terms of the increase in SoC class designs. So what’s going on?

Figure 6 partitions the data for HW-assisted acceleration/emulation and FPGA prototyping adoption by design size: less than 1M gates, 1M to 20M gates, and greater than 20M gates. Notice that the adoption of HW-assisted acceleration/emulation continues to increase as design sizes increase. However, the adoption of FPGA prototyping rapidly drops off as design sizes increase beyond 20M gates. 

Figure 6. Acceleration/emulation and FPGA prototyping adoption by design size

This graph illustrates one of the problems with FPGA prototyping of very large designs: the increased engineering effort required to partition the design across multiple FPGAs. In fact, what I have found is that FPGA prototyping of very large designs is often a major engineering effort in itself, and that many projects are seeking alternative solutions to address this problem.

In my next blog (click here), I will present the final data I plan to share from the Wilson Research Group study. This blog will focus on results in terms of meeting schedules, required spins, and classes of bugs contributing to respins. I will then wrap up this series of blogs in what I call the Epilogue—which will discuss potential gotchas and cautions on interpreting certain aspects of the data and thoughts about how the data could be used constructively.


19 August, 2013

Verification Techniques & Technologies Adoption Trends

This blog is a continuation of a series of blogs that present the highlights from the 2012 Wilson Research Group Functional Verification Study (for background on the study, click here).

In my previous blog (Part 9 click here), I focused on some of the 2012 Wilson Research Group findings related to design and verification language and library trends. In this blog, I present verification techniques and technologies adoption trends, as identified by the 2012 Wilson Research Group study.

An interesting trend we are starting to see is that the electronic industry is maturing its functional verification processes, whether designs are targeted at IC/ASIC or FPGA implementations. This blog provides data to support this claim. An interesting question you might ask is, “What is driving this trend?” In some of my earlier blogs (click here for Part 1 and Part 2) I showed that design complexity is increasing in terms of design sizes and number of embedded processors. In addition, I’ve presented trend data that showed an increase in total project time and effort spent in verification (click here for Part 5 and Part 6). My belief is that the industry is being forced to mature its functional verification processes to address increasing complexity and effort.

Simulation Techniques Adoption Trends

Let’s begin by comparing non-FPGA adoption trends related to various simulation techniques from the 2007 Far West Research study (in blue) with the 2012 Wilson Research Group study (in green), as shown in Figure 1.

Figure 1. Simulation-based technique adoption trends for non-FPGA designs

You can see that the study finds the industry increasing its adoption of various functional verification techniques for non-FPGA targeted designs. Clearly the industry is maturing its processes as I previously claimed.

For example, in 2007, the Far West Research Group found that only 48 percent of the industry performed code coverage. This surprised me. After all, HDL-based code coverage is a technology that has been around since the early 1990s. However, I did informally verify the 2007 results through numerous customer visits and discussions. In 2012, we see that the industry adoption of code coverage has increased to 70 percent.

In 2007, the Far West Research Group study found that 37 percent of the industry had adopted assertions for use in simulation. In 2012, we find that industry adoption of assertions had increased to 63 percent. I believe that the maturing of the various assertion language standards has contributed to this increased adoption.

In 2007, the Far West Research Group study found that 40 percent of the industry had adopted functional coverage for use in simulation. In 2010, the industry adoption of functional coverage had increased to 66 percent. Part of this increase in functional coverage adoption has been driven by the increased adoption of constrained-random simulation, since you really can’t effectively do constrained-random simulation without doing functional coverage.
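To make that connection concrete, here is a minimal SystemVerilog sketch (hypothetical class and field names, not drawn from the study) showing why the two techniques go hand in hand: the constraints shape what the randomizer is allowed to generate, and the covergroup is the only record of which of those legal combinations were actually hit.

    // Hypothetical packet with constrained-random fields and an embedded
    // covergroup that records which legal combinations were generated.
    class packet;
      rand bit [3:0] length;
      rand bit       parity_en;

      constraint c_len { length inside {[1:8]}; }  // only legal lengths

      covergroup cg;
        cp_len    : coverpoint length { bins small = {[1:4]}; bins large = {[5:8]}; }
        cp_parity : coverpoint parity_en;
        len_x_par : cross cp_len, cp_parity;       // the combinations we care about
      endgroup

      function new();
        cg = new();                                // instantiate the embedded covergroup
      endfunction
    endclass

    module tb;
      initial begin
        packet p = new();
        repeat (100) begin
          void'(p.randomize());                    // constrained-random stimulus
          p.cg.sample();                           // functional coverage records what was hit
        end
        $display("functional coverage = %f%%", p.cg.get_coverage());
      end
    endmodule

Without the covergroup, the constrained-random loop would run blind; the coverage results are what tell you whether the randomization actually exercised the cases you care about.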

Now let’s compare FPGA adoption trends related to various simulation techniques from the 2010 Wilson Research Group study (in pink) with the 2012 Wilson Research Group study (in red), as shown in Figure 2.

Figure 2. Simulation-based technique adoption trends for FPGA designs

Again, you can clearly see that the industry is increasing its adoption of various functional verification techniques for FPGA targeted designs. This past year I have spent a significant amount of time in discussions with FPGA project managers around the world. During these discussions, most managers mention the drive to improve the verification process within their projects due to the rising complexity of this class of designs. The Wilson Research Group data supports these claims.

In fact, Figure 3 illustrates this maturing trend in the FPGA space, where we saw a 15 percent increase in the adoption of RTL simulation and an 8.5 percent increase in the adoption of code coverage. For complex FPGA designs, the traditional approach of “burn and churn” and debug in the lab is no longer a viable option. Nonetheless, it is still somewhat alarming that 31 percent of the FPGA study participants work on projects that perform no RTL simulation.

Figure 3. FPGA projects maturing their verification processes

Signoff Criteria Trends

We saw earlier in this blog the increased adoption of coverage techniques in the industry. Coverage has become a major component of a project’s verification signoff criteria. In Figure 4, we see how coverage has increased in importance in verification signoff criteria within the past five years, while other decision attributes have declined in terms of importance.

Figure 4. Non-FPGA functional verification signoff criteria trends

We see the same trends for FPGA designs, as shown in Figure 5.

Figure 5. FPGA functional verification signoff criteria trends

In my next blog (click here), I plan to continue the discussion related to adoption of various verification technologies and techniques as identified by the 2012 Wilson Research Group study.


12 August, 2013

Language and Library Trends (Continued)

This blog is a continuation of a series of blogs that present the highlights from the 2012 Wilson Research Group Functional Verification Study (for a background on the study, click here).

In my previous blog (Part 8 click here), I focused on design and verification language trends, as identified by the Wilson Research Group study. This blog presents additional trends related to verification language and library adoption trends.

You might note that for some of the language and library data I present, the percentage sums to more than 100 percent. The reason for this is that some participants’ projects use multiple languages or multiple testbench methodologies.

Testbench Methodology Class Library Adoption

Now let’s look at testbench methodology and class library adoption for IC/ASIC designs. Figure 1 shows the trends in terms of methodology and class library adoption by comparing the 2010 Wilson Research Group study (in blue) with the 2012 study (in green). Today, we see a downward trend in terms of adoption of all testbench methodologies and class libraries with the exception of UVM, which has increased by 486 percent since the fall of 2010. The study participants were also asked what they plan to use within the next 12 months, and based on the responses, UVM is projected to increase an additional 46 percent.

Figure 1. Methodology and class library trends

Figure 2 shows the adoption of testbench methodologies and class libraries for FPGA designs (in red). We do not have sufficient data to show prior adoption trends in the FPGA space, but we anticipate that our future studies will enable us to do this. However, we did ask the FPGA study participants which testbench methodologies and class libraries they were planning to adopt within the next 12 months. Based on these responses, we anticipate that UVM adoption will increase by 40 percent and OVM adoption by 24 percent in the FPGA space.

Figure 2. Methodology and class library trends

Assertion Languages and Libraries

Finally, let’s examine assertion language and library adoption for IC/ASIC designs. The Wilson Research Group study found that 63 percent of all the IC/ASIC participants have adopted assertion-based verification (ABV) as part of their verification strategy. The data presented in this section shows the assertion language and library adoption trend related to those participants who have adopted ABV.

Figure 3 shows the trends in terms of assertion language and library adoption by comparing the 2010 Wilson Research Group study (in blue), the 2012 Wilson Research Group study (in green), and the projected adoption trends within the next 12 months (in purple). The adoption of SVA continues to increase, while other assertion languages and libraries either remain flat or decline.

Figure 3. Assertion language and library adoption for Non-FPGA designs

Figure 4 shows the adoption of assertion language trends for FPGA designs (in red). Again, we do not have sufficient data to show prior adoption trends in the FPGA space, but we anticipate that our future studies will enable us to do this. We did ask the FPGA study participants which assertion languages and libraries they planned to adopt within the next 12 months. Based on these responses, we anticipate an increase in adoption for OVL, SVA, and PSL in the FPGA space within the next 12 months.

Figure 4. Trends in assertion language and library adoption for FPGA designs

In my next blog (click here), I plan to focus on the adoption of various verification technologies and techniques used in the industry, as identified by the 2012 Wilson Research Group study.


23 April, 2013

This is the first in a series of blogs that presents the results from the 2012 Wilson Research Group Functional Verification Study.

Study Overview

In 2002 and 2004, Ron Collett International, Inc. conducted its well known ASIC/IC functional verification studies, which provided invaluable insight into the state of the electronic industry and its trends in design and verification. However, after the 2004 study, no other industry studies were conducted, which left a void in identifying industry trends.

To address this void, Mentor Graphics commissioned Far West Research to conduct an industry study on functional verification in the fall of 2007. Then in the fall of 2010, Mentor commissioned Wilson Research Group to conduct another functional verification study. Both of these studies were conducted as blind studies to avoid influencing the results. This means that the survey participants did not know that the study was commissioned by Mentor Graphics. In addition, to support trend analysis on the data, both studies followed the same format and questions (when possible) as the original 2002 and 2004 Collett studies.

In the fall of 2012, Mentor Graphics commissioned Wilson Research Group again to conduct a new functional verification study. This study was also a blind study and follows the same format as the Collett, Far West Research, and previous Wilson Research Group studies. The 2012 Wilson Research Group study is one of the largest functional verification studies ever conducted. The overall confidence level of the study was calculated to be 95% with a margin of error of 4.05%.

Unlike the previous Collett and Far West Research studies that were conducted only in North America, both the 2010 and 2012 Wilson Research Group studies were worldwide studies. The regions targeted were:

  • North America: Canada, United States
  • Europe/Israel: Finland, France, Germany, Israel, Italy, Sweden, UK
  • Asia (minus India): China, Korea, Japan, Taiwan
  • India

The survey results are compiled both globally and regionally for analysis.

Another difference between the Wilson Research Group and previous industry studies is that both of the Wilson Research Group studies also included FPGA projects. Hence for the first time, we are able to present some emerging trends in the FPGA functional verification space.

Figure 1 shows the percentage makeup of survey participants by their job description. The red bars represent the FPGA participants, while the green bars represent the non-FPGA (i.e., IC/ASIC) participants.

 

Figure 1: Survey participants’ job title description

Figure 2 shows the percentage makeup of survey participants by company type. Again, the red bars represent the FPGA participants, while the green bars represent the non-FPGA (i.e., IC/ASIC) participants.

Figure 2: Survey participants’ company description

In a future set of blogs, over the course of the next few months, I plan to present the highlights from the 2012 Wilson Research Group study along with my analysis, comments, and obviously, opinions. A few interesting observations emerged from the study, which include:

  1. FPGA projects are beginning to adopt advanced verification techniques due to increased design complexity.
  2. The effort spent on verification is increasing.
  3. The industry is converging on common processes driven by maturing industry standards.

A few final comments concerning the 2012 Wilson Research Group Study. As I mentioned, the study was based on the original 2002 and 2004 Collett studies. To ensure consistency in terms of proper interpretation (or potential error related to misinterpretation of the questions), we have avoided changing or modifying the questions over the years, with the exception of questions that relate to shrinking geometry sizes and gate counts. One other exception relates to introducing a few new questions on verification techniques that were not a major concern ten years ago (such as low-power functional verification). Ensuring consistency in the line of questioning enables us to have high confidence in the trends that emerge over the years.

Also, the method by which the study pool was created follows the same process as the original Collett studies. It is important to note that the data presented in this series of blogs does not represent trends related to silicon volume (that is, a few projects could dominate in terms of the volume of manufactured silicon and not represent the broader industry). The data in this series of blogs represents trends related to the study pool, which is a fair proxy for active design projects.

My next blog presents current design trends that were identified by the survey. This will be followed by a set of blogs focused on the functional verification results.

Also, to learn more about the 2012 Wilson Research Group study, view my pre-recorded Functional Verification Study web-seminar, which is located on the Verification Academy website.

Quick links to the 2012 Wilson Research Group Study results (so far…)


27 July, 2012

At the 2012 Design Automation Conference, I had the pleasure of moderating a panel at a workshop titled “Post-Silicon Debug: Technologies, Methodologies, and Best-Practices.” This workshop brought together a collection of experts from industry, academia, and EDA to discuss the emerging challenges and solutions associated with post-silicon validation. The speakers presented different instrumentation strategies, as well as methods of using data collected by the debug logic to facilitate fast and efficient debug.

Performing verification on real silicon introduces a number of new and unique challenges. On the one hand, real silicon offers great execution speed, which enables long test runs that reach deep into the design’s state space. On the other hand, real silicon lacks both good controllability and observability, which serve an important role in pre-silicon verification. Assertions, which have always been one of my passions, have been shown to address both the controllability and observability challenges associated with pre-silicon verification (for example, RTL simulation). And now, there is emerging interest in addressing these same challenges in post-silicon validation.

I’d like to invite you to check out my Tech Design Forum article titled Synthesizing assertion into hardware for faster debug. Obviously, synthesizing hardware assertions is only one of many new solutions currently being explored to contain the growing cost and effort associated with post-silicon debug. One attractive benefit of assertion-based techniques is that they provide a natural link, in terms of reuse, between pre-silicon verification and post-silicon validation.
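To illustrate the reuse argument, here is a minimal sketch of the kind of assertion involved (hypothetical FIFO signal names, and not taken from the article): the same property that a simulator or formal tool checks pre-silicon can, with an assertion-synthesis flow, be mapped to a small on-chip monitor whose failure flag is routed to post-silicon debug logic.

    module fifo_checks (
      input logic clk,
      input logic rst_n,
      input logic push,
      input logic pop,
      input logic full,
      input logic empty
    );
      // Never push into a full FIFO...
      a_no_overflow : assert property (@(posedge clk) disable iff (!rst_n)
                                       !(push && full))
        else $error("push while full");

      // ...and never pop from an empty one.
      a_no_underflow : assert property (@(posedge clk) disable iff (!rst_n)
                                        !(pop && empty))
        else $error("pop while empty");
    endmodule

    // One possible use model: bind the checker to the design under test so
    // the RTL itself stays untouched.
    // bind fifo fifo_checks u_fifo_checks (.*);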

I’d like to hear your opinions concerning synthesizing hardware assertions, as well as post-silicon debugging challenges in general.


20 July, 2012

Live & In-Person at DAC 2012!

Verification Academy, the brainchild of Harry Foster, Chief Verification Scientist at Mentor Graphics, was live from the Design Automation Conference tradeshow floor this year. Harry is pictured to the right giving an update on his popular verification survey from the DAC tradeshow floor.

The Verification Academy, predominantly a web-based resource, is a popular site for verification information, with more than 11,000 registered members who have forum access on topics ranging from OVM/UVM and SystemVerilog to Analog/Mixed-Signal design. The popular OVM/UVM Cookbook, which used to be available as a print edition, is now a live online resource there as well. A whole host of educational modules and seminars can also be found there.

If you know about the Verification Academy, you know all about the content mentioned above and that there is much more to be found there. For those who don’t know as much about it, Harry took a break from the Verification Academy booth at DAC to discuss the Verification Academy with Luke Collins, Technology Journalist, Tech Design Forum. (Flash is required to watch Harry discuss Verification Academy with Luke.)

The Verification Academy at DAC was a great venue to connect in person with other Verification Academy users to discuss standards, methodologies, flows, and other industry trends. Each hour there were short presentations by Verification Academy members that proved to be a popular way to start some interesting conversations. While we realize not all Verification Academy members were able to attend DAC in person, we know many have expressed an interest in some of the presentations. Verification Academy “Total Access” members now have access to many of the presentations.

 

  • ARM
  • Doulos
  • Thales Alenia Space
  • Test & Verification Solutions
  • Willamette HDL
  • Sunburst Design
  • Mentor Graphics

Total Access members can also download all the presentations in a .zip file.  Happy reading to all those who were unable to visit us at DAC and thank you to all who were able to stop by and visit.


13 May, 2011

Language and Library Trends

This blog is a continuation of a series of blogs, which present the highlights from the 2010 Wilson Research Group Functional Verification Study (for a background on the study, click here).

In my previous blog (Part 7 click here), I focused on some of the 2010 Wilson Research Group findings related to testbench characteristics and simulation strategies. In this blog, I present design and verification language trends, as identified by the Wilson Research Group study.

You might note that for some of the language and library data I present, the percentage sums to more than one hundred percent. The reason for this is that some participants’ projects use multiple languages and multiple methodologies.

Design Languages

Let’s begin by examining the languages used for design, as shown in Figure 1. Here, we compare the results for languages used to design FPGAs (in grey) with languages used to design non-FPGAs (in green).


Figure 1. Languages used for design

Not too surprisingly, we see that VHDL is the most popular language used for the design of FPGAs, while Verilog and SystemVerilog are the most popular languages used for the design of non-FPGAs.

Figure 2 shows the trends in terms of languages used for design, by comparing the 2007 Far West Research study (in blue) with the 2010 Wilson Research Group study (in green), as well as the projected design language adoption trends within the next twelve months (in purple). Note that the design language adoption is declining for most of the languages with the exception of SystemVerilog whose adoption is increasing.


Figure 2. Trends in languages used for design

Verification Languages

Next, let’s look at the languages used for verification (that is, languages used to create simulation testbenches). Figure 3 compares the results between FPGA designs (in grey) and non-FPGA designs (in green).

Figure 3. Languages used in verification to create simulation testbenches

And again, it’s not too surprising to see that VHDL is the most popular language used to create verification testbenches for FPGAs, while SystemVerilog  is the most popular language used to create testbenches for non-FPGAs.

Figure 4 shows the trends in terms of languages used to create simulation testbenches by comparing the 2007 Far West Research study (in blue) with the 2010 Wilson Research Group study (in green), as well as the projected language adoption trends within the next twelve months (in purple). Note that verification language adoption is declining for most of the languages with the exception of SystemVerilog whose adoption is increasing.


Figure 4. Trends in languages used in verification to create simulation testbenches

Now, let’s look at methodology and class library adoption. Figure 5 shows the future trends in terms of methodology and class library adoption by comparing the 2010 Wilson Research Group study (in green) with the projected adoption trends within the next twelve months (in purple). Previous studies did not include data on methodology and class library adoption, so we are unable to show previous trends.


Figure 5. Methodology and class library future trends

The study indicates that the only methodologies whose adoption is projected to grow in the next twelve months are OVM and UVM.

Assertion Languages and Libraries

Finally, let’s examine assertion language and library adoption, as shown in Figure 6.  Here, we compare the results for FPGA designs (in grey) and non-FPGA designs (in green).


Figure 6. Assertion language and library adoption

SystemVerilog Assertions (SVA) is the most popular assertion language used for both FPGA and non-FPGA designs.

Figure 7 shows the trends in terms of assertion language and library adoption by comparing the 2007 Far West Research study (in blue) with the 2010 Wilson Research Group study (in green), as well as the projected adoption trends within the next twelve months (in purple). Note that the adoption of most of the assertion languages is declining, with the exception of SVA, whose adoption is increasing.


Figure 7. Trends in assertion language and library adoption

In my next blog (click here), I plan to focus on the adoption of various verification technologies and techniques used in the industry, as identified by the 2010 Wilson Research Group study.


3 December, 2010

As the saying goes: Those who fail to plan, plan to fail. With that said, I am excited to announce a new module focused on Verification Planning, which has been one of the Verification Academy’s most-requested subjects for new content. The new Verification Planning module is delivered by our subject matter expert, who literally wrote the book on the subject, Peet James.

The goal of verification planning and management is to architect an overall verification approach, and then to document that approach in a family of useful, easily extracted, maintainable verification documents that will strategically guide the overall verification effort so that the greatest amount of verification is accomplished in the allotted time. The aim of this module is to define terms, logically divide up the verification effort, and lay the foundation for actual verification planning and management on a real project.

I think you will really enjoy and be enlightened by Peet’s treatment of the subject, and hopefully, you can apply many of the techniques that he presents to your own projects.

Speaking of applying Verification Academy techniques—we just conducted a large survey about the academy and found some interesting results that I would like to share with you. First, Figure 1 shows who is viewing the Verification Academy content by job title.


Figure 1: Verification Academy viewers by job title

It’s not too surprising that a majority of the viewers are verification engineers, with a ratio of about 3.5 verification engineers for every 2 designers.

In addition to who is viewing the Verification Academy, we were interested in learning the viewer’s type of targeted design implementation to get a better understanding of our viewers’ needs. Figure 2 shows who is viewing the Verification Academy by their type of targeted design implementation.


Figure 2: Verification Academy viewers by targeted design implementation

We are obviously seeing a growing number of FPGA engineers interested in advanced functional verification. Today’s complex SoC-based FPGA designs are not your mom-and-pop variety of FPGA designs. More advanced verification skills are required to ultimately meet both quality and schedule goals.

Another question we wanted to answer through our survey is whether the Verification Academy has been useful. One way to answer this is to see how many viewers had actually applied or plan to apply the knowledge they learned in the Verification Academy on their own projects. The survey results are shown in Figure 3.


Figure 3: Verification Academy viewers who have applied knowledge on projects

We also wanted to determine through the survey if the content presented in the Verification Academy was at an appropriate level of detail. The survey results are shown in Figure 4.

Figure 4: Appropriateness of Verification Academy content level of detail

Finally, we wanted to determine through the survey which additional topic in advanced functional verification should be covered in the Verification Academy. Figure 5 presents the results.


Figure 5: Verification Academy new subject content request

Your feedback is important to us, and we are very excited that our new Verification Planning module was one of the top requests from the Verification Academy survey participants.

I would like to encourage you to check out all our new and existing content at the Verification Academy by visiting www.verificationacademy.com.


26 July, 2010

For years one of the objectives in EDA has been to make formal property checking easy to use and its results easy to understand. With the Automatic formal check feature in the June release of the 0-In Formal tool version 3.0, I think we have made significant progress in this area.

The feature, which predefines a set of assertion rules to look for design issues automatically, makes formal technology accessible to users who are not yet ready to write properties in SystemVerilog Assertions (SVA) or the Property Specification Language (PSL). To make it easier to comprehend problems in the design, the tool maps the violations back to the RTL code and highlights them there.

Automatic formal check focuses on three areas inadequately addressed by dynamic simulation:

The first area is functional coverage. Today, when constrained random simulation fails to achieve the targeted coverage goal, engineers have to fine tune the environment or add new tests. These efforts, often attempted relatively late in the verification cycle, can consume vast amounts of time and resources while still failing to reach parts of the design. In contrast, automatic formal check can be used to identify unreachable code early in the verification cycle. These targets can be eliminated from the coverage model. As a result, the coverage measurement is more accurate and you know when you are done.

The next area is design initialization. If a design cannot be initialized reliably in silicon, it will not function correctly. An obvious precursor, then, is making sure all the registers are initialized correctly at RTL. If X’es are used, we need to monitor the X creation, propagation, and usage cycle. Dynamic simulation does not interpret X’es the way silicon does, since silicon has only 1s and 0s. Automatic formal check is ideal for verifying register initialization under different modes or configurations. Then, with internal assertions and formal technologies, we can check that although X’es are created, they are not used by downstream registers.
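As a simple illustration of the initialization requirement (a sketch with hypothetical signal names, not the tool’s built-in rules), the intent can be written as an assertion: once reset deasserts, the control register must hold a known 0/1 value. A simulator only evaluates this along the stimulus it happens to run, while a formal or auto-formal analysis can examine it across modes and configurations.

    module init_check (
      input logic       clk,
      input logic       rst_n,
      input logic [7:0] ctrl_reg
    );
      // One cycle after reset deasserts, ctrl_reg must not contain any X/Z bits.
      a_ctrl_known : assert property (@(posedge clk)
                                      $rose(rst_n) |=> !$isunknown(ctrl_reg))
        else $error("ctrl_reg still unknown after reset");
    endmodule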

The final area is corner-case design issues. Time and time again, designers unintentionally write code that violates logical correctness. Examples include combinational loops, full case violations, parallel case violations, undriven logic, finite-state machine (FSM) deadlocks, and FSM livelocks. Unless tests are written to specifically target these corner-case design issues, they are difficult to exercise. On the other hand, by formally analyzing the design semantics, automatic formal check identifies these design issues statically and creates the stimuli to highlight them to the users.
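As a small, hypothetical example of such a corner case (not taken from the tool’s documentation), consider the FSM below: the ERR state is reachable, but no arc ever leaves it. A directed or random test would have to drive the design into ERR and then notice that it never recovers, whereas a static analysis of the state graph flags the deadlock directly.

    module fsm_example (
      input logic clk,
      input logic rst_n,
      input logic start,
      input logic done,
      input logic fault
    );
      typedef enum logic [1:0] {IDLE, RUN, ERR} state_t;
      state_t state;

      always_ff @(posedge clk or negedge rst_n) begin
        if (!rst_n)
          state <= IDLE;
        else
          case (state)
            IDLE: if (start)     state <= RUN;
            RUN : if (fault)     state <= ERR;    // intended as a recoverable error state...
                  else if (done) state <= IDLE;
            ERR : ;                               // ...but no transition ever leaves ERR: deadlock
          endcase
      end
    endmodule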

If you are interested in learning more about the automatic formal check feature in 0-In Formal, please feel free to register for our upcoming seminar in San Jose.

 


7 June, 2010

After spending years verifying ASICs with dynamic simulation, I started working on static verification 10 years ago in a startup called 0-In Design Automation. I firmly believe that static verification can complement dynamic simulation. Static verification uses synthesis and formal technologies to find bugs in the design. It does not rely on simulation stimulus. You do not need to exercise the bugs, propagate the results, and check the outputs to detect them.

Static verification includes RTL lint, static checks, formal checks, automated and assertion-based formal property checking. To read more on static verification, you can take a look at my white paper: Getting Started With Static Verification. If you are interested in formal methods, you can take a look at Harry Foster’s white paper: Why Now for Formal Property Checking. Both can be found in the Knowledge Center of DAC.com.

In the future, we are going to talk about individual static verification technologies and their applications in areas such as RTL verification, clock domain crossing verification, low power verification, timing constraint verification, etc. Your feedback and comments are most welcome.

