Archive for April, 2011

21 April, 2011

User Adoption of OVM Featured; Views on UVM Discussed

The Mentor Graphics user group meeting, User-2-User (U2U), in Santa Clara is all set.  U2U will be held on 26 April 2011 at the Santa Clara Marriott, and one of the tracks will feature functional verification, following keynote presentations by Mentor’s CEO, Wally Rhines, and Xilinx’s CTO, Ivo Bolsens.

Registration for the event is open and free.  In addition to the technical program, U2U includes a complimentary lunch, an evening reception, and a raffle.

The functional verification track opens with a presentation by Mentor’s Steve Chappell on how Mentor is “transforming verification,” followed by three user presentations.  The user presentations share a common theme: how each team has leveraged the Open Verification Methodology (OVM) to improve its verification productivity.  A couple of the presentations will also offer their views on the Accellera Universal Verification Methodology (UVM).

For users interested in the most current information on adoption and use of OVM and UVM, connecting with other users is probably the best source of unbiased information available.  Here are three presentations that can help you understand how these teams get the most out of OVM and UVM when coupled with Mentor technology.

User Presentations

Using OVM with Transaction and Emulation Based Simulation Acceleration
Galen Blake | Verification Lead | Altera
Using OVM (or UVM) with transaction- and emulation-based simulation acceleration platforms poses challenges: the OVM library is not as well suited to transaction- and emulator-based simulation acceleration as it is to pure simulation.  This presentation examines the problems and the approaches used to address them.

Case Study “Proving OVM on a Real Design” – Testimonial by AppliedMicro
Shing Sheung Tse | Senior Verification Manager | AppliedMicro
This case study will present how APM overcame its verification challenges with Mentor’s advanced verification methodology and why it chose OVM.  APM will also present its view on UVM.

Verifying Bus Bridges with Questa Verification IP
Sudararajan Haran | Verification Lead | Microsemi
Microsemi moved to OVM-based verification environments and decided to use industry-standard VIP as much as possible.  This presentation will highlight Microsemi’s experience using Mentor’s AHB and AXI VIPs to drive the verification of its AHB_AXI and AXI_AHB bridges to a quicker completion.


20 April, 2011

Testbench Characteristics and Simulation Strategies (Continued)

This blog is a continuation of a series of blogs, which present the highlights from the 2010 Wilson Research Group Functional Verification Study (for a background on the study, click here).

In my previous blog (Part 6 click here), I focused on some of the 2010 Wilson Research Group findings related to testbench characteristics and simulation strategies. In this blog, I continue this discussion, and present additional findings specifically related to the number of tests created by a project, as well as the length of time spent in a simulation regression run for various projects.

Percentage of directed tests created by a project

Let’s begin by examining the percentage of directed tests that were created by a project, as shown in Figure 1.  Here, we compare the results for FPGA designs (in grey) and non-FPGA designs (in green).


Figure 1. Percentage of directed testing by a project

Obviously, the study results are all over the spectrum, with some projects creating more directed tests than others. The study data revealed that FPGA design participants are more likely than non-FPGA participants to work on projects that rely exclusively on directed tests.

Figure 2 shows the median percentage of directed tests created on a project by region, where North America is shown in blue, Europe/Israel in green, Asia in green, and India in red.

Figure 2. Median percentage of directed testing by a project by region

You can see from the results that India seems to spend less time on directed testing compared with other regions, which means that projects there spend more time with alternative stimulus generation methods (such as constrained-random, processor-driven, or graph-based techniques).

Let’s look at the percentage of directed testing by design size, for non-FPGA projects.  The median results are shown in Figure 3, where the design size partitions are represented as: less than 1M gates (in blue), 1M to 20M gates (in orange), and greater than 20M gates (in red).


Figure 3. Median percentage of directed testing by a project by design size

As design sizes increase, there is less reliance on directed testing.

Percentage of project tests that were random or constrained random

Next, let’s look at the percentage of tests that were random or constrained random across multiple projects. Figure 4 compares the results between FPGA designs (in grey) and non-FPGA designs (in green).


Figure 4. Percentage of random or constrained-random testing by a project

And again, the study results indicate that projects are all over the spectrum in their usage of random or constrained-random stimulus generation. Some projects do more, while other projects do less.

Figure 5 shows the median percentage of random or constrained-random testing by region, where North America is shown in blue, Europe/Israel in green, Asia in green, and India in red.


Figure 5. Median percentage of random or constrained-random testing by region

You can see that the median percentage of random or constrained-random testing by a project is higher in India than in other regions of the world.

Let’s look at the percentage of random or constrained-random testing by design size, for non-FPGA projects. The median results are shown in Figure 6, where the design size partitions are represented as: less than 1M gates (in blue), 1M to 20M gates (in orange), and greater than 20M gates (in red).


Figure 6. Median percentage of random or constrained-random testing by design size

Smaller designs tend to do less random or constrained-random testing.
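
For readers less familiar with the distinction the study draws between directed and constrained-random stimulus, here is a minimal SystemVerilog sketch (the class, fields, and constraints are purely illustrative and not tied to OVM, UVM, or any particular project): a directed test hard-codes each stimulus value, while a constrained-random test describes the legal stimulus space and lets the solver generate many variations of it.

    // Illustrative constrained-random transaction (not tied to any methodology base class)
    class bus_item;
      rand bit [31:0] addr;
      rand bit [31:0] data;
      rand bit        write;
      // Keep addresses inside a hypothetical 4 KB peripheral window,
      // and bias the traffic toward writes.
      constraint c_addr  { addr inside {[32'h0000_1000 : 32'h0000_1FFF]}; }
      constraint c_write { write dist { 1 := 3, 0 := 1 }; }
    endclass

    module tb;
      initial begin
        bus_item item = new();
        // A directed test would assign addr/data explicitly;
        // a constrained-random test calls randomize() in a loop instead.
        repeat (10) begin
          if (!item.randomize()) $error("randomization failed");
          $display("addr=%h data=%h write=%0d", item.addr, item.data, item.write);
        end
      end
    endmodule

In these terms, the percentages above essentially measure how much of a project’s stimulus is still written by hand versus generated from constraints.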

Simulation regression time

Now, let’s look at the time that various projects spend in a simulation regression. Figure 7 compares the simulation regression time between FPGA designs (in grey) and non-FPGA designs (in green) from our recent study.


Figure 7. Time spent in a simulation regression by project

And again, we see that FPGA projects tend to spend less time in a simulation regression run compared to non-FPGA projects.

Figure 8 shows the trends in terms of simulation regression time by comparing the 2007 Far West Research study (in blue) with the 2010 Wilson Research Group study (in green). There really hasn’t been a significant change in the time spent in a simulation regression within the past three years. You will find that some teams spend days or even weeks in a regression. Yet, the industry median is about 16 hours for both the 2007 and 2010 studies.


Figure 8. Simulation regression time trends

Figure 9 shows the median simulation regression time by region, where North America is shown in blue, Europe/Israel in green, Asia in green, and India in red.


Figure 9. Median simulation regression time by region

Finally, Figure 10 shows the median simulation regression time by design size, where the design size partitions are represented as: less than 1M gates (in blue), 1M to 20M gates (in orange), and greater than 20M gates (in red).


Figure 10. Median simulation regression time by design size

Obviously, project teams working on smaller designs spend less time in a simulation regression run compared to project teams working on larger designs.

In my next blog (click here), I’ll focus on design and verification language trends, as identified by the 2010 Wilson Research Group study.


18 April, 2011

Testbench Characteristics and Simulation Strategies

This blog is a continuation of a series of blogs that present the highlights from the 2010 Wilson Research Group Functional Verification Study (for background on the study, click here).

In my previous blog (click here), I focused on the controversial topic of effort spent in verification. In this blog, I focus on some of the 2010 Wilson Research Group findings related to testbench characteristics and simulation strategies. Although I am shifting the focus away from verification effort, I believe that the data I present in this blog is related to the discussion of overall effort and really needs to be considered.

Time spent in top-level simulation

Let’s begin by examining the percentage of time a project spends in top-level simulation today. Figure 1 compares the time spent in top-level simulation between FPGA designs (in grey) and non-FPGA designs (in green) from our recent study.


Figure 1. Percentage time spent in top-level simulation

Obviously, the study results show that projects are all over the spectrum, where some projects spend less time in top-level simulation, while others spend much more.

I decided to partition the data by design size (using the same partition that I’ve presented in previous blogs), and then compare the time spent in top-level verification by design size for Non-FPGA designs. Figure 2 shows the results, where the design size partitions are represented as: less than 1M gates (in blue), 1M to 20M gates (in orange), and greater than 20M gates (in red).


Figure 2. Percentage of time spent in top-level simulation by design size

You can see from the results that, unlike previous analyses where partitioning the data by design size revealed differences, design size didn’t seem to matter here for designs between 1M and 20M gates and designs greater than 20M gates. That is, those projects seem to spend about the same amount of time in top-level verification.

Time spent in subsystem-level simulation

Next, let’s look at the percentage of time that a project spends in subsystem-level simulation (e.g., clusters, blocks, etc.). Figure 3 compares the time spent in subsystem-level simulation between FPGA designs (in grey) and non-FPGA designs (in green) from our recent study.


Figure 3. Percentage of time spent in subsystem-level simulation

Non-FPGA designs seem to spend more time in subsystem-level simulation than FPGA designs. Again, this is probably not too surprising to anyone familiar with traditional FPGA development, although, as the size and complexity of FPGA designs continue to increase, I suspect we will see more subsystem-level verification.  Unfortunately, we can’t present trends for FPGA designs, as I mentioned in the background blog for the Wilson Research Group Functional Verification Study. However, future studies should be able to leverage the data we collected in this study and present FPGA trends.

Figure 4 shows the Non-FPGA results for the time spent in subsystem-level verification, where the data was partitioned by design size: less than 1M gates (in blue), 1M to 20M gates (in orange), and greater than 20M gates (in red). There doesn’t appear to be any big surprise when viewing the data with this partition.


Figure 4. Percentage of time spent in subsystem-level simulation by design size

Number of tests created to verify the design in simulation

Now let’s look at the percentage of projects in terms of the number of tests they create to verify a design using simulation. Figure 5 compares the number of tests created to verify a design between FPGA designs (in grey) and non-FPGA designs (in green) from our recent study.


Figure 5. Number of tests created to verify a design in simulation

Again, the data is not too surprising. We see that engineers working on FPGA designs tend to create fewer tests to verify their design in simulation.

Figure 6 shows the trend in terms of the number of tests created to verify a design in simulation by comparing the 2007 Far West Research study (in blue) with the 2010 Wilson Research Group study (in green). You will note that there has been an increase in the number of tests in the last three years.  In fact, although there is a wide variation in the number of tests created by a project, the median has grown from 342 to 398 tests.


Figure 6. Number of tests created to verify a design in simulation trends

I also did a regional analysis of the number of tests a non-FPGA project creates to verify a design in simulation, and the median results are shown in Figure 7, where North America is shown in blue, Europe/Israel in green, Asia in yellow, and India in red. You can see that the median is higher in Asia and India than in North America and Europe/Israel.


Figure 7. Number of tests created to verify a design in simulation by region

Finally, I did a design size analysis of the number of tests a non-FPGA project creates to verify a design in simulation, and the median results are shown in Figure 8, where the design size partitions are represented as: less than 1M gates (in blue), 1M to 20M gates (in orange), and greater than 20M gates (in red).


Figure 8. Number of tests created to verify a design in simulation by design size

Obviously, larger designs require more tests to verify them in simulation.

In my next blog (click here), I’ll continue focusing on testbench characteristics and simulation strategies, and I’ll present additional data from the 2010 Wilson Research Group study.


15 April, 2011

Watch DVCon Co-Located Event Presentations

Two presentations from the second annual SystemC Day at DVCon 2011 are available now.  The first is the keynote by Jim Hogan, serial EDA entrepreneur at Vista Ventures, LLC, and the second is an introduction to the emerging IEEE Std. 1666™ SystemC standard by John Aynsley of Doulos.  SystemC Day brought users together to discuss the current state of the market for ESL design and the pending content of the SystemC standard that is currently in final ballot at the IEEE.

To view the video presentations, you will need to register with the Open SystemC Initiative.

Jim Hogan, Vista Ventures LLC, California, USA
Keynote Presentation: “Navigating the SoC Era”

Abstract: SoCs are becoming ubiquitous in semiconductor development. Further, these SoCs are no longer processor-centric, and they are differentiated through the integration of design elements such as multi-CPU, multi-core, DSP cores, hardware accelerators, peripherals and software.

Industry expert and private investor Jim Hogan will discuss the semiconductor industry’s growing adoption of SoC design, and its reliance on diverse sources of hardware and software IP, developed both internally and externally.

John Aynsley, Doulos Ltd., UK
The New IEEE 1666 SystemC Standard

Abstract: The IEEE SystemC Standard is currently being revised and updated, with the new standard due to be published later in 2011. This new version of the SystemC standard will for the first time include the TLM-1 and TLM-2.0 libraries. Meanwhile, OSCI is working to ensure that the SystemC Proof-of-Concept simulator tracks any changes to the IEEE standard. This presentation will give a concise technical summary of the most important new and revised features in the SystemC standard, will give a behind-the-scenes insight into the rationale behind the changes, and will show examples to illustrate the new features in action.


4 April, 2011

 

Effort Spent On Verification (Continued)

This blog is a continuation of a series of blogs, which present the highlights from the 2010 Wilson Research Group Functional Verification Study (for a background on the study, click here).

In my previous blog (click here), I focused on the controversial topic of effort spent in verification. This blog continues this discussion.

I stated in my previous blog that I don’t believe there is a simple answer to the question, “How much effort was spent on verification in your last project?”  I believe that it is necessary to look at multiple data points to truly get a sense of the real effort involved in verification today. So, let’s look at a few additional findings from the study.

Time designers spend in verification

It’s important to note that verification engineers are not the only project members involved in functional verification. Design engineers spend a significant amount of their time in verification too, as shown in Figure 1.


Figure 1. Mean time designer spends in design vs. verification

In fact, one finding from our study is that the mean time a design engineer spends in verification has increased from 46 percent in 2007 to 50 percent in 2010. The involvement of designers in verification ranges from:

  • Small sandbox testing to explore various aspects of the implementation
  • Full functional testing of IP blocks and SoC integration
  • Debugging verification problems identified by a separate verification team

Percentage of time verification engineers spend on various tasks

Next, let’s look at the mean time verification engineers spend on various tasks related to their specific project, as shown in Figure 2. You might note that verification engineers spend most of their time in debugging. Ideally, if all the tasks were optimized, this is what you would expect. Unfortunately, though, the time spent in debugging can vary significantly from project to project, which presents scheduling challenges for managers during a project’s verification planning process.


Figure 2. Mean time verification engineers spend in different tasks

Number of formal analysis, FPGA prototyping, and emulation engineers

Functional verification is not limited only to simulation-based techniques. Hence, it’s important to gather data related to other functional verification techniques, such as the number of verification engineers involved in formal analysis, FPGA prototyping, and emulation.

Figure 3 presents the trends in terms of number of verification engineers focused on formal analysis. In 2007, the median number of verification engineers focused on formal analysis on a project was 1.68, while in 2010 the median number increased to 1.84.


Figure 3. Median number of verification engineers focused on formal analysis
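
For context on what “formal analysis” typically involves, here is a small SystemVerilog assertion sketch (the module, signals, and properties are hypothetical examples, not taken from the study): assertions like these can be proven exhaustively by a formal tool, rather than exercised one test at a time in simulation.

    // Hypothetical arbiter properties that a formal tool could attempt to prove
    module arb_props (input logic clk, rst_n, req, gnt);
      // Every request must be granted within 1 to 4 cycles.
      property p_bounded_grant;
        @(posedge clk) disable iff (!rst_n) req |-> ##[1:4] gnt;
      endproperty
      assert property (p_bounded_grant);

      // A grant must never appear without a request in the same cycle.
      property p_no_spurious_grant;
        @(posedge clk) disable iff (!rst_n) gnt |-> req;
      endproperty
      assert property (p_no_spurious_grant);
    endmodule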

Figure 4 presents the trends in terms of number of verification engineers focused on FPGA prototyping. In 2007, the median number of verification engineers focused on FPGA prototyping on a project was 1.42, while in 2010 the median number increased to 2.04. Although FPGA prototyping is a common technique used to create platforms for software development, it can also be used for SoC integration verification and system validation.


Figure 4. Number of verification engineers focused on FPGA prototyping

Figure 5 presents the trends in terms of number of verification engineers focused on hardware assisted acceleration and emulation. In 2007, the median number of verification engineers focused on hardware assisted acceleration and emulation on a project was 1.31, while in 2010 the median number increased to 1.86.


Figure 5. Number of verification engineers focused on emulation

A few more thoughts on verification effort

So, can I conclusively state that 70 percent of a project’s effort is spent in verification today?  No.  In fact, even after reviewing the data on different aspects of today’s verification process, I would still find it difficult to state quantitatively what the effort is. Yet, the data that I’ve presented so far seems to indicate that the effort (whatever it is) is increasing. And there is still additional data relevant to the verification effort discussion that I plan to present in upcoming blogs. However, in my next blog (click here), I shift the discussion from verification effort, and focus on some of the 2010 Wilson Research Group findings related to testbench characteristics and simulation strategies.


3 April, 2011

 

Effort Spent On Verification

This blog is a continuation of a series of blogs, which present the highlights from the 2010 Wilson Research Group Functional Verification Study (click here). In my previous blog (click here), I focused on design and verification reuse trends. In this blog, I focus on the controversial topic of the amount of effort spent in verification.

I have been on the technical program committee for many conferences over the past few years (DVCon, DAC, DATE, FDL, HLDVT, MTV . . .), and it seems that there was not a single verification paper that I reviewed that didn’t start with the phrase: “Seventy percent of a project’s effort is spent in verification…blah blah blah.” Yet I’ve always wondered, where did this number come from?  There has never been a reliable reference to the origin of this number, and certainly no credible studies that I am aware of.

I don’t believe that there is a simple answer to the question, “How much effort was spent on verification in your last project?”  In fact, I believe that it is necessary to look at multiple data points, derived from multiple questions, to truly get a sense of effort spent in verification.

Total Project Time Spent In Verification

To try to assess the effort spent in verification, let’s begin by looking at one data point, which is the total project time spent in verification. Figure 1 shows the trends in total project time spent in verification by comparing the 2007 Far West Research study (in blue) with the 2010 Wilson Research Group study (in green).


Figure 1. Percentage of total project time spent in verification

Notice that in 2007, the median total project time spent in verification was calculated to be 50 percent, while the number increased to 55 percent in 2010. Our recent study seems to indicate that the time spent in verification is increasing.

Peak Number of Verification Engineers

Next, let’s look at another data point, the peak number of verification engineers on a project. Figure 2 compares the peak number of verification engineers involved on FPGA designs (in grey) and non-FPGA designs (in green) from our recent study.


Figure 2. Peak number of verification engineers

It’s not surprising that projects involving non-FPGA designs tend to have a higher number of peak verification engineers compared with FPGA designs.

Figure 3 shows the trends in peak number of verification engineers for non-FPGA designs by comparing the 2007 Far West Research study (in blue) with  the 2010 Wilson Research Group study (in green).


Figure 3. Peak number of verification engineer trends

I decided that another interesting way to look at the data is to partition the set regionally and calculate the median peak number of verification engineers on a project by region.  The results are shown in Figure 4, with North America shown in blue, Europe/Israel in green, Asia in green, and India in red.


Figure 4. Peak number of verification engineers by region

Notice how, on average, India seems to have more peak verification engineers involved on a project. India certainly has developed a core set of verification expertise over the past few years.

The next analysis I decided to perform was to partition the data by design size, and then compare the median peak number of verification engineers. Figure 5 shows the results, where the design size partitions are represented as: less than 1M gates (in blue), 1M to 20M gates (in orange), and greater than 20M gates (in red).


Figure 5. Peak number of verification engineers by design size

Although I am focusing on effort spent in verification at the moment, let’s take a look at the peak number of design engineers involved on a project today. Figure 6 compares the peak number of design engineers involved on FPGA designs (in grey) and non-FPGA designs (in green).

Figure 6. Peak number of design engineers

Next, in Figure 7  I show the trends in peak number of design engineers for non-FPGA designs by comparing the 2007 Far West Research study (in blue) with  the 2010 Wilson Research Group study (in green).


Figure 7. Peak number of design engineer trends

You might note that there has not been a significant increase in design engineers in the past three years, although design sizes have increased.  This is partially due to increased adoption of internal and external IP (as I discussed in my previous blog), as well as continued productivity improvements due to automation.

After I saw this data, I thought it would be interesting to compare the median increase in verification engineers to the median increase in design engineers from 2007 to 2010.  The results were shocking, as shown in Figure 8, where we see a four percent increase in peak number of design engineers in the last three years compared to a 58 percent increase in peak number of verification engineers. Clearly, verification productivity improvements are needed in the industry to address this problem.


Figure 8. Peak number of design vs. verification engineer trends

In my next blog (click here), I’ll continue the discussion on effort spent in verification as revealed by the 2010 Wilson Research Group Functional Verification Study.


1 April, 2011

 

Reuse Trends

This blog is a continuation of a series of blogs, which present the highlights from the 2010 Wilson Research Group Functional Verification Study (click here).  In my previous blog (click here), I focused on embedded processors, power management, and clock domains.  In this blog, I focus on design and verification reuse trends. As I mentioned in my prologue blog to this series (click here), one interesting trend that emerged from the study is that reuse adoption is increasing.

Design Composition Trends

Figure 1 shows the median design composition trends, which compares the 2007 Far West Research study (in blue) with  the 2010 Wilson Research Group study (in green).

Notice that new logic development has decreased by 34 percent in the last three years, while external IP adoption has increased by 69 percent. This increase in adoption has been driven by IP demand required for SoC development, such as embedded processor cores (e.g., ARM cores) and standard interface cores (e.g., USB cores).

 


Figure 1. Median design composition trends

Verification Testbench Composition Trends

Figure 2 shows the median testbench composition trends, which compares the 2007 Far West Research study (in blue) with the 2010 Wilson Research Group study (in green).

Notice that new verification code development has decreased by 24 percent in the last three years, while external verification IP adoption has increased by 138 percent. This increase has been driven by the emergence of standard on-chip and off-chip bus architectures.

 


Figure 2. Median testbench composition trends

In my next blog (click here), I’ll shift my focus from design trends to project resource trends. I’ll also present our findings on the project effort spent in verification.

