Posts Tagged ‘IEEE 1800’

4 June, 2015

Learn more about DDA at DAC

At DAC, Mentor Graphics and Cadence Design Systems are coming together to usher in a new level of productivity in verification results data access and portability with a modern design debug data application programming interface standard. We call this emerging standard the Debug Data API, or DDA for short. We want to share more details with you in person at DAC. Join us on Tuesday, June 9th, at the Verification Academy Booth (#2408) at 5:00pm for a joint presentation and unveiling. And for a bit of background and a hint of what’s to come, please read on.

History: It Started with VCD

In the beginning we had VCD as the universal standard format to exchange simulation results, as part of the IEEE 1364 (Verilog) standard. Anyone trying to use VCD today on large SoCs or complex FPGAs knows that the size of VCD files has all but excluded this portion of the IEEE standard from modern design verification practice. So the question is: when will it be replaced?
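For readers who have not generated one in a while, producing a VCD file has always been as simple as a pair of IEEE 1364 system tasks. Here is a minimal sketch (the module and signal names are illustrative):

```verilog
// Minimal sketch: generating a VCD file with the standard IEEE 1364
// $dumpfile/$dumpvars system tasks. Module and signal names (tb, clk,
// count) are illustrative.
module tb;
  reg clk = 1'b0;
  reg [3:0] count = 4'd0;

  always #5 clk = ~clk;                 // 10 time-unit clock period
  always @(posedge clk) count <= count + 1;

  initial begin
    $dumpfile("waves.vcd");             // VCD output file
    $dumpvars(0, tb);                   // dump everything under tb
    #100 $finish;
  end
endmodule
```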

To ask that question today seems fine. But I was skeptical even in the mid-1990s when we at Mentor Graphics created Extended VCD to support the IEEE 1076.4 (VITAL) gate-level simulation standard. At that time the largest designs were around 1 million gates. While Extended VCD never became an official IEEE standard, we shared it with our ASIC vendor and FPGA partners, along with our major competitors, to ensure that debug data access and portability for VITAL users was on par with Verilog. But Extended VCD suffers the same fate: it is almost impossible to use with modern large designs.

Today: VCD Replaced by a Proprietary World

VCD and Extended VCD have remained static for about 20 years. But commercial simulator, emulator and other verification technology suppliers have not stopped innovating to advance support for larger design sizes with larger result data sets. As we move to 2 billion gate designs and beyond, the dependence on these private and closed technological advances and innovations has never been more important.

But that proprietary dependence comes with a cost. We stand at a crossroads: either consumers of verification results information lose the open and unencumbered access offered by VCD, or they find a path forward that preserves their current benefits while protecting and encouraging producers of such information to continue to innovate by private means. The only alternative is a fully integrated solution from a single supplier, which rarely gets consumer endorsement in a best-of-breed, mix-and-match world.

Near Future: Federating the Proprietary World with DDA

Federating proprietary solutions almost sounds like something that is impossible to do. But Mentor and Cadence will share their emerging work on a standard that federates the different sources of verification results that come from private sources, with unencumbered access for the consumer. The Debug Data API standard will offer consumers the VCD-like benefits of interoperability, data portability, and openness while preserving the benefits of private innovation for tool and solution producers. It will not impose translation from one data format to another as a means to promote data portability. It will not require the means by which one supplier or another stores verification results to be exposed. It will offer the best of both worlds to producers and consumers. I guess in some cases you can have your cake and eat it too! There are more details to share, and the best place to start is to meet us at DAC.

Mentor and Cadence Share DDA Details at DAC

  • Location: Verification Academy Booth (#2408)
  • Date: Tuesday, June 9th
  • Time: 5:00pm PT

We will discuss the details of DDA and show a proof of concept demonstration that will highlight each company’s simulator and results viewer in action.  Since there is no other session following this one at the Verification Academy booth, we will also be around to discuss the next steps with all present afterwards.

You can read more about this from my colleague and competitor, Cadence’s Adam Sherer, on his blog at the Cadence site here. He brings his own perspective to this.

See you at DAC!


3 June, 2015

FPGA Language and Library Trends

This blog is a continuation of a series of blogs related to the 2014 Wilson Research Group Functional Verification Study (click here). In my previous blog (click here), I focused on FPGA verification techniques and technologies adoption trends, as identified by the 2014 Wilson Research Group study. In this blog, I’ll present FPGA design and verification language trends, as identified by the Wilson Research Group study.

You might note that the percentage for some of the language and library data that I present sums to more than one hundred percent. The reason for this is that many FPGA projects today use multiple languages.

FPGA RTL Design Language Adoption Trends

Let’s begin by examining the languages used for FPGA RTL design. Figure 1 shows the trends in terms of languages used for design, by comparing the 2012 Wilson Research Group study (in dark blue), the 2014 Wilson Research Group study (in light blue), as well as the projected design language adoption trends within the next twelve months (in purple). Note that the language adoption is declining for most of the languages used for FPGA design with the exception of Verilog and SystemVerilog.

Also, it’s important to note that this study focused on languages used for RTL design. We have conducted a few informal studies related to languages used for architectural modeling—and it’s not too big of a surprise that we see increased adoption of C/C++ and SystemC in that space. However, since those studies have (thus far) been informal and not as rigorously executed as the Wilson Research Group study, I have decided to withhold that data until a more formal study can be executed related to architectural modeling and virtual prototyping.

Figure 1. Trends in languages used for FPGA design

It’s not too big of a surprise that VHDL is the predominant language used for FPGA RTL design, although the projected trend suggests that Verilog will likely overtake VHDL as the predominant FPGA design language in the near future.

FPGA Verification Language Adoption Trends

Next, let’s look at the languages used to verify FPGA designs (that is, languages used to create simulation testbenches). Figure 2 shows the trends in terms of languages used to create simulation testbenches by comparing the 2012 Wilson Research Group study (in dark blue), the 2014 Wilson Research Group study (in light blue), as well as the projected verification language adoption trends within the next twelve months (in purple).

Figure 2. Trends in languages used in verification to create FPGA simulation testbenches

FPGA Testbench Methodology Class Library Adoption Trends

Now let’s look at testbench methodology and class library adoption for FPGA designs. Figure 3 shows the trends in terms of methodology and class library adoption by comparing the 2012 Wilson Research Group study (in dark blue), the 2014 Wilson Research Group study (in light blue), as well as the projected methodology and class library adoption trends within the next twelve months (in purple).

Figure 3. FPGA methodology and class library adoption trends

Today, we see a downward trend in terms of adoption of all testbench methodologies and class libraries with the exception of UVM, which has increased by 28 percent since 2012. The study participants were also asked what they plan to use within the next 12 months, and based on the responses, UVM is projected to increase an additional 20 percent.
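For those who have not yet made the jump, here is a minimal sketch of what adopting the UVM class library looks like in SystemVerilog. The test name and message are illustrative, while uvm_test, `uvm_component_utils, and run_test() are standard UVM API:

```systemverilog
// Minimal sketch of a UVM test. The class name my_test and the message
// are illustrative; the base class, macros, and run_test() are part of
// the standard UVM library.
import uvm_pkg::*;
`include "uvm_macros.svh"

class my_test extends uvm_test;
  `uvm_component_utils(my_test)   // register with the UVM factory

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  task run_phase(uvm_phase phase);
    phase.raise_objection(this);  // keep simulation alive for this phase
    `uvm_info("MY_TEST", "Hello from a minimal UVM test", UVM_LOW)
    phase.drop_objection(this);
  endtask
endclass

module top;
  initial run_test("my_test");    // factory creates and runs the test
endmodule
```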

FPGA Assertion Language and Library Adoption Trends

Finally, let’s examine assertion language and library adoption for FPGA designs. The 2014 Wilson Research Group study found that 44 percent of all FPGA projects have adopted assertion-based verification (ABV) as part of their verification strategy. The data presented in this section shows the assertion language and library adoption trends related to those participants who have adopted ABV.

Figure 4 shows the trends in terms of assertion language and library adoption by comparing the 2010 Wilson Research Group study (in dark blue), the 2012 Wilson Research Group study (in green), and the projected adoption trends within the next 12 months (in purple). The adoption of SVA continues to increase, while other assertion languages and libraries either remain flat or decline.

Figure 4. Trends in assertion language and library adoption for FPGA designs
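As a reminder of what ABV adoption looks like in practice, here is a minimal sketch of a concurrent SVA assertion checking that every request is granted within one to three clock cycles (the signal names are illustrative):

```systemverilog
// Minimal sketch of a concurrent SVA assertion: every request must be
// granted within 1 to 3 clock cycles. Signal names (clk, rst_n, req,
// gnt) are illustrative.
module req_gnt_checker (
  input logic clk,
  input logic rst_n,
  input logic req,
  input logic gnt
);
  property req_gets_gnt;
    @(posedge clk) disable iff (!rst_n)
      req |-> ##[1:3] gnt;
  endproperty

  assert_req_gnt: assert property (req_gets_gnt)
    else $error("req was not granted within 3 cycles");
endmodule
```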

In my next blog (click here), I will continue presenting findings from the 2014 Wilson Research Group Functional Verification Study.



3 April, 2015

It is always good to pause to recognize the companies and individuals with whom we collaborate to create the verification flows and solutions that allow the simplest and most complex devices and systems to come to life. It is this time of year when the fruit of collaboration is generally shared publicly. This is probably due, in no small part, to the nearing of the annual trek to the Design Automation Conference (DAC). As we get closer to that week in June, I will discuss it even more. But now I would like to offer a look back at two major milestones around this time of the year that shaped our future.

20 Years Ago

On April 3, 1995, we announced “Device Vendors Providing Library Support to Mentor.” Our ModelSim simulator gained support from 12 ASIC and programmable logic vendors. Until then, Mentor’s gate-level simulation was provided by QuickSim and its large collection of ASIC vendor libraries and flows. With the emergence of VITAL (VHDL Initiative Towards ASIC Libraries) and its IEEE standards project (1076.4), we continued our activities to spread knowledge about VITAL and to educate and help the rest of the ASIC vendor community so they could bring to market their own simulation libraries for ModelSim.

As we added Verilog to the language mix, those Verilog libraries were likewise qualified and offered to the mutual customers we shared with our valued ASIC vendor partners. ModelSim grew to be a very popular product, and the experience taught us the importance of shared collaboration.

10 Years Ago

In mid-May 2005, we launched our Questa Vanguard Partnership (QVP) program, modeled on the ModelSim program. SystemVerilog 3.1a had been released by Accellera and was in the final stages of IEEE ratification, which was to come in November 2005. But to get a jump on solidifying business relationships with our partners and to encourage support of SystemVerilog, we began to work with companies around the world who expressed an interest in building a vibrant ecosystem. A lot was accomplished in the six months between the launch of the QVP program and the approval of the first IEEE SystemVerilog 1800-2005 standard.

But it was good to pause then too and celebrate the standard with our new Questa partners, our mainstay semiconductor library partners and competitors in Japan.  Upon IEEE approval of the standard, Accellera in conjunction with the Big-3 EDA companies and CQ Publishing (Japan), held a “Happy Birthday” celebration reception.  I have to offer special thanks to my friends at Synopsys for the idea.  And, yes, we all know that this November will be lucky 10 years for SystemVerilog and we have already started to discuss what can be done at the annual fall standards meetings in Japan to celebrate this milestone.

Tomorrow (DAC)

As I mentioned, the great thing about this time of the year is the planning for DAC. Many good things have happened in the last year. Last year, at Mentor Graphics’ urging and with our public commitment to donate technology, Accellera started a “Proposed Working Group” on Portable Stimulus to determine the viability of a standards project. Accellera formally approved the formation of the Portable Stimulus Working Group in December 2014. At the Verification Academy booth at DAC, we will certainly offer updates on this work and affirm our sustained commitment to the development of this standard. I will share full details about the what, when and where for the Verification Academy booth at DAC later.

But wait! There will probably be more. I can assure you, I will post a few more times during this final two-month journey to DAC. As the daily program for the Verification Academy booth is finalized, I will share its content and my thoughts on it. And as industry events, like the Accellera DAC Breakfast, are finalized, I will make them part of my commentary on DAC 2015 as well. It seems this DAC will be a busy DAC.

But this is something you can do now! If you don’t yet know whether you want to attend the technical program, you should at a minimum secure a free pass to the exhibit floor and access to some open industry events. If you register by May 19th, you can choose the “I Love DAC” registration – compliments of ATopTech, Atrenta, and Calypto. After May 19th, it is no longer free. So why not register now? I look forward to seeing you at DAC.


11 September, 2014

From those just beginning to study electronic systems design to the practicing engineer, this is the time of year when those taking their first steps to learn VHDL or Verilog/SystemVerilog join the academic “back to school” crowd, and those who use design and verification languages in practice hone their skills at industry events around the world.

A new academic year has started, and the Mentor Higher Education Program (HEP) is well set to help students at more than 1200 colleges and universities secure access to the same commercial tools and technology used by industry. It is a real win-win when students learn using the same tools they will use after graduating. Early exposure and use mean better-skilled, more productive engineers for employers.

The functional verification team at Mentor Graphics knows that many students would prefer to have a local copy of ModelSim on their personal computer to do their coursework and smaller projects as they learn VHDL or Verilog. To help facilitate that, we make the ModelSim PE Student Edition available for download without charge. More than 10,000 students around the world now use ModelSim PE Student Edition, in addition to the commercial-grade tools they can access in their university labs.

For the practicing engineer, the Verification Academy offers an online community of more than 25,000 design and verification engineers who exchange ideas on a wide variety of issues across the numerous standards and methodologies. If you are not a member of the Verification Academy, I recommend you join. You will also find the Verification Academy at DAC for one-on-one discussions, and more recently at Verification Academy Live daylong seminars, which came to Austin and, as of this writing, will come to Santa Clara next. There is still time to register for the Santa Clara event and I invite you to attend.

As design and verification is global, Accellera realized that DVCon should explore the needs of the global design and verification engineer population as well. For 2014, DVCon Europe and DVCon India were born from already successful SystemC user group events. These user-led conferences are held so engineers in these regions can more easily come together to share experiences and knowledge and ultimately become more productive.

Students and practicing engineers alike can benefit from fee-free access to some of the popular IEEE EDA standards. While I don’t think reading them alone is the ultimate way to educate yourself, they make great companions to daily design and verification activities. Accellera has worked with the IEEE to place several EDA standards in the IEEE Standards Association’s “Get™” program. Almost 16,000 copies of the SystemC standard (1666) and just about the same number of SystemVerilog standards (1800) had been downloaded as of the end of August 2014. Have you downloaded your free copies yet?

The chart below shows the distribution of nearly 45,000 downloads that have occurred since 2010. Stay tuned for breaking news on updates to the EDA standards in the Get program. When updated, they will replace the versions available now. So if you want to have both the current versions and the ones coming out shortly, you had better download your copies now. If the electronic version is not sufficient for you, the IEEE continues to sell printed versions.

[Chart: Distribution of IEEE Get program standard downloads since 2010]

From students to practicing engineers, the season of learning has started.  I encourage you to find your right venue or style of learning and connect with others to advance and improve your design and verification productivity.


8 September, 2013

Schedules, respins, and bug classification

This blog is a continuation of a series of blogs that present the highlights from the 2012 Wilson Research Group Functional Verification Study (for a background on the study, click here).

In my previous blog (Part 11 click here), I focused on some of the 2012 Wilson Research Group findings related to formal verification, acceleration/emulation, and FPGA prototyping trends. In this blog, I present verification results findings in terms of schedules, respins, and classification of functional bugs.

We have seen in previous blogs that a significant amount of effort is being applied to functional verification. An important question the various studies have tried to answer is whether this increasing effort is paying off.

Schedules

Figure 1 presents the design completion time compared to the project’s original schedule. What is interesting is that we really have not seen a change in this trend in over five years. That is, 67 percent of all projects are behind schedule with respect to the original plan. One could argue that designs have increased in complexity in terms of gate counts, embedded processors, and lots of software between 2007 and 2012. Yet, achieving project schedules has not worsened.

Figure 1. Non-FPGA design completion time compared to the project’s original schedule

What’s interesting is that FPGA designs follow this same trend, as shown in Figure 2.

Figure 2. Non-FPGA vs. FPGA design completion time relative to the original plan

Respins

Other verification data points worth looking at relate to the number of spins required between the start of a project and final production. Figure 3 shows this industry trend all the way back to the 2004 Collett study. Again, even though designs have increased in complexity, the data suggest that projects are not getting any worse in terms of the number of spins before production. If anything, there appears to be a slight improvement recently in this trend in projects requiring three or more spins.

Figure 3. Number of spins required from start of project until production

Bug classification

Figure 4 shows the various categories of flaws that contribute to respins. Note that the sum is greater than 100 percent on this graph because a respin can be triggered by multiple flaws.

Figure 4. Types of flaws resulting in respins

Although logic and functional flaws remain the leading causes of respins, the data suggest that there has been some improvement in this area over the past eight years. Is this due to the increased amount of reuse that is occurring in the industry? Or is the industry maturing its verification processes? Or is something entirely different going on? This data point raises some interesting questions worth exploring.

Figure 5 examines the root cause of functional bugs by various categories. The data suggest an improvement in logic errors over an eleven-year period and, potentially, a worsening of problems related to changing specifications. Problems associated with changing, incorrect, and incomplete specifications are a common theme I often hear when visiting customers.

Figure 5. Root cause of functional flaws

In my next blog (click here), I plan to wrap up this series of blogs in what I call the Epilogue—which will discuss potential gotchas and cautions on interpreting certain aspects of the data and thoughts about how the data might be used constructively.


5 August, 2013

Language and Library Trends

This blog is a continuation of a series of blogs that present the highlights from the 2012 Wilson Research Group Functional Verification Study (for a background on the study, click here).

In my previous blog (Part 7 click here), I focused on some of the 2012 Wilson Research Group findings related to testbench characteristics and simulation strategies. In this blog, I present design and verification language trends, as identified by the Wilson Research Group study.

You might note that for some of the language and library data I present, the percentage sums to more than one hundred percent. The reason for this is that some participants’ projects use multiple languages.

RTL Design Languages

Let’s begin by examining the languages used for RTL design. Figure 1 shows the trends in terms of languages used for design, by comparing the 2007 Far West Research study (in gray), the 2010 Wilson Research Group study (in blue), the 2012 Wilson Research Group study (in green), as well as the projected design language adoption trends within the next twelve months (in purple) as identified by the study participants. Note that the design language adoption is declining for most of the languages with the exception of SystemVerilog whose adoption continues to increase.

Also, it’s important to note that this study focused on languages used for RTL design. We have conducted a few informal studies related to languages used for architectural modeling—and it’s not too big of a surprise that we see increased adoption of C/C++ and SystemC in that space. However, since those studies have (thus far) been informal and not as rigorously executed as the Wilson Research Group study, I have decided to withhold that data until a more formal blind study can be executed related to architectural modeling and virtual prototyping.

Figure 1. Trends in languages used for Non-FPGA design

Let’s now look at the languages used specifically for FPGA RTL design. Figure 2 shows the trends in terms of languages used for FPGA design, by comparing the 2012 Wilson Research Group study (in red) with the projected design language adoption trends within the next twelve months (in purple).

Figure 2. Languages used for FPGA design

It’s not too big of a surprise that VHDL is the predominant language used for FPGA RTL design, although we are starting to see increased interest in SystemVerilog.

Verification Languages

Next, let’s look at the languages used to verify Non-FPGA designs (that is, languages used to create simulation testbenches). Figure 3 shows the trends in terms of languages used to create simulation testbenches by comparing the 2007 Far West Research study (in gray), the 2010 Wilson Research Group study (in blue), and the 2012 Wilson Research Group study (in green).

Figure 3. Trends in languages used in verification to create Non-FPGA simulation testbenches

The study revealed that verification language adoption is declining for most of the languages with the exception of SystemVerilog whose adoption is increasing. In fact, SystemVerilog adoption increased by 8.3 percent between 2010 and 2012.

Figure 4 provides a different analysis of the data by partitioning the projects by design size, and then calculating the adoption of SystemVerilog for creating testbenches by size. The design size partitions are represented as: less than 5M gates, 5M to 20M gates, and greater than 20M gates. Obviously, we find that the larger the design size, the greater the adoption of SystemVerilog for creating testbenches. Yet, probably the most interesting observation we can make from examining Figure 4 is related to smaller designs that are less than 5M gates. Here we see that 58.8 percent of the industry has adopted SystemVerilog for verification. In other words, it is safe to say that SystemVerilog for verification has become mainstream today and not just limited to early adopters or leading-edge design projects.

Figure 4. SystemVerilog (for verification) adoption by design size

Let’s now look at the languages used specifically to verify FPGA designs. Figure 5 shows the trends in terms of languages used to create FPGA simulation testbenches, by comparing the 2012 Wilson Research Group study (in red) with the projected verification language adoption trends within the next twelve months (in purple).

Figure 5. Trends in languages used in verification to create FPGA simulation testbenches

In my next blog (click here), I’ll continue the discussion on design and verification language trends as revealed by the 2012 Wilson Research Group Functional Verification Study.


29 July, 2013

Testbench Characteristics and Simulation Strategies

This blog is a continuation of a series of blogs that present the highlights from the 2012 Wilson Research Group Functional Verification Study (for background on the study, click here).

In my previous blog (click here), I focused on the controversial topic of effort spent in verification. In this blog, I focus on some of the 2012 Wilson Research Group findings related to testbench characteristics and simulation strategies. Although I am shifting the focus away from verification effort, I believe that the data I present in this blog is related to my previous blog and really needs to be considered when calculating effort.

Time Spent in full-chip versus Subsystem-Level Simulation

Let’s begin by looking at Figure 1, which shows the percentage of time (on average) that a project spends in full-chip or SoC integration-level verification versus subsystem and IP block-level verification. The mean time performing full chip verification is represented by the dark green bar, while the mean time performing subsystem verification is represented by the light green bar. Keep in mind that this graph represents the industry average. Some projects spend more time in full-chip verification, while other projects spend less time.

Figure 1. Mean time spent in full chip versus subsystem simulation

Number of Tests Created to Verify the Design in Simulation

Next, let’s look at Figure 2, which shows the number of tests various projects create to verify their designs using simulation. The graph represents the findings from the 2007 Far West Research study (in gray), the 2010 Wilson Research Group study (in blue), and the 2012 Wilson Research Group study (in green). Note that the curves look remarkably similar over the past five years. The median number of tests created to verify the design is within the range of (>200 – 500) tests. It is interesting to see a sharp percentage increase in the number of participants who claimed that fewer tests (1 – 100) were created to verify a design in 2012. It’s hard to determine exactly why this was the case—perhaps it is due to the increased use of constrained-random testing (which I will talk about shortly), or perhaps there has been an increased use of legacy tests. The study was not designed to go deeper into this issue and uncover the root cause. This is something I intend to informally study next year through discussions with various industry thought leaders.

Figure 2. Number of tests created to verify a design in simulation

Percentage of Directed Tests versus Constrained-Random Tests

Now let’s compare the percentage of directed testing that is performed on a project to the percentage of constrained-random testing. Of course, in reality there is a wide range in the amount of directed and constrained-random testing that is actually performed on various projects. For example, some projects spend all of their time doing directed testing, while other projects combine techniques and spend part of their time doing directed testing—and the other part doing constrained-random. For our comparison, we will look at the industry average, as shown in Figure 3. The average percentage of tests that were directed is represented by the dark green bar, while the average percentage of tests that are constrained-random is represented by the light green bar.

Figure 3. Mean directed versus constrained-random testing performed on a project

Notice how the percentage mix of directed versus constrained-random testing has changed over the past two years. Today we see that, on average, a project performs more constrained-random simulation. In fact, between 2010 and 2012 there was a 39 percent increase in the use of constrained-random simulation on a project. One driving force behind this increase has been the maturing and acceptance of both the SystemVerilog and UVM standards—since these two standards facilitate an easier implementation of a constrained-random testbench. In addition, today we find that an entire ecosystem has emerged around both the SystemVerilog and UVM standards. This ecosystem consists of tools, verification IP, and industry expertise, such as consulting and training.

Nonetheless, even with the increased adoption of constrained-random simulation on a project, you will find that constrained-random simulation is generally only performed at the IP block or subsystem level. For the full SoC level simulation, directed testing and processor-driven verification are the prominent simulation-based techniques in use today.
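For readers newer to the technique, here is a minimal sketch of constrained-random stimulus in SystemVerilog (the class, fields, and constraints are illustrative): each call to randomize() asks the solver for random values that satisfy the declared constraints.

```systemverilog
// Minimal sketch of constrained-random stimulus in SystemVerilog.
// The class, fields, and constraints are illustrative.
class packet;
  rand bit [7:0]  addr;
  rand bit [31:0] data;
  rand int        len;

  constraint c_len  { len inside {[1:64]}; }   // legal burst lengths
  constraint c_addr { addr[1:0] == 2'b00; }    // word-aligned addresses
endclass

module tb;
  initial begin
    packet p = new();
    repeat (10)
      if (p.randomize())
        $display("addr=%0h data=%0h len=%0d", p.addr, p.data, p.len);
  end
endmodule
```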

Simulation Regression Time

Now let’s look at the time that various projects spend in a simulation regression. Figure 4 shows the trends in terms of simulation regression time by comparing the 2007 Far West Research study (in gray) with the 2010 Wilson Research Group study (in blue), and the 2012 Wilson Research Group study (in green). There really hasn’t been a significant change in the time spent in a simulation regression within the past three years. You will find that some teams spend days or even weeks in a regression. Yet today, the industry median is between 8 and 16 hours, and for many projects, there has been a decrease in regression time over the past few years. Of course, this is another example of where deeper analysis is required to truly understand what is going on. To begin with, these questions should probably be refined to better understand simulation times related to IP versus SoC integration-level regressions. We will likely do that in future studies—with the understanding that we will not be able to show trends (or at least not initially).

Figure 4. Simulation regression time trends

In my next blog (click here), I’ll focus on design and verification language trends, as identified by the 2012 Wilson Research Group study.


22 July, 2013

Effort Spent On Verification (Continued)

This blog is a continuation of a series of blogs that present the highlights from the 2012 Wilson Research Group Functional Verification Study (for a background on the study, click here).

In my previous blog (click here), I focused on the controversial topic of effort spent in verification. This blog continues that discussion.

I stated in my previous blog that I don’t believe there is a simple answer to the question, “how much effort was spent on verification in your last project?” I believe that it is necessary to look at multiple data points to truly get a sense of the real effort involved in verification today. So, let’s look at a few additional findings from the study.

Time designers spend in verification

It’s important to note that verification engineers are not the only project members involved in functional verification. Design engineers spend a significant amount of their time in verification too, as shown in Figure 1.

Figure 1. Average (mean) time design engineers spend in design vs. verification

You might note that design engineers now actually spend more time doing verification than design. This expenditure of time has shifted over the last five years: the amount of time design engineers spend doing verification has increased by 15 percent since 2007, while the amount of time they spend doing design has decreased by about 13 percent.

The designer’s involvement in verification ranges from:

  • Small sandbox testing to explore various aspects of the implementation
  • Full functional testing of IP blocks and SoC integration
  • Debugging verification problems identified by a separate verification team

Percentage of Time Verification Engineers Spend in Various Tasks

Next, let’s look at the mean time verification engineers spend performing various tasks related to their specific project. You might note that verification engineers spend most of their time debugging. Ideally, if all the tasks were optimized, then you would expect this. Yet, unfortunately, the time spent in debugging can vary significantly from project to project, which presents scheduling challenges for managers during a project’s verification planning process.

Figure 2. Average (mean) time verification engineers spend in various tasks

Number of formal analysis, FPGA prototyping, and emulation Engineers

Functional verification is not limited to simulation-based techniques. Hence, it’s important to gather data related to other functional verification techniques, such as the number of verification engineers involved in formal analysis, FPGA prototyping, and emulation.

Figure 3 presents the trends in terms of the number of verification engineers focused on formal analysis on a project. In 2007, the mean number of verification engineers focused on formal analysis on a project was 1.68, while in 2010 the mean number increased to 1.84. For some reason, we did see a slight decrease in the mean number of verification engineers who focus on formal in 2012. Regardless, the curve has been remarkably consistent for the past five years.

Figure 3. Median number of verification engineers focused on formal analysis

Although FPGA prototyping is a common technique used to create platforms for software development, it is also sometimes used by projects for SoC integration verification and system validation. Figure 4 presents the trends in terms of the number of verification engineers focused on FPGA prototyping. In 2007, the mean number of verification engineers focused on FPGA prototyping on a project was 1.42, while in 2010 the mean number was 1.86. In 2012 we saw a slight decline in mean number of verification engineers focused on FPGA prototyping. However, the curve has been remarkably similar for the past five years.

Figure 4. Number of verification engineers focused on FPGA prototyping

Figure 5 presents the trends in terms of the number of verification engineers focused on hardware-assisted acceleration and emulation. In 2007, the mean number of verification engineers focused on hardware-assisted acceleration and emulation on a project was 1.31, while in 2010 the mean number was 1.86. In 2012, we see a slight decrease in the mean number of verification engineers who focus on hardware-assisted acceleration and emulation.

Figure 5. Number of verification engineers focused on emulation

Again, notice how the curve has been consistent over the past five years. In other words, we are not seeing any big trends in terms of increased verification engineers focused predominantly on formal, FPGA prototyping, or hardware-assisted acceleration and emulation. This was certainly not true for general verification engineers who focus on simulation-based techniques, as I presented in my previous blog, where we saw a 75 percent increase in the peak number of verification engineers involved on a project within the past five years.

A few more thoughts on verification effort

So, can I conclusively state that 70 percent of a project’s effort is spent in verification today as some people have claimed? No. In fact, even after reviewing the data on different aspects of today’s verification process, I would still find it difficult to state quantitatively what the effort is. Yet, the data that I’ve presented so far seems to indicate that the effort (whatever it is) is increasing. And there is still additional data relevant to the verification effort discussion that I plan to present in upcoming blogs. However, in my next blog (click here), I shift the discussion from verification effort, and focus on some of the 2012 Wilson Research Group findings related to testbench characteristics and simulation strategies.


8 May, 2013

 Design Trends

In my previous blog, I introduced the 2012 Wilson Research Group Functional Verification Study (click here). The objective of my previous blog was to provide background on this large, worldwide industry study. I will present the key findings from this study in a set of upcoming blogs. 

This blog begins the process of revealing the 2012 Wilson Research Group study findings by first focusing on current design trends.  Let’s begin by examining process geometry adoption trends, as shown in Figure 1.  Here, you will see trend comparisons between the 2007 Far West Research study (gray line), the 2010 Wilson Research Group study (blue line), and the 2012 Wilson Research Group study (green line).

Figure 1. Process geometry trends

Worldwide, the median process geometry from the 2007 Far West Research study was about 90nm, while the median in 2010 was about 65nm. Today, the mean process geometry for a typical project is about 45nm—although you can see that over a third of projects today are designing below 32nm.

In addition to the industry moving to smaller process geometries, the industry is also moving to larger design sizes as measured in number of gates of logic and datapath, excluding memories (which should not be a surprise). Figure 2 compares design sizes from the 2002 Collett study (dark blue line), the 2007 Far West Research study (gray line), the 2010 Wilson Research Group study (light blue line), and the 2012 Wilson Research Group study (green line).

Figure 2. Number of gates of logic and datapath trends, excluding memories

The study revealed that about a third of the non-FPGA designs today are less than 5M gates, while a third range in size between 5M to 20M gates, and about a third of all designs are larger than 20M gates.

It’s important to note here that the data on the mean design size trends does not reflect volume in terms of semiconductor production. For example, you could have fewer projects designing at a small geometry, yet they have higher volume in terms of production.

In Figure 3, I show the mean design size trends between the 2002 Collett study (dark blue line), the 2007 Far West Research study (gray line), the 2010 Wilson Research Group study (light blue line), and the 2012 Wilson Research Group study (green line). Obviously, gate counts have increased over the years, yet a significant number of designs continue to be developed with smaller (and larger) gate counts as indicated by the mean calculation. Another observation is that, as you would expect, the mean gate count trend is essentially following Moore’s law.

Figure 3. Mean design size trends
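As a quick back-of-the-envelope check of that observation (my own illustrative arithmetic, not study data), Moore’s law implies gate counts double roughly every two years:

$$G(t) \approx G(t_0) \cdot 2^{(t - t_0)/2}$$

Over the ten years from 2002 to 2012, that predicts mean design sizes growing by roughly a factor of $2^{10/2} = 32$, which is the order of growth a trend line tracking Moore’s law would show.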

Figure 4 presents the current design implementation trends for non-FPGAs as identified by the survey participants.

 

Figure 4. Non-FPGA current design implementation trends

The data in Figure 4 presents trends in design implementation approaches for non-FPGA designs, ranging from the 2002 Collett study (dark blue bar), the 2004 Collett study (dark green bar), and the 2007 Far West Research study (gray bar), to the 2010 Wilson Research Group study (blue bar) and the 2012 Wilson Research Group study (green bar). Note that the study seems to indicate a downward trend in standard cell design implementation.

Figure 5. FPGA design implementation trends

For the 2012 study, we decided that we wanted to get a sense of the percentage of FPGA projects that target the very complex programmable SoC FPGAs that have recently emerged, which is shown in Figure 5. Examples of these programmable SoC FPGAs include Xilinx’s Zynq, Altera’s Arria/Cyclone, and Microsemi’s SmartFusion.

In my next blog (click here), I’ll continue discussing current design trends, focusing specifically on embedded processors, power, and clock domains.


25 February, 2013

Download the standard now – at no charge!

The IEEE has published the latest update to the SystemVerilog standard.  And courtesy of Accellera, the standard is available for download without charge directly from the IEEE.

IEEE Std 1800-2012

The latest update to the SystemVerilog standard is now ready for download. It joins other EDA standards, like SystemC, in the IEEE Get™ program, which grants public access to view and download current individual standards at no charge as a PDF. (If you wish to have an older, superseded, and withdrawn version of the standard, or if you wish to have a printed copy or a CD-ROM format, you can purchase older and alternate formats from the IEEE for a fee.)

Over the years, Accellera came to understand that many people continued to use the freely available version that seeded the initial IEEE 1800 SystemVerilog standard. Since it is significantly out of date, Accellera collaborated with the IEEE Standards Association to ensure the latest version of the SystemVerilog standard would be freely available in electronic form to all who wish to download it. Accellera hopes all those old 3.1a versions that everyone has and uses can now be placed in the archives.

The new version of the standard should be used by the UVM (Universal Verification Methodology) community as the definitive specification of the SystemVerilog standard upon which UVM is built. It goes very well with the UVM Cookbook and the Coverage Cookbook.
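To give a concrete flavor of the update, one of the features added in the 1800-2012 revision is soft constraints. Here is a minimal sketch (the class and field names are illustrative):

```systemverilog
// Minimal sketch of a soft constraint, one of the additions in
// IEEE 1800-2012. The class and field names are illustrative.
class config_item;
  rand int unsigned timeout;

  // A soft constraint provides a default that a test can override
  // with a regular (hard) constraint at randomization time.
  constraint c_timeout { soft timeout == 100; }
endclass

module example;
  initial begin
    config_item cfg = new();
    void'(cfg.randomize());                        // timeout defaults to 100
    void'(cfg.randomize() with { timeout == 5; }); // hard constraint wins
    $display("timeout = %0d", cfg.timeout);
  end
endmodule
```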

From Mentor’s perspective, it also makes a good companion to the Questa verification platform and complements our latest product update in which we announced support for the IEEE 1800-2012 SystemVerilog standard among other things.

If you have not done so already, download your copy now by clicking here.

