Posts Tagged ‘Coverage’

7 May, 2014

My Feb. 4 post introduced Mentor Graphics’ three-step FPGA verification process, intended to help design teams get out of the reprogrammable lab more effectively. Since then, I’ve engaged FPGA vendors, design managers and engineers to explain the process, paying special attention to the merits and technical details of injecting automation into any FPGA verification environment, which is the hallmark of Mentor’s process. The feedback from these conversations helped me develop a series of technical webinars, now available for free and on demand. Check them out and let us know what you think in the comments below. My hope is that the webinars might serve as a starting point for your own conversations on FPGA verification, demand for which seems to keep growing as process nodes shrink.

Injecting Automation into Verification – FPGA Market Trends

Injecting Automation into Verification – Code Coverage

Injecting Automation into Verification – Assertions

Injecting Automation into Verification – Improved Throughput


4 February, 2014

Marketing teams at FPGA vendors have been busy as the silicon nanometer geometry race escalates. Altera is “delivering the unimaginable” while Xilinx is offering “all programmable SoCs” to design centers. It’s clear that the SoC has become more accessible to a broader market today and that FPGA vendors have staked out a solid technology roadmap for the near future. But do the marketing messages surrounding the geometry race affect the day-to-day life of engineers, and if so, how – especially when it comes to verification?
An excellent whitepaper from Altera, “The Breakthrough Advantage for FPGAs with Tri-Gate Technology,” covers Altera’s Stratix 10 FPGAs and SoCs. The paper describes the verification challenges of this newly expanded market this way: “Although current generation FPGAs require a rigorous simulation verification methodology rivaling ASICs, the additional lab testing and ability to reprogram FPGAs save substantial manpower investment. The overall cost of ownership must be considered when comparing an FPGA whose component price is higher than an ASIC of similar complexity.” I believe you can use this statement to engage your management in a discussion about better verification processes.

Xilinx also has excellent published technical resources. Its recent UltraScale backgrounder describes how the company is solving the challenges of implementing a design with its reprogrammable silicon. Clearly Xilinx has made an impressive investment in making it easier to implement a design with its UltraScale FPGA products. Improvements include ASIC-like clocking and the annealing of dataflow bottlenecks without compromising performance. Xilinx also describes improvements when using its Vivado design suite, particularly when it comes to in-lab design bring-up.

For other FPGA insights, it’s worth checking out Electronics Engineering Journal’s recent article “Proliferating Programmability in 2014,” which argues that the long-term future of FPGAs hinges on tool flows even though, as Kevin Morris sees it, EDA seems to have abandoned the market. (Kevin, I’m here to tell you you’re wrong.)

Do you think it’s inevitable that your FPGA team will first struggle to make it across the verification finish line before adopting a more process-oriented verification flow like the ASIC market demands? It’s not. I base this conclusion on the many conversations I’ve had over the years with FPGA designers, their managers, sales engineers and many other talented people in this market. Yes, there are significant challenges in FPGA design, but not all of them are technology related. With some emotion, one engineer remarked that debugging the same type of issue over and over in the hardware lab and expecting a different outcome was insane. (He’s right.) Others say they need specific ROI information for their management to even accept their need for change. Still others state that had they only known about the solutions I discussed in my seminar a year earlier, they would not have spent months and months bringing up their design in the lab.
With my peers here at Mentor Graphics, I have developed a three-step verification flow that includes coverage, assertions and improved throughput. I’ll write about this flow and related issues in the weeks ahead here on this blog. The flow is built on fundamental verification technologies that benefit the broad FPGA market. The goal, in developing the technology and writing about it here, has been to provide practical solutions and help more FPGA teams close the verification gap.

In the meantime, what are your stories? Are you able to influence your management into adopting advanced technology to aid lab bring-up? Does your management’s bias run towards lower cost and faster implementation (at the expense of verification)? Let me know in the comments or, if you prefer, by e-mail: joe_rodriguez@mentor.com.


19 August, 2013

Verification Techniques & Technologies Adoption Trends

This blog is a continuation of a series of blogs that present the highlights from the 2012 Wilson Research Group Functional Verification Study (for background on the study, click here).

In my previous blog (Part 9 click here), I focused on some of the 2012 Wilson Research Group findings related to design and verification language and library trends. In this blog, I present verification techniques and technologies adoption trends, as identified by the 2012 Wilson Research Group study.

An interesting trend we are starting to see is that the electronics industry is maturing its functional verification processes, whether designs are targeted at IC/ASIC or FPGA implementations. This blog provides data to support this claim. An interesting question you might ask is, “What is driving this trend?” In some of my earlier blogs (click here for Part 1 and Part 2) I showed that design complexity is increasing in terms of design size and the number of embedded processors. In addition, I’ve presented trend data showing an increase in total project time and effort spent in verification (click here for Part 5 and Part 6). My belief is that the industry is being forced to mature its functional verification processes to address increasing complexity and effort.

Simulation Techniques Adoption Trends

Let’s begin by comparing non-FPGA adoption trends related to various simulation techniques from the 2007 Far West Research study (in blue) with the 2012 Wilson Research Group study (in green), as shown in Figure 1.

Figure 1. Simulation-based technique adoption trends for non-FPGA designs

You can see that the study finds the industry increasing its adoption of various functional verification techniques for non-FPGA targeted designs. Clearly the industry is maturing its processes as I previously claimed.

For example, in 2007, the Far West Research Group found that only 48 percent of the industry performed code coverage. This surprised me. After all, HDL-based code coverage is a technology that has been around since the early 1990s. However, I did informally verify the 2007 results through numerous customer visits and discussions. In 2012, we see that industry adoption of code coverage has increased to 70 percent.
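
For readers newer to the technique, here is a minimal sketch of what code coverage actually measures. Given a hypothetical RTL fragment like the one below, a simulator with code coverage enabled reports, with no instrumentation written by the engineer, which statements and branches the test suite exercised:

```systemverilog
// Hypothetical ALU fragment. Code coverage tools report, without any
// user-written instrumentation, which of these lines and branches the
// tests actually exercised.
module mini_alu (
  input  logic [1:0] op,
  input  logic [7:0] a, b,
  output logic [7:0] y
);
  always_comb begin
    unique case (op)
      2'b00:   y = a + b;  // statement coverage: was this line ever hit?
      2'b01:   y = a - b;
      2'b10:   y = a & b;
      default: y = a ^ b;  // branch coverage: was the default arm ever taken?
    endcase
  end
endmodule
```

If no test ever drives op to 2'b11, branch coverage flags the default arm as unexercised, which is exactly the kind of hole lab debug tends to find the hard way.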

In 2007, the Far West Research Group study found that 37 percent of the industry had adopted assertions for use in simulation. In 2012, we find that industry adoption of assertions had increased to 63 percent. I believe that the maturing of the various assertion language standards has contributed to this increased adoption.
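
To make the term concrete, here is a minimal sketch of a concurrent SystemVerilog assertion, assuming a simple request/grant handshake; the signal names and the four-cycle bound are illustrative, not from any particular design:

```systemverilog
// Hypothetical handshake check: every request must be granted
// within 1 to 4 clock cycles after it is asserted.
module req_gnt_checker (
  input logic clk,
  input logic rst_n,
  input logic req,
  input logic gnt
);
  property p_req_gets_gnt;
    @(posedge clk) disable iff (!rst_n)
      req |-> ##[1:4] gnt;  // implication: if req, then gnt in 1..4 cycles
  endproperty

  a_req_gets_gnt: assert property (p_req_gets_gnt)
    else $error("req was not granted within 4 cycles");
endmodule
```

Because the property is evaluated on every clock, one assertion like this can catch a protocol violation millions of cycles into a run and report it at the moment it occurs, rather than when corrupted data finally surfaces at an output.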

In 2007, the Far West Research Group study found that 40 percent of the industry had adopted functional coverage for use in simulation. In 2012, industry adoption of functional coverage had increased to 66 percent. Part of this increase has been driven by the increased adoption of constrained-random simulation, since you really can’t do constrained-random simulation effectively without doing functional coverage.
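
The dependency works like this: constrained-random tests decide what stimulus to generate, and functional coverage is the only feedback that tells you which of the interesting cases that stimulus actually reached. Below is a minimal sketch; the hypothetical packet class, its bins and the 100-iteration loop are illustrative:

```systemverilog
// Hypothetical constrained-random stimulus plus the covergroup that
// reports which of the legal combinations were actually generated.
class packet;
  rand bit [1:0] kind;   // four packet types
  rand bit [9:0] len;
  constraint c_len { len inside {[1:512]}; }

  covergroup cg;
    cp_kind: coverpoint kind;
    cp_len : coverpoint len { bins small = {[1:64]};
                              bins large = {[65:512]}; }
    cp_x: cross cp_kind, cp_len;  // every kind with both length ranges?
  endgroup

  function new();
    cg = new();
  endfunction

  function void sample();
    cg.sample();
  endfunction
endclass

module tb;
  initial begin
    packet p = new();
    repeat (100) begin
      void'(p.randomize());
      p.sample();  // without this feedback, random tests run blind
    end
  end
endmodule
```

Without the covergroup, the testbench would still run, but there would be no way to know whether the constraints ever produced, say, a large packet of every kind, which is why the two techniques tend to be adopted together.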

Now let’s compare FPGA adoption trends related to various simulation techniques from the 2010 Far West Research study (in pink) with the 2012 Wilson Research Group study (in red), as shown in Figure 2.

Figure 2. Simulation-based technique adoption trends for FPGA designs

Again, you can clearly see that the industry is increasing its adoption of various functional verification techniques for FPGA targeted designs. This past year I have spent a significant amount of time in discussions with FPGA project managers around the world. During these discussions, most managers mention the drive to improve verification processes within their projects due to the rising complexity of this class of designs. The Wilson Research Group data supports these claims.

In fact, Figure 3 illustrates this maturing trend in the FPGA space, where we saw a 15 percent increase in the adoption of RTL simulation and an 8.5 percent increase in the adoption of code coverage. For complex FPGA designs, the traditional approach of “burn and churn” and debug in the lab is no longer a viable option. Nonetheless, it is still somewhat alarming that 31 percent of the FPGA study participants work on projects that perform no RTL simulation.

Figure 3. FPGA projects maturing their verification processes

Signoff Criteria Trends

Earlier in this blog we saw the industry’s increased adoption of coverage techniques. Coverage has become a major component of a project’s verification signoff criteria. In Figure 4, we see how coverage has increased in importance within verification signoff criteria over the past five years, while other decision attributes have declined in importance.

Figure 4. Non-FPGA functional verification signoff criteria trends

We see the same trends for FPGA designs, as shown in Figure 5.

Figure 5. FPGA functional verification signoff criteria trends

In my next blog (click here), I plan to continue the discussion related to adoption of various verification technologies and techniques as identified by the 2012 Wilson Research Group study.


20 November, 2012

Verification Academy Adds Major New Technical Resource

The Verification Academy adds another major methodology cookbook, this one focused on effective coverage adoption.  The Coverage Cookbook describes the different types of coverage available to track your verification progress, explains how to create a functional coverage model from a specification, and provides examples of implementing functional coverage for different types of designs.

Verification Academy “full access” members have access to the free Coverage Cookbook and the UVM/OVM Cookbooks as well.  Are you a registered full access member?  If not, register now to become a full access member.  (Restrictions apply.)

Coverage is not a new topic.  It was one of the major additions to the SystemVerilog (IEEE Std. 1800™-2009) standard.  But the SystemVerilog functional coverage extensions were left to the verification engineer to use in such a way as to return meaningful measurements of how much of the design specification was being tested.  The Universal Verification Methodology (UVM) offers greater structure for coverage than bare SystemVerilog, but it, too, is still only a piece of the puzzle.
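
As an illustration of the structure UVM adds, here is a minimal sketch of a coverage collector built as a uvm_subscriber, so that any monitor’s analysis port can feed it transactions; the my_txn transaction class and its addr/is_write fields are hypothetical stand-ins:

```systemverilog
`include "uvm_macros.svh"
import uvm_pkg::*;

// Hypothetical transaction type, for illustration only.
class my_txn extends uvm_sequence_item;
  rand bit        is_write;
  rand bit [15:0] addr;
  `uvm_object_utils(my_txn)
  function new(string name = "my_txn"); super.new(name); endfunction
endclass

// Coverage collector: connects to any uvm_analysis_port#(my_txn).
class my_coverage extends uvm_subscriber #(my_txn);
  `uvm_component_utils(my_coverage)

  my_txn tr;

  covergroup cg;
    cp_dir : coverpoint tr.is_write;
    cp_addr: coverpoint tr.addr { bins lo = {[0:'h7FFF]};
                                  bins hi = {['h8000:'hFFFF]}; }
  endgroup

  function new(string name, uvm_component parent);
    super.new(name, parent);
    cg = new();
  endfunction

  // Called automatically for every transaction a connected monitor broadcasts.
  function void write(my_txn t);
    tr = t;
    cg.sample();
  endfunction
endclass
```

Because write() is invoked for every broadcast transaction, coverage collection needs no hooks in the stimulus code, which is exactly the separation of concerns the methodology is meant to provide.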

As verification teams have come to generate ever greater amounts of information from their use of SystemVerilog, UVM and other verification tools, the data from verification runs needs to be easy to use to drive coverage closure.  Within the Mentor Graphics Questa verification platform, this resulted in the development of the Unified Coverage Database (UCDB) and associated verification management and planning features.

Since verification teams use a variety of tools and technology from many sources, it was imperative that verification information be easily shared and combined to help drive faster coverage closure across the industry.  This is why Mentor Graphics donated its UCDB API to Accellera, where it became the Unified Coverage Interoperability Standard (UCIS).

It would be great to think that we are done; but we’re not.  Tools and data are just two of the three dimensions of any IC design project; a comprehensive approach to verification management adds the third.  The Mentor Graphics Questa Verification Management features address all three.

Now the question is how best to adopt and use all the capabilities at hand, from the standards to the verification technology at your fingertips.

The Verification Academy Coverage Cookbook is one of the important tools you now have to pull all this information into a single place where you can learn the theory and put that theory into practice.  The Coverage Cookbook is much like the OVM/UVM Cookbooks in that it is web-friendly while also letting you generate a PDF file of the whole document, in case you want a printed copy or offline reference.

The Theory section covers:

  • What is coverage?
  • Kinds of coverage
  • Code Coverage
  • Functional Coverage
  • Specification to coverage
  • Coding for analysis

The Practice section shows three examples you can use today:

  • Bus protocol coverage using ARM® APB3 (see the sketch after this list)
  • Block level coverage using UART
  • Datapath coverage using BiQuad IIR Filter
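
To give a flavor of the first example, below is a minimal sketch of APB3-style protocol coverage; the cookbook’s actual code differs, and the interface and bins here are illustrative. Sampling happens on the cycle that completes an APB3 transfer, i.e. when PSEL, PENABLE and PREADY are all high:

```systemverilog
// Hypothetical APB3 protocol coverage: cover transfer direction crossed
// with slave response, sampled when each transfer completes.
interface apb3_cov_if (input logic pclk, psel, penable, pwrite,
                       pready, pslverr);
  covergroup cg @(posedge pclk iff (psel && penable && pready));
    cp_dir : coverpoint pwrite  { bins read   = {0};
                                  bins write  = {1}; }
    cp_resp: coverpoint pslverr { bins okay   = {0};
                                  bins slverr = {1}; }
    cp_x: cross cp_dir, cp_resp;  // all four direction/response combinations?
  endgroup
  cg cg_inst = new();
endinterface
```

The cross answers a question a directed test plan can easily miss: were both reads and writes exercised against both okay and error responses?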

The Coverage Cookbook is a live document, and you can expect continued extensions and contributions to enhance it.  As Harry Foster, Mentor Graphics’ Chief Scientist Verification, put it, “Methodology is the bridge between tools and technologies, which creates a productive, predictable, and repeatable solution.”  We should expect that our collective use of this technology will help hone the methodology at the heart of the Coverage Cookbook.  And with this use, we should expect the Coverage Cookbook to evolve as we achieve greater verification productivity.

Let us know what you think about the Coverage Cookbook and what we might be able to do to improve it.  In the meantime, Happy Coverage Closing!

