Posts Tagged ‘RTL’

5 August, 2013

Language and Library Trends

This blog is a continuation of a series of blogs that present the highlights from the 2012 Wilson Research Group Functional Verification Study (for a background on the study, click here).

In my previous blog (Part 7 click here), I focused on some of the 2012 Wilson Research Group findings related to testbench characteristics and simulation strategies. In this blog, I present design and verification language trends, as identified by the Wilson Research Group study.

You might note that for some of the language and library data I present, the percentages sum to more than one hundred percent. The reason is that some participants’ projects use multiple languages.

RTL Design Languages

Let’s begin by examining the languages used for RTL design. Figure 1 shows the trends in languages used for design by comparing the 2007 Far West Research study (in gray), the 2010 Wilson Research Group study (in blue), and the 2012 Wilson Research Group study (in green), as well as the design language adoption projected for the next twelve months (in purple) as identified by the study participants. Note that design language adoption is declining for most languages, with the exception of SystemVerilog, whose adoption continues to increase.

Also, it’s important to note that this study focused on languages used for RTL design. We have conducted a few informal studies related to languages used for architectural modeling—and it’s not too big of a surprise that we see increased adoption of C/C++ and SystemC in that space. However, since those studies have (thus far) been informal and not as rigorously executed as the Wilson Research Group study, I have decided to withhold that data until a more formal blind study can be executed related to architectural modeling and virtual prototyping.

Figure 1. Trends in languages used for Non-FPGA design

Let’s now look at the languages used specifically for FPGA RTL design. Figure 2 shows the trends in terms of languages used for FPGA design, by comparing the 2012 Wilson Research Group study (in red) with the projected design language adoption trends within the next twelve months (in purple).

Figure 2. Trends in languages used for FPGA design

It’s not too big of a surprise that VHDL is the predominant language used for FPGA RTL design, although we are starting to see increased interest in SystemVerilog.

Verification Languages

Next, let’s look at the languages used to verify Non-FPGA designs (that is, languages used to create simulation testbenches). Figure 3 shows the trends in terms of languages used to create simulation testbenches by comparing the 2007 Far West Research study (in gray), the 2010 Wilson Research Group study (in blue), and the 2012 Wilson Research Group study (in green).

Figure 3. Trends in languages used in verification to create Non-FPGA simulation testbenches

The study revealed that verification language adoption is declining for most of the languages with the exception of SystemVerilog whose adoption is increasing. In fact, SystemVerilog adoption increased by 8.3 percent between 2010 and 2012.

Figure 4 provides a different analysis of the data by partitioning the projects by design size, and then calculating the adoption of SystemVerilog for creating testbenches by size. The design size partitions are represented as: less than 5M gates, 5M to 20M gates, and greater than 20M gates. Obviously, we find that the larger the design size, the greater the adoption of SystemVerilog for creating testbenches. Yet, probably the most interesting observation we can make from examining Figure 4 is related to smaller designs that are less than 5M gates. Here we see that 58.8 percent of the industry has adopted SystemVerilog for verification. In other words, it is safe to say that SystemVerilog for verification has become mainstream today and not just limited to early adopters or leading-edge design projects.

Figure 4. SystemVerilog (for verification) adoption by design size

Let’s now look at the languages used to verify FPGA designs. Figure 5 shows the trends in languages used to create FPGA simulation testbenches, by comparing the 2012 Wilson Research Group study (in red) with the projected verification language adoption trends within the next twelve months (in purple).

Figure 5. Trends in languages used in verification to create FPGA simulation testbenches

In my next blog (click here), I’ll continue the discussion on design and verification language trends as revealed by the 2012 Wilson Research Group Functional Verification Study.


26 July, 2010

For years one of the objectives in EDA has been to make formal property checking easy to use and its results easy to understand. With the Automatic formal check feature in the June release of the 0-In Formal tool version 3.0, I think we have made significant progress in this area.

The feature, which predefines a set of assertion rules to look for design issues automatically, makes formal technology accessible to users who are not yet ready to write properties in SystemVerilog Assertions (SVA) or the Property Specification Language (PSL). To make problems in the design easier to comprehend, the tool highlights the violations directly in the RTL code.
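For contrast, here is a minimal sketch of the kind of hand-written SVA properties the automatic checks are meant to spare such users from writing; the module, clock, reset, and handshake signal names are hypothetical:

```
module handshake_props (input logic clk, rst_n, req, gnt);
  // A request must be granted within 1 to 4 cycles...
  a_req_gets_gnt:    assert property (@(posedge clk) disable iff (!rst_n)
                                      req |-> ##[1:4] gnt);
  // ...and a grant must never appear without a request in the previous cycle.
  a_no_spurious_gnt: assert property (@(posedge clk) disable iff (!rst_n)
                                      gnt |-> $past(req));
endmodule
```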

Automatic formal check focuses on three areas inadequately addressed by dynamic simulation:

The first area is functional coverage. Today, when constrained random simulation fails to achieve the targeted coverage goal, engineers have to fine-tune the environment or add new tests. These efforts, often attempted relatively late in the verification cycle, can consume vast amounts of time and resources while still failing to reach parts of the design. In contrast, automatic formal check can be used to identify unreachable code early in the verification cycle. These targets can then be eliminated from the coverage model. As a result, the coverage measurement is more accurate and you know when you are done.
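As a hypothetical illustration (the module and signal names are made up), this is the kind of statically unreachable branch a formal reachability check can prove dead so it can be excluded from the coverage model:

```
module dead_branch (input logic a, b, c, output logic [1:0] y);
  always_comb begin
    if (a && b)
      y = 2'd1;
    else if (a && b && c)  // unreachable: this branch is only evaluated when !(a && b)
      y = 2'd2;            // formal analysis can prove this assignment is dead code
    else
      y = 2'd0;
  end
endmodule
```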

The next area is design initialization. If a design cannot be initialized reliably in silicon, it will not function correctly. An obvious precursor, then, is making sure all the registers are initialized correctly at the RTL level. If X’es are used, we need to monitor the X creation, propagation, and usage cycle. Dynamic simulation does not interpret X’es the way silicon does, since silicon has only 1s and 0s. Automatic formal check is ideal for verifying register initialization under different modes or configurations. Then, with internal assertions and formal technologies, we can check that although X’es are created, they are not used by downstream registers.
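A minimal sketch of that X creation/propagation/usage cycle, with an assertion of the kind such a check targets (module and signal names are assumptions for illustration):

```
module x_check (input logic clk, rst_n, input logic [7:0] d, output logic [7:0] q);
  // 'scratch' is intentionally not reset: it powers up as X in RTL
  // (in silicon it would simply hold an arbitrary but known 0/1 value).
  logic [7:0] scratch;

  always_ff @(posedge clk)
    scratch <= scratch + d;          // X creation and propagation

  always_ff @(posedge clk or negedge rst_n)
    if (!rst_n) q <= '0;
    else        q <= scratch;        // X usage: a downstream register consumes the X

  // The property we care about: once reset is released, the architected
  // register q must never take on an unknown value.
  a_no_x_usage: assert property (@(posedge clk) disable iff (!rst_n)
                                 !$isunknown(q));
endmodule
```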

The final area is corner case design issues. Time and time again, designers unintentionally write code that violates logical correctness. Examples include combinational loops, full case violations, parallel case violations, undriven logic, finite-state machine (FSM) deadlocks, and FSM livelocks. Unless tests are written to specifically target these corner cases, such issues are difficult to exercise. On the other hand, by formally analyzing the design semantics, automatic formal check identifies these design issues statically and creates the stimuli to highlight them to the users.
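For example, here is a small, hypothetical FSM with a deadlock of the kind such static analysis can flag without any user-written tests:

```
module fsm_deadlock (input logic clk, rst_n, start, done, error);
  typedef enum logic [1:0] {IDLE, RUN, DRAIN, HALT} state_t;
  state_t state;

  always_ff @(posedge clk or negedge rst_n) begin
    if (!rst_n)
      state <= IDLE;
    else
      unique case (state)
        IDLE:  state <= start ? RUN  : IDLE;
        RUN:   state <= error ? HALT : (done ? DRAIN : RUN);
        DRAIN: state <= IDLE;
        HALT:  state <= HALT;   // deadlock: no arc ever leaves HALT
      endcase
  end
endmodule
```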

If you are interested in learning more about the automatic formal check feature in 0-In Formal, please feel free to register for our upcoming seminar in San Jose.

 


7 June, 2010

After spending years verifying ASICs with dynamic simulation, I started working on static verification 10 years ago in a startup called 0-In Design Automation. I firmly believe that static verification can complement dynamic simulation. Static verification uses synthesis and formal technologies to find bugs in the design. It does not rely on simulation stimulus. You do not need to exercise the bugs, propagate the results, and check the outputs to detect them.

Static verification includes RTL lint, static checks, formal checks, and automated and assertion-based formal property checking. To read more on static verification, take a look at my white paper, Getting Started With Static Verification. If you are interested in formal methods, take a look at Harry Foster’s white paper, Why Now for Formal Property Checking. Both can be found in the Knowledge Center of DAC.com.

In the future, we are going to talk about individual static verification technologies and their applications in areas such as RTL verification, clock domain crossing verification, low power verification, and timing constraint verification. Your feedback and comments are most welcome.


22 April, 2010

Noted EDA analyst and guru Gary Smith delivered the keynote address: “ESL: Where We Are and Where We’re Going”

OSCI sponsored the first annual SystemC Day at DVCon 2010. The presentations were video recorded and are available for free for those who missed DVCon or who may wish to see them again. Gary Smith’s presentation (registration required) and OSCI chair Eric Lish’s OSCI Update lead the video set from SystemC Day.

The 12th North American SystemC Users Group (NASCUG) meeting was part of SystemC Day at DVCon and featured technical presentations on architectural modeling, verification, and analog/mixed-signal design using SystemC.

OSCI ON YOUTUBE: More videos of users and their perspectives on SystemC events and activities can be found via the OSCI channel on YouTube: http://www.youtube.com/officialsystemc

Click here for a complete listing of the following technical presentations.

The Metaport: A Technique for Managing Code Complexity
Jack Donovan, HighIP Design, Texas, USA

The metaport is a coding style that can effectively manage code complexity for complex ESL models, especially models that are intended for high-level synthesis. This presentation will give an overview of the metaport concept and dive into the details of a possible implementation.

OCP Socket Modeling with TLM-2.0
Hervé Alexanian, Sonics Inc., California, USA

This presentation discusses work performed by the OCP-IP committee, specifically OCP socket modeling built upon the TLM-2.0 standard.

ADL Synthesis using ArchC
Samuel Goto, Master’s student at UNICAMP, Brazil

The design and implementation of processors is a complex task. Architecture Description Languages (ADLs) were created to extend existing HDLs and to ease the process of developing and prototyping an architecture by providing a set of tools and algorithms that optimize and automate some of the tedious parts. While much has been done at the specification and business levels of ADLs, there is a large gap between ADL specifications/simulators and real-life processors written in RTL. This project addresses the issues of bringing an ADL description down to the RTL level, and reports on the development of an extension of ArchC to support this level.

Look Ma, No Clocks! Improving Model Performance
David Black, XtremeEDA, Texas, USA

This tutorial-style presentation illustrates some techniques to avoid the inclusion of clocks in SystemC simulations and provides results of simple experiments showing the simulation performance benefits. Concepts discussed include synchronization, clock-free timers, and the effects of clocks on performance. A proposal is made for a simple SystemC class that can simplify coding when clocks are thought to be needed.

TLM-driven Design and Verification Methodology
Brian Bailey, independent consultant

SystemC is well on the road to adoption in a number of areas within the Electronic System Level (ESL) space, but many of those areas are separate islands today. Virtual prototyping has seen a huge leap forward with the standardization of TLM-2.0. SystemC is also being used successfully for high-level synthesis at the module level, but to make SystemC pervasive, there must be a link between these applications. In addition, to reap the maximum productivity gains from a migration to a higher level of abstraction, the verification methodology must also change in significant ways. In this presentation we will explore a new TLM-driven design and verification methodology that is being developed within Cadence, critiqued by its customers, and documented in a book that will be released over a number of months as pieces of it mature.


11 September, 2009

I have lots of blog entries about 95% ready to publish. This entry is from an e-mail I wrote a few months ago when somebody asked about SystemVerilog coding guidelines. I thought it would make a good article. It’s been sitting as a draft because I always have trouble finding the right title or opening words to catch people’s attention. I couldn’t come up with anything better, so here it is: SystemVerilog Coding Guidelines

My official position about this is that you pick a style and stick to it.

Which style you pick is far less important than developing the culture required to follow it. People can spend hours arguing over the merits of camelCaps versus under_score, but once the decision is made, people on the project need to document it, adhere to it, and police it. Otherwise you’re just wasting everyone’s time.

Why are coding guidelines so important? So someone else can read your code, or maybe so you can read your own code six months after you wrote it. Good comments and documentation are an important part, but you still need to be able to read the code.

This concept of reading someone else’s code without understanding the underlying conventions reminds me of the classic short article “Meihem In Ce Klasrum” by Dolton Edwards. The article was written as a satirical response to George Bernard Shaw’s final wishes to simplify the English language. See how easy the last paragraph is to read once you understand the new writing conventions. Reading your code should be just like that (although not as cryptic).

My advice would be to pick a style from industry and adapt it to fit SystemVerilog. For example, from Google: http://google-styleguide.googlecode.com/svn/trunk/cppguide.xml or from the Free Software Foundation: http://www.gnu.org/prep/standards/

All of these rules for naming conventions, indentation, and file organization still apply to SystemVerilog. Additional rules would cover unsupported constructs, which should be listed in the release notes for the tool(s) you are using. Finally, there are a few constructs to avoid, like program blocks and wildcard indexes for associative arrays, each of which might be worth hours of debate on its own.
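As a small, hypothetical illustration of one such guideline (the array names are made up), compare a wildcard-indexed associative array with one whose index type is spelled out:

```
module guideline_example;
  // Discouraged: the index type is unspecified, which obscures intent and
  // has historically limited portability across tools.
  int scoreboard_wild [*];

  // Preferred: the index type is explicit, so readers and tools know
  // exactly what kind of key is legal.
  int scoreboard [bit [31:0]];

  initial begin
    scoreboard[32'h1000_0000] = 42;
    foreach (scoreboard[addr])
      $display("addr=%h data=%0d", addr, scoreboard[addr]);
  end
endmodule
```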

Dave Rich


7 July, 2009

I’ve been around simulation and synthesis languages for a while; back when you needed an NDA to see the Verilog LRM, and again with SUPERLOG, the predecessor to SystemVerilog.  It’s easy for those like me to get caught up in the features of the language and forget that any programming language is just a tool. With any technology, people pick the tools they think will get the job finished most effectively. Tools evolve to meet the challenges and requirements of their users. Verilog and VHDL have clearly evolved to become the prevailing languages for hardware design.

But before the language wars came the methodology wars. At the time when Verilog and VHDL were being introduced in the late 1980s, most hardware design was done by schematic gate-level entry. We would come to our clients with our simulators and synthesis tools and try to change their design methodology by writing RTL. They would bring their best engineers to compete with our tools, and the engineers would always win by producing a design with better area and timing! However, once the productivity of synthesizing large designs with practical quality of results prevailed over the manual effort, the methodology shift was an easier sell.

Fast forward a decade: although design sizes have increased exponentially as predicted by Moore’s Law, RTL design has not changed in any significant way during that time. Why? Because it takes the same number of lines of RTL code to write an 8-bit adder as it does a 64-bit adder. :) OK, so that’s an over-simplification, but the number of lines of RTL code written by a single design engineer has remained manageable. However, for every registered bit added to the design, the state space doubles, and the transition space for testing all permutations of the state space quadruples. Again, that’s a simplification, but it’s the correct order of magnitude.
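To make the arithmetic concrete: with n state bits there are at most 2^n reachable states and at most (2^n)^2 = 4^n state-to-state transitions, so adding one register bit doubles the former and (at most) quadruples the latter:

\[
\mathrm{states}(n+1) = 2^{n+1} = 2 \cdot 2^{n},
\qquad
\mathrm{transitions}(n+1) \le \bigl(2^{n+1}\bigr)^{2} = 4 \cdot 4^{n}.
\]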

During that period, a number of technologies have addressed the increasing complexities of verification, such as constrained random generation, coverage driven verification, and object-oriented programming. These technologies require a change in verification methodology from writing a linear set of test patterns.

Winning verification engineers over to these new methodologies is taking the same path as it did with those earlier design engineers. A single verification engineer may find that a single directed test exercises a specific piece of functionality much more quickly than a constrained random test reaches that same function. But eventually, a constrained random test will exercise more functionality faster than an engineer can write individual tests. Functional coverage fits into constrained random generation to help measure the quality of your tests by telling you whether your constraints are working to exercise the functionality you are required to hit. It takes the randomness out of constrained random generation.
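A minimal sketch of how the two fit together, assuming a made-up packet stimulus class (all names and ranges here are illustrative, not taken from any particular methodology):

```
class packet;
  rand bit [1:0]  kind;     // 0: data, 1: ctrl, 2: mgmt
  rand bit [10:0] length;

  // Constraints steer random generation toward legal traffic...
  constraint c_legal {
    kind inside {[0:2]};
    (kind == 1) -> length inside {[1:64]};     // control packets are short
    (kind != 1) -> length inside {[1:1500]};
  }

  // ...and the covergroup measures whether that traffic actually exercised
  // the kinds and lengths we are required to hit.
  covergroup cg;
    coverpoint kind   { bins data = {0}; bins ctrl = {1}; bins mgmt = {2}; }
    coverpoint length { bins small = {[1:64]}; bins med = {[65:512]};
                        bins large = {[513:1500]}; }
  endgroup

  function new();             cg = new();                     endfunction
  function void sample_cov(); cg.sample();                    endfunction
  function real get_cov();    return cg.get_inst_coverage();  endfunction
endclass

module tb;
  initial begin
    packet p = new();
    repeat (1000) begin
      void'(p.randomize());
      p.sample_cov();
    end
    $display("functional coverage = %0.1f%%", p.get_cov());
  end
endmodule
```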

Since the verification environment is more software design than hardware, Object-Oriented Programming is a technology that helps you write re-usable code, which in turn keeps your verification code manageable.

SystemVerilog has become the prevailing language that incorporates all these technologies. But does that mean the need for writing directed tests goes away? No. Designers still lay out transistors or gates by hand where it’s critical to their project. Verification engineers should use the technology that is best suited to verify their design.

Firmware tests will continue to be written in C/C++, and SystemVerilog’s DPI can help link C-based tests to the RTL. SystemC has become the prevailing modeling language for DSP and algorithmic design. Most simulation tools can seamlessly link those models to RTL to either drive the test or compare results.
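A minimal sketch of that DPI linkage, assuming a hypothetical C routine run_firmware_test() and a trivial register-write bus (none of these names come from any particular tool or testbench):

```
module dpi_bridge (input logic clk, rst_n);
  logic [31:0] bus_addr, bus_wdata;
  logic        bus_wr;

  // The C side implements: int run_firmware_test(void); it calls reg_write()
  // back through the DPI export below as the firmware-style test executes.
  import "DPI-C" context task run_firmware_test();
  export "DPI-C" task reg_write;

  task reg_write(input bit [31:0] addr, input bit [31:0] data);
    @(posedge clk);
    bus_addr  <= addr;
    bus_wdata <= data;
    bus_wr    <= 1'b1;
    @(posedge clk);
    bus_wr    <= 1'b0;
  endtask

  initial begin
    bus_wr = 1'b0;
    @(posedge rst_n);        // wait for reset release, then hand control to C
    run_firmware_test();
  end
endmodule
```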

So next time you’re thinking about which language to choose to verify your design, step back and think about the methodology first.

Dave Rich

