Posts Tagged ‘vhdl’

7 October, 2015

Design and verification flows are multifaceted and predominantly built by bringing together tools and technology from multiple sources. The tools from these sources build upon IEEE standards – several IEEE standards. What started with VHDL (IEEE 1076™) and Verilog/SystemVerilog (IEEE 1800™) and their documented interfaces has grown. As more IEEE standards have emerged, and as tools and technology combine these standards in innovative and differentiated ways, the industry benefits from an ongoing open and public discussion on interoperability. The IEEE Standards Association (IEEE-SA) continues the tradition started by my friends at Synopsys with the IEEE-SA EDA & IP Interoperability Symposium. And for 2015, I'm pleased to chair the event.

Anyone working on or using design and verification flows that depend on tool interoperability as well as design and verification intellectual property (IP) working together will benefit from attending this symposium.  The symposium will be held Wednesday, 14 October 2015, at the offices of Cadence Design Systems in San Jose, CA USA.  You can find more information about the event at the links below:

  • Register: Click here.
  • Event Information: Click here.
  • Event Program: Click here.

A keynote presentation by Dan Armbrust, CEO of Silicon Catalyst, opens the event with a talk on "Realizing the next growth wave for semiconductors – A new approach to enable innovative startups."  If you are one of the Silicon Valley innovators, you might like to hear what Dan shares on this next growth wave. From my perspective, I suspect it will include being more energy conscious in how we design. Sessions on current and emerging IEEE standards that address those energy concerns will follow. We will review the conclusions from the DAC Low Power Workshop, and leadership from the IEEE low power standards groups will discuss what they are doing in the context of that workshop.

We then take a lunch break and celebrate 10 years of SystemVerilog. The first IEEE SystemVerilog standard (IEEE Std. 1800™-2005) was published in November 2005, so it seems fitting that we celebrate this accomplishment. Joining many of the participants in the IEEE SystemVerilog standardization effort for this celebration will be participants from the Accellera group that incubated it before it became an IEEE standard. We won't stop with just celebrating SystemVerilog. We will also share information on standards projects that have leveraged SystemVerilog, like UVM, which has recently become a full-fledged IEEE standards project (IEEE P1800.2). With so many people present who have worked on completed and successful IEEE standards, Accellera offered to bring its Portable Stimulus Working Group members over for a lunch break during their three-day face-to-face Silicon Valley meeting to mingle with them, learn from them and, we hope, be inspired by them as well. Maybe some of the success of building industry-relevant standards can be shared between the SystemVerilog participants and Accellera's newer teams.

We will then return to a focus on energy-related issues, with our first topic area being power modeling for IP. Chris Rowen, Cadence Fellow, will take us through some recent experiences with issues his teams have faced in driving ever higher levels of power efficiency from designs that use ever more design IP. Tales from the trenches never get old, and they offer us insight into what we might do in the development of better standards to help address those issues. While Chris will point to a lot of issues when it comes to the use of design IP, I believe these issues are only compounded when it comes to the Internet of Things (IoT). We have assembled a great afternoon panel to discuss whether IoT is the "ultimate power challenge." I can't wait to hear what they say.

Lastly, when we pull all these systems together, LSI-package-board issues pose a design interoperability challenge as well. The IEEE Computer Society's (CS) Design Automation Standards Committee (DASC) has completed another standard developed primarily outside of North America. The DASC has a long history of global participation and significant standards development outside of North America, as is the case for VHDL-AMS (IEEE 1076.1). We will hear from the IEEE 2401™-2015 leadership on their newly minted IEEE standard and the LSI-package-board issues it addresses.

We don't have time to highlight all the EDA & IP standards work in the IEEE, but our principal theme of addressing issues of power in modern design and verification led us to focus on a subset of them. So, if your favorite standard or topic area does not appear in the program, let me know and we can add it to our list to consider next year. And when I say "we," the work to put together an event like this takes a lot of people, all of whom are interested in what we should do next year and in your input. For me, in addition to working to collect that input, I also need to thank those who did all the work to make this happen. I've often said that, as chair, you let the others do all the work. It has been great to collaborate with my IEEE-SA friends and my peers at the other two Big-3 EDA companies. It has also been great to get the input and advice of a Steering Committee that includes two of the world's largest silicon suppliers (Intel & TSMC) and, for the first time, support from standards incubators Accellera Systems Initiative and Si2.


27 July, 2015

ASIC/IC Language and Library Adoption Trends

This blog is a continuation of a series of blogs related to the 2014 Wilson Research Group Functional Verification Study (click here).  In my previous blog (click here), I presented our study findings on various verification technology adoption trends. In this blog, I focus on language and library adoption trends.

As previously noted, the reason some of the results sum to more than 100 percent is that some projects are using multiple languages; thus, individual projects can have multiple answers.

Figure 1 shows the adoption trends for languages used to create RTL designs. Essentially, the adoption rates for all languages used to create RTL designs are projected to be either declining or flat over the next year, with the exception of SystemVerilog.


Figure 1. ASIC/IC Languages Used for RTL Design

Figure 2 shows the adoption trends for languages used to create ASIC/IC testbenches. Essentially, the adoption rates for all languages used to create testbenches are either declining or flat, with the exception of SystemVerilog. Nonetheless, the data suggest that SystemVerilog adoption is starting to saturate or level off at about 75 percent.


Figure 2. ASIC/IC Languages Used for Verification (Testbenches)

Figure 3 shows the adoption trends for various ASIC/IC testbench methodologies built using class libraries.


Figure 3. ASIC/IC Methodologies and Testbench Base-Class Libraries

Here we see a decline in adoption of all methodologies and class libraries with the exception of Accellera's UVM, whose adoption increased by 56 percent between 2012 and 2014. Furthermore, our study revealed that UVM is projected to grow an additional 13 percent within the next year.

Figure 4 shows the ASIC/IC industry adoption trends for various assertion languages, and again, SystemVerilog Assertions seems to have saturated or leveled off.


Figure 4. ASIC/IC Assertion Language Adoption

In my next blog (click here) I plan to present the ASIC/IC design and verification power trends.

Quick links to the 2014 Wilson Research Group Study results


4 June, 2015

Learn more about DDA at DAC

At DAC, Mentor Graphics and Cadence Design Systems are coming together to usher in another level of productivity in verification results data access and portability with a modern design debug data application programming interface standard. We call this emerging standard the Debug Data API, or DDA for short. We want to share more details with you in person at DAC. Join us on Tuesday, June 9th, at the Verification Academy Booth (#2408) at 5:00pm for a joint presentation and unveiling. And to get a bit of background and a hint of what's to come, please read on.

History: It Started with VCD

In the beginning we had VCD as the universal standard format for exchanging simulation results, defined as part of the IEEE 1364 (Verilog) standard. Anyone trying to use VCD today on large SoCs or complex FPGAs knows the size of VCD files has all but excluded this portion of the IEEE standard from modern design verification practice. So the question is: when will it be replaced?

To ask that question today seems fine. But I was skeptical even in the mid-1990s, when we at Mentor Graphics created Extended VCD to support the IEEE 1076.4 (VITAL) gate-level simulation standard. At that time the largest designs were around 1 million gates. While Extended VCD never became an official IEEE standard, we shared it with our ASIC vendor and FPGA partners along with our major competitors to ensure that debug data access and portability for VITAL users was on par with Verilog. But Extended VCD suffers the same fate: it is almost impossible to use on modern large designs.

Today: VCD Replaced by a Proprietary World

VCD and Extended VCD have remained static for about 20 years. But commercial simulator, emulator and other verification technology suppliers have not stopped innovating to advance support for larger design sizes with larger result data sets. As we move to 2 billion gate designs and beyond, the dependence on these private and closed technological advances and innovations has never been more important.

But that proprietary dependence comes with a cost. We stand at a crossroads: either consumers of verification results lose the open and unencumbered access VCD offered, or they need a path forward that preserves their current benefits while protecting and encouraging the producers of that information to continue to innovate by private means. The only alternative is a fully integrated solution from a single supplier, which rarely gets consumer endorsement in a best-of-breed, mix-and-match world.

Near Future: Federating the Proprietary World with DDA

Federating proprietary solutions almost sounds like something that is impossible to do. But Mentor and Cadence will share their emerging work on a standard to federate the different sources of verification results that can come from private sources, with unencumbered access for the consumer. The Debug Data API standard will offer consumers the benefits of VCD interoperability, data portability and openness while preserving the benefits of private innovation for tool and solution producers. It will not impose data format translations from one format to another as a means to promote data portability. It will not require the means by which one supplier or another stores verification results to be exposed. It will offer the best of both worlds to producers and consumers. I guess in some cases you can have your cake and eat it too! There are more details to share, and the best place to start is to meet us at DAC.

Mentor and Cadence Share DDA Details at DAC

  • Location: Verification Academy Booth (#2408)
  • Date: Tuesday, June 9th
  • Time: 5:00pm PT
    More Information >

We will discuss the details of DDA and show a proof-of-concept demonstration that will highlight each company's simulator and results viewer in action. Since there is no other session following this one at the Verification Academy booth, we will also stay around afterwards to discuss next steps with everyone present.

You can read more about this from my colleague and competitor, Cadence's Adam Sherer, on his blog at the Cadence site here. He brings his own perspective to it.

See you at DAC!


9 October, 2014

DVCon India, held in September 2014 in Bangalore, built on the Indian SystemC User Group meeting events and added a Design & Verification track to the system-level design (ESL) track that has been popular for many years. The main stage played host to the keynote presentations, opening ceremonies and best paper and poster awards.

Several DVCon India keynote presentations, which I will cover in more depth later, touched on the emerging use of virtual platforms in system design and the growing impact India has on design verification. In particular, Mentor's CEO, Wally Rhines, contrasted Wilson Research survey data on design verification from India with data from the rest of the world. Strong adoption of SystemVerilog and its popular methodology, the Universal Verification Methodology (UVM), was clear from the survey results Wally shared.

But even beyond SystemVerilog and UVM, the discussion of what could come next anchored the first day of DVCon India on Accellera's exploration of "portable stimulus." Accellera has a group exploring whether the industry is ready to start a standards project on this concept. And on the first day, when DVCon India attendees were offered an opportunity to learn about it, the multi-company (Mentor Graphics, Breker & CVC) tutorial on the topic was standing room only.

DVCon Europe – The Stage is Set!

A tutorial slot at DVCon Europe will be devoted to the same topic that proved popular at DVCon India. DVCon Europe attendees will find it covered in Tutorial T9, "Creating Portable Tests with a Graph-Based Test Specification." Technical representatives from Mentor Graphics and Breker will cover aspects of portable stimulus and offer examples of how it can work, and early application of the technology will be covered by a representative from IBM. To cover the topic appropriately, we have modified the presenter list from the official printed program; full details are available online. The presenters will be, in this order:

  • Holger Horbach, IBM, Germany
  • Frederic Krampac, Breker, France
  • Staffan Berg, Mentor Graphics, Sweden

Please join us for this tutorial and the ensuing conversation and discussion. Verification productivity is a pressing issue, and our ability to better create and control stimulus is a step toward addressing the verification challenges we all face.

One last note: the concept of "portable stimulus" is language agnostic, so no matter which language you use for design and verification, the intention is that this technology will be able to help. The tutorial will help you understand how a graph-based approach enables the highest degree of verification re-use, from IP block to sub-system to full-system level verification. You will see how it supports verification in SystemVerilog, Verilog, VHDL, C/C++, assembly, and even other non-traditional base languages. And it can also be extended from simulation to emulation to FPGA prototyping, and even silicon validation.

I look forward to seeing you at DVCon Europe in Munich!  And if you have not yet registered, please do so to secure your seat.


7 May, 2014

My Feb. 4 post introduced Mentor Graphics' three-step FPGA verification process, intended to help design teams get out of the reprogrammable lab more effectively. Since then, I've engaged FPGA vendors, design managers and engineers to explain the process, paying special attention to the merits and technical details of injecting automation into any FPGA verification environment, the hallmark of Mentor's process. The feedback from these conversations helped me develop a series of technical webinars, now available for free and on demand. Check them out and let us know what you think in the comments below. My hope is that the webinars might serve as a starting point for your own conversations on verification of FPGAs, demand for which seems to continue to grow as process nodes shrink.

Injecting Automation into Verification – FPGA Market Trends

Injecting Automation into Verification – Code Coverage

Injecting Automation into Verification – Assertions

Injecting Automation into Verification – Improved Throughput


4 February, 2014

Marketing teams at FPGA vendors have been busy as the silicon nanometer geometry race escalates. Altera is "delivering the unimaginable" while Xilinx is offering "all programmable SoCs" to design centers. It's clear that the SoC has become more accessible to a broader market today and that FPGA vendors have staked out a solid technology roadmap for the near future. But do marketing messages surrounding the geometry race affect the day-to-day life of engineers, and if so, how – especially when it comes to verification?
An excellent whitepaper from Altera, “The Breakthrough Advantage for FPGAs with Tri-Gate Technology,” covers Altera’s Stratix 10 FPGAs and SoCs. The paper describes verification challenges in this new expanded market this way: “Although current generation FPGAs require a rigorous simulation verification methodology rivaling ASICs, the additional lab testing and ability to reprogram FPGAs save substantial manpower investment. The overall cost of ownership must be considered when comparing an FPGA whose component price is higher than an ASIC of similar complexity.” I believe you can use this statement to engage your management in a discussion about better verification processes.

Xilinx also has excellent published technical resources. Its recent UltraScale backgrounder describes how they are solving the challenges in implementing a design with their reprogrammable silicon. Clearly Xilinx has made an impressive investment to make it easier to implement a design with its FPGA UltraScale products. Improvements include ASIC-like clocking and annealing dataflow bottlenecks without compromising performance. Xilinx also describes improvements when using its Vivado design suite, particularly when it comes to in-lab design bring up.

For other FPGA insights, it's also worth checking out Electronics Engineering Journal's recent article "Proliferating Programmability in 2014," which claims that the long-term future of FPGAs lies in tool flows even though, as Kevin Morris sees it, EDA seems to have abandoned the market. (Kevin, I'm here to tell you you're wrong.)

Do you think it's inevitable that your FPGA team will first struggle to make it across the verification finish line before adopting a more process-oriented verification flow like the ASIC market demands? It's not. I base this conclusion on the many conversations I've had with FPGA designers, their managers, sales engineers and many other talented people in this market over the years. Yes, there are significant challenges in FPGA design, but not all of them are technology related. With some emotion, one engineer remarked that debugging the same type of issue over and over in the hardware lab and expecting a different outcome was insane. (He's right.) Others say they need specific ROI information for their management to even accept their need for change. Still others state that had they only known the solutions I talked about in my seminar a year ago, they would not have spent months and months bringing up their design in the lab.

With my peers here at Mentor Graphics, I have developed a three-step verification flow that includes coverage, assertions and improved throughput. I'll write about this flow and related issues in the weeks ahead here on this blog. The flow is built on fundamental verification technologies that benefit the broad FPGA market. The goal, in developing the technology and writing about it here, has been to provide practical solutions and help more FPGA teams cross the verification gap. A small illustration of the assertions step follows.
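
To make the "assertions" step a bit more concrete, here is a generic illustration – not Mentor's specific methodology, and with hypothetical entity and signal names – of the kind of simple VHDL checker that catches in simulation the sort of bug engineers otherwise chase over and over in the lab:

```vhdl
-- Generic illustration only (hypothetical names; not Mentor's specific flow):
-- a small checker that reports an illegal arbiter condition during simulation
-- instead of leaving it to be chased repeatedly in the hardware lab.
library ieee;
use ieee.std_logic_1164.all;

entity grant_checker is
  port (
    clk              : in std_logic;
    grant_a, grant_b : in std_logic   -- one-hot grants from a hypothetical arbiter
  );
end entity;

architecture check of grant_checker is
begin
  process (clk)
  begin
    if rising_edge(clk) then
      -- The two grants must never be active in the same cycle.
      assert not (grant_a = '1' and grant_b = '1')
        report "Arbiter error: grant_a and grant_b asserted in the same cycle"
        severity error;
    end if;
  end process;
end architecture;
```

Bound alongside the RTL, a checker like this fails in the very cycle the illegal condition occurs, which is exactly the visibility that is so hard to come by in the lab.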

In the meantime, what are your stories? Have you been able to influence your management into adopting advanced technology to aid lab bring-up? Is your management biased towards lower cost and faster implementation (at the expense of verification)? Let me know in the comments or, if you prefer, by e-mail:


5 August, 2013

Language and Library Trends

This blog is a continuation of a series of blogs that present the highlights from the 2012 Wilson Research Group Functional Verification Study (for a background on the study, click here).

In my previous blog (Part 7 click here), I focused on some of the 2012 Wilson Research Group findings related to testbench characteristics and simulation strategies. In this blog, I present design and verification language trends, as identified by the Wilson Research Group study.

You might note that for some of the language and library data I present, the percentage sums to more than one hundred percent. The reason for this is that some participants’ projects use multiple languages.

RTL Design Languages

Let’s begin by examining the languages used for RTL design. Figure 1 shows the trends in terms of languages used for design, by comparing the 2007 Far West Research study (in gray), the 2010 Wilson Research Group study (in blue), the 2012 Wilson Research Group study (in green), as well as the projected design language adoption trends within the next twelve months (in purple) as identified by the study participants. Note that the design language adoption is declining for most of the languages with the exception of SystemVerilog whose adoption continues to increase.

Also, it’s important to note that this study focused on languages used for RTL design. We have conducted a few informal studies related to languages used for architectural modeling—and it’s not too big of a surprise that we see increased adoption of C/C++ and SystemC in that space. However, since those studies have (thus far) been informal and not as rigorously executed as the Wilson Research Group study, I have decided to withhold that data until a more formal blind study can be executed related to architectural modeling and virtual prototyping.

Figure 1. Trends in languages used for Non-FPGA design

Let’s now look at the languages used specifically for FPGA RTL design. Figure 2 shows the trends in terms of languages used for FPGA design, by comparing the 2012 Wilson Research Group study (in red) with the projected design language adoption trends within the next twelve months (in purple).

Figure 2. Languages used for FPGA design

It’s not too big of a surprise that VHDL is the predominant language used for FPGA RTL design, although we are starting to see increased interest in SystemVerilog.

Verification Languages

Next, let’s look at the languages used to verify Non-FPGA designs (that is, languages used to create simulation testbenches). Figure 3 shows the trends in terms of languages used to create simulation testbenches by comparing the 2007 Far West Research study (in gray), the 2010 Wilson Research Group study (in blue), and the 2012 Wilson Research Group study (in green).

Figure 3. Trends in languages used in verification to create Non-FPGA simulation testbenches

The study revealed that verification language adoption is declining for most of the languages with the exception of SystemVerilog whose adoption is increasing. In fact, SystemVerilog adoption increased by 8.3 percent between 2010 and 2012.

Figure 4 provides a different analysis of the data by partitioning the projects by design size, and then calculating the adoption of SystemVerilog for creating testbenches by size. The design size partitions are represented as: less than 5M gates, 5M to 20M gates, and greater than 20M gates. Obviously, we find that the larger the design size, the greater the adoption of SystemVerilog for creating testbenches. Yet, probably the most interesting observation we can make from examining Figure 4 is related to smaller designs that are less than 5M gates. Here we see that 58.8 percent of the industry has adopted SystemVerilog for verification. In other words, it is safe to say that SystemVerilog for verification has become mainstream today and not just limited to early adopters or leading-edge design projects.

Figure 4. SystemVerilog (for verification) adoption by design size

Let's now look at the languages used to create FPGA simulation testbenches. Figure 5 shows the trends in terms of verification languages used for FPGA testbenches, by comparing the 2012 Wilson Research Group study (in red) with the projected language adoption trends within the next twelve months (in purple).

Figure 5. Trends in languages used in verification to create FPGA simulation testbenches

In my next blog (click here), I’ll continue the discussion on design and verification language trends as revealed by the 2012 Wilson Research Group Functional Verification Study.


23 April, 2013

This is the first in a series of blogs that presents the results from the 2012 Wilson Research Group Functional Verification Study.

Study Overview

In 2002 and 2004, Ron Collett International, Inc. conducted its well-known ASIC/IC functional verification studies, which provided invaluable insight into the state of the electronics industry and its trends in design and verification. However, after the 2004 study, no further industry studies were conducted, which left a void in identifying industry trends.

To address this void, Mentor Graphics commissioned Far West Research to conduct an industry study on functional verification in the fall of 2007. Then in the fall of 2010, Mentor commissioned Wilson Research Group to conduct another functional verification study. Both of these studies were conducted as blind studies to avoid influencing the results. This means that the survey participants did not know that the study was commissioned by Mentor Graphics. In addition, to support trend analysis on the data, both studies followed the same format and questions (when possible) as the original 2002 and 2004 Collett studies.

In the fall of 2012, Mentor Graphics commissioned Wilson Research Group again to conduct a new functional verification study. This study was also a blind study and follows the same format as the Collett, Far West Research, and previous Wilson Research Group studies. The 2012 Wilson Research Group study is one of the largest functional verification studies ever conducted. The overall confidence level of the study was calculated to be 95% with a margin of error of 4.05%.
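
As a back-of-the-envelope sketch of where a figure like that comes from (my own arithmetic, assuming the usual normal approximation and the worst-case proportion p = 0.5; the study's actual participant count is not restated here), the margin of error for a sampled proportion is:

\[
\mathrm{MOE} = z\sqrt{\frac{p(1-p)}{n}}
\quad\Longrightarrow\quad
n = \frac{z^{2}\,p(1-p)}{\mathrm{MOE}^{2}}
  = \frac{1.96^{2}\times 0.25}{0.0405^{2}} \approx 585,
\]

so a 4.05 percent margin of error at 95 percent confidence corresponds, under these assumptions, to an effective sample of roughly 585 respondents.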

Unlike the previous Collett and Far West Research studies that were conducted only in North America, both the 2010 and 2012 Wilson Research Group studies were worldwide studies. The regions targeted were:

  • North America: Canada, United States
  • Europe/Israel: Finland, France, Germany, Israel, Italy, Sweden, UK
  • Asia (minus India): China, Korea, Japan, Taiwan
  • India

The survey results are compiled both globally and regionally for analysis.

Another difference between the Wilson Research Group and previous industry studies is that both of the Wilson Research Group studies also included FPGA projects. Hence for the first time, we are able to present some emerging trends in the FPGA functional verification space.

Figure 1 shows the percentage makeup of survey participants by their job description. The red bars represent the FPGA participants while the green bars represent the non-FPGA (i.e., IC/ASIC) participants.


Figure 1: Survey participants' job title description

Figure 2 shows the percentage makeup of survey participants by company type. Again, the red bars represent the FPGA participants while the green bars represent the non-FPGA (i.e., IC/ASIC) participants.

Figure 2: Survey participants' company description

In a future set of blogs, over the course of the next few months, I plan to present the highlights from the 2012 Wilson Research Group study along with my analysis, comments, and obviously, opinions. A few interesting observations emerged from the study, which include:

  1. FPGA projects are beginning to adopt advanced verification techniques due to increased design complexity.
  2. The effort spent on verification is increasing.
  3. The industry is converging on common processes driven by maturing industry standards.

A few final comments concerning the 2012 Wilson Research Group study. As I mentioned, the study was based on the original 2002 and 2004 Collett studies. To ensure consistency in terms of proper interpretation (and to avoid potential error related to misinterpretation of the questions), we have avoided changing or modifying the questions over the years—with the exception of questions that relate to shrinking geometry sizes and gate counts. One other exception relates to introducing a few new questions on verification techniques that were not a major concern ten years ago (such as low-power functional verification). Ensuring consistency in the line of questioning enables us to have high confidence in the trends that emerge over the years.

Also, the method by which the study pool was created follows the same process as the original Collett studies. It is important to note that the data presented in this series of blogs does not represent trends related to silicon volume (that is, a few projects could dominate in terms of the volume of manufactured silicon and not represent the broader industry). The data in this series of blogs represents trends related to the study pool—which is a fair proxy for active design projects.

My next blog presents current design trends that were identified by the survey. This will be followed by a set of blogs focused on the functional verification results.

Also, to learn more about the 2012 Wilson Research Group study, view my pre-recorded Functional Verification Study web seminar, which is located on the Verification Academy website.

Quick links to the 2012 Wilson Research Group Study results (so far…)


24 January, 2013

VHDL-2008 Explained Via 7 Course Modules

For some time now, a dedicated group of engineers has been defining and standardizing an important update to the VHDL standard. Also known as IEEE Std. 1076™-2008, this update to VHDL took an interesting path to get to where it is today. The VHDL standards team started the standards development work in the IEEE but sought additional input and standards project funding from industry. Accellera provided a good venue in which to get industry input and feedback for an update to the VHDL standard, along with funding. Once industry input was taken into account, the proposed update to VHDL, approved by Accellera, was returned to the IEEE for official standards ratification and ongoing maintenance. Jim Lewis, the IEEE VHDL Working Group Chair, points out that this is the greatest update to VHDL since VHDL-93, and I agree. Jim is also the subject matter expert for the VHDL-2008 course modules on Verification Academy mentioned in this blog.

Verification Academy Modules

In less than an hour and a half – over seven course modules – Jim will lay out the additions and changes in VHDL-2008 that simplify the language and extend it to address more of your pressing design and verification challenges, with the addition of reusable data structures, simplified RTL coding and the inclusion of fixed- and floating-point math packages.
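
To give a flavor of those changes, here is a minimal, hypothetical sketch (the entity and signal names are mine, not taken from the course modules) touching a few of the VHDL-2008 conveniences the modules cover: the process(all) sensitivity list, std_logic used directly in conditions, a unary reduction operator, and the standard fixed-point package.

```vhdl
-- Hypothetical sketch (entity and signal names are illustrative, not from the
-- course modules) showing a few VHDL-2008 conveniences: process(all), std_logic
-- used directly in conditions, a unary reduction operator, and ieee.fixed_pkg.
library ieee;
use ieee.std_logic_1164.all;
use ieee.fixed_pkg.all;           -- fixed-point types and arithmetic, standard in VHDL-2008

entity flag_adder is
  port (
    clk, rst, en : in  std_logic;
    din          : in  ufixed(3 downto -4);           -- 8-bit unsigned fixed point
    flags        : in  std_logic_vector(7 downto 0);
    dout         : out ufixed(4 downto -4)
  );
end entity;

architecture rtl of flag_adder is
  signal any_flag : std_logic;
begin
  comb : process (all) is                 -- VHDL-2008: no hand-maintained sensitivity list
  begin
    any_flag <= or flags;                 -- unary 'or' reduces the vector to a single bit
  end process;

  reg : process (clk) is
  begin
    if rising_edge(clk) then
      if rst then                         -- VHDL-2008: std_logic used directly as a condition
        dout <= (others => '0');
      elsif en and any_flag then
        dout <= din + to_ufixed(0.5, 0, -4);   -- fixed-point add; result range matches dout
      end if;
    end if;
  end process;
end architecture;
```

Each of these would need workarounds or non-standard packages before 2008; in VHDL-2008 they are part of the language and its IEEE libraries, which is exactly the ground the course modules cover.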

As part of my role in international standardization as co-convenor of IEC TC 93 WG2 (now known as IEC TC 91 WG 13), and in keeping with the IEEE/IEC dual-logo agreement, I helped complete the dual-logo process for this version of VHDL in 2011. VHDL-2008 is also now known as IEC 61691-1-1-2011 – Behavioural languages – Part 1-1: VHDL Language Reference Manual. I think we can all agree that name is a bit much, so we can simply call it VHDL-2008.

With this round of standardization complete, the VHDL-2008 course modules arrive just as complete support for VHDL-2008 emerges here at Mentor Graphics in our ModelSim and Questa products.

I encourage and invite VHDL users to get acquainted with VHDL-2008 via the seven course modules on Verification Academy.  Verification Academy “Full Access” membership is required.  And it is easy to sign up (certain restrictions apply).  For a quick look at what the courses offer, the introduction page found here will show you more details about the following modules.

“VHDL-2008 Why It Matters” Modules
  • VHDL 2008 Overview
  • RTL Enhancement
  • Package Type Enhancements
  • Floating Point Package
  • Testbench Enhancements
  • Operator Enhancements
  • Fixed Point Package

Additional Reference Material

There is additional reference material you may wish to have to get the most out of VHDL-2008.  Here is my short list:

  • IEEE Std. 1076-2008 Language Reference Manual (Click here)
  • VHDL-2008: Just the New Stuff (Click here)
  • The Designer’s Guide to VHDL, Third Edition (Click here)


28 August, 2012

Five Leading Global Organizations Affirm “The Modern Paradigm for Standards”

The EDA industry has seen changes to the international standards paradigm over the past few decades. When industry, with the help of government support, launched VHDL, it transferred ongoing maintenance and enhancement to the IEEE once the first version was complete. In addition to anchoring the standard at the IEEE, collaboration with the IEC for international standardization and recognition via the one-country, one-vote process set the stage for international approval of VHDL.

In the early days of Verilog, I encouraged similar support for that IEEE standard. But that support was not immediate and, to some, may have failed to track the pace of adoption by industry. Indeed, when Accellera developed SystemVerilog, later to become an IEEE standard and an IEC standard, what was missing was the close link between a global industrial community and the international setting in which the standard was developed and deployed.

In the case of SystemVerilog, global markets drove the international deployment of the standard without respect to its formal status. Indeed, on what was called the "birthday" of SystemVerilog in Japan, the day it was approved as an official IEEE standard, the Japanese National Committee on standards hosted an open celebration that I was invited to attend. There was no waiting on their part for the formal status. The interdependencies of global design, global commerce and global partnerships have driven all of us to adapt the standards development process for EDA.

On the eve of OpenStand’s launch, it is good to see the IEEE, Internet Society, W3C, IAB and IETF come together to put in writing what is the emerging practice for EDA.

You can learn more about the supporters of OpenStand, their guiding principles and how you can give your input, comments and feedback by visiting their website. And if you agree, you can even "stand" with them; with me; with us.

But in short, OpenStand promotes a standards development model that demands:

  • cooperation among standards organizations;
  • adherence to due process, broad consensus, transparency, balance and openness in standards development;
  • commitment to technical merit, interoperability, competition, innovation and benefit to humanity;
  • availability of standards to all; and
  • voluntary adoption.
