Archive for December, 2009

21 December, 2009

I’m excited. I’ve had the pleasure of knowing Cliff Cummings for many years, and I was honored a couple of years ago to have him write the foreword in a book that I published on assertions. Now, we have joined forces to do a set of seminars titled: “Assertion-Based Verification for FPGA and IC Design.”  The first seminar will take place on January 19, 2010 in Santa Clara, CA, and you can register online by clicking here.

This six-hour seminar is organized into four sessions. My session is titled: “Industry Perspective and Opportunities in Assertion-Based Verification,” and I intend to provide a survey of today’s ABV landscape, ranging from various industry myths to realities. In fact, I specifically plan on addressing the issues raised in my recent blog titled “Evolution is a tinkerer.” In addition, I’ll talk about what characterizes an organization that has successfully adopted ABV, and then contrast that with organizations that are struggling or have failed in their attempts to integrate ABV into their flow.

The second and third sessions will focus on “Advanced Debugging with Assertions” and “Effective Coverage Using Assertions.”

We will conclude the seminar with an extended session covering “Basic SystemVerilog Assertions Training” by our SystemVerilog guru Cliff Cummings! His presentation details practical SystemVerilog assertion tricks that you can apply today to your own work, as well as methodological recommendations to improve efficiency when adopting ABV.

I hope everyone has a peaceful, happy holiday, and I look forward to meeting you on January 19 at the ABV seminar in Santa Clara, CA!


18 December, 2009

I see that Synopsys has finally released VMM1.2. Congratulations, guys. There will be plenty of opportunity over the coming weeks to discuss the relative merits of OVM vs. the OVM features that have been “borrowed” and jammed into this new version of VMM (factory, phasing, hierarchy…), but I’d like to talk a bit in this post about Synopsys’ unique approach to version numbering.

Let me just say that it’s patently obvious that Synopsys chose to hide the fact that this is a dramatically different VMM by calling it version 1.2 instead of 2.0, which is what they should have called it. Even though the accepted practice in our industry is to increase the major version number for a change of this magnitude, Synopsys is trying to convince everyone that it’s an incremental change to the methodology.

The fact is that the biggest advantage VMM had over OVM was simply that it had been around longer. OVM has the advantages of being more full-featured, flexible, modular and reusable. Once VMM users understand the extent to which they’re going to have to rewrite their existing VMM code to take advantage of the new 2.0 features, the continuity argument will be gone and they may as well take a look at OVM too. And new users will now choose between a stable, proven OVM and a brand-spankin’-new VMM. The tables have turned, and Synopsys doesn’t want you to know this.

In fact, we’ve already had one customer try to compile their existing VMM1.1 code against the VMM2.0 (I mean 1.2) library without success – not a good sign for backward compatibility. A quick look at the first three lines of the sv/std_lib/vmm.sv file shows why:

`ifdef VMM_11
`include "vmm11/vmm.sv"
`else

In other words, if you want to use your existing VMM1.1 code, you +define+VMM_11 to get the old library code, otherwise you get a completely different library! Tell me that’s not a major release!
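
To make the mechanism concrete, here is a sketch of the full guard this implies. Only the three lines quoted above come from the file; the `else branch, the `endif and the compile line are my reconstruction and a hypothetical invocation, so take the details as illustrative:

`ifdef VMM_11
`include "vmm11/vmm.sv"  // the old VMM1.1 library, preserved verbatim
`else
// ... an entirely different library: the new VMM1.2 code ...
`endif

// Hypothetical compile line to keep legacy VMM1.1 code working
// (exact switches vary by simulator):
//   vlog +define+VMM_11 +incdir+<vmm_install>/sv my_vmm11_testbench.sv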

Perhaps an alternate metric would be helpful. I have, sitting on my desk, a copy of the Verification Methodology Manual for SystemVerilog. It is 503 pages long. The VMM1.2 User Guide, which incorporates the original book, along with all the new 2.0 (rats, did it again) features, weighs in at a whopping 1408 pages! That’s nearly a 3x increase in material.

By contrast, the OVM User Guide is only 158 pages, so even when combined with the OVM Reference Guide (384 pages), you’ve still got nearly 2.5x more stuff to go through with VMM. We could even throw in The OVM Cookbook (235 pages) and OVM is still half the size of VMM2.0.

It will be up to you to decide whether to take a chance on the new VMM2.0 or go with the more stable OVM. By the way, don’t be surprised to see an OVM2.1 rather soon that adds some new features to address user requests we’ve gotten. These new enhancements are completely backward-compatible with existing code, unlike VMM2.0.

Come to think of it, the only justification for calling VMM1.2 a minor release is that it doesn’t really advance the state of the art at all. Since all they’re doing is adding functionality to VMM that OVM has had for over a year, I guess it’s OK to call it VMM1.2 after all.

As they say, “Imitation is the sincerest form of flattery.”

,

18 December, 2009

Just in time for the holidays!  :)

IEEE Std. 1800™-2009, aka SystemVerilog 2009, is ready for purchase and download from the IEEE.  The standard was developed by the SystemVerilog Working Group and recently approved by the IEEE.  It is an entity project of the IEEE, jointly sponsored by the Corporate Advisory Group (CAG) and the Design Automation Standards Committee (DASC).  The working group members represented Accellera, Sun Microsystems Inc., Mentor Graphics Corporation, Cadence Design Systems, Intel Corporation and Synopsys, along with numerous other volunteers from around the world.

IEEE Std. 1800-2009 LRM

The publication of the standard culminates the work of representatives from the companies above along with numerous other interested parties and volunteers.  Thank you to all who made this happen!

This standard represents a merger of two previous standards: the IEEE Std 1364-2005 Verilog Hardware Description Language (HDL) and the IEEE Std 1800-2005 SystemVerilog Unified Hardware Design, Specification, and Verification Language.

In these previous standards, Verilog was the base language and defined a completely self-contained standard. SystemVerilog defined a number of significant extensions to Verilog, but IEEE Std 1800-2005 was not a self-contained standard; it referred to, and relied on, IEEE Std 1364-2005. These two standards were designed to be used as one language.

Merging the base Verilog language and the SystemVerilog extensions into a single standard enables users to have all information regarding syntax and semantics in a single document.  The standard serves as a complete specification of the SystemVerilog language. The standard contains the following:

  • The formal syntax and semantics of all SystemVerilog constructs
  • Simulation system tasks and system functions, such as text output display commands
  • Compiler directives, such as text substitution macros and simulation time scaling
  • The Programming Language Interface (PLI) mechanism
  • The formal syntax and semantics of the SystemVerilog Verification Procedural Interface (VPI)
  • An Application Programming Interface (API) for coverage access not included in VPI
  • Direct programming interface (DPI) for interoperation with the C programming language
  • VPI, API, and DPI header files
  • Concurrent assertion formal semantics
  • The formal syntax and semantics of standard delay format (SDF) constructs
  • Informative usage examples
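
To give a tiny, illustrative taste of how these pieces now live in one document, here is a minimal sketch (my own example, not taken from the standard) that combines a concurrent assertion with a DPI-C import. The names, and the C function behind the import, are hypothetical:

// Hypothetical DPI-C import; the C implementation is assumed to exist elsewhere
import "DPI-C" function int c_checksum(input int data);

module req_ack_checker(input logic clk, req, ack);
  // Concurrent assertion: every req must be answered by ack within 1 to 3 cycles
  property p_req_ack;
    @(posedge clk) req |-> ##[1:3] ack;
  endproperty

  a_req_ack: assert property (p_req_ack)
    else $error("ack did not follow req within 3 cycles");
endmodule
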
Where to Download & Purchase

For users who have access to IEEE Xplore, free downloads are available here.

Users who wish to purchase a single copy can visit Shop IEEE (here) and search for “1800”.  The IEEE member price is $260 and the non-member price is $325.

, , , , , , , , , , ,

17 December, 2009

Hi Everyone,

Just wanted to let you know that the latest issue of Verification Horizons is now out. You can find it here.  Aside from learning about the extent (or lack thereof) of my carpentry skills (read the Editor’s Letter to see what I’m talking about), you’ll also see these interesting articles:

Page 6… Evolving Your Organization’s ABV Capabilities
by Harry Foster, Chief Verification Scientist, DVT Mentor Graphics

Page 10… Advanced Static Verification is Indispensable
by Ping Yeung Ph.D., DVT Mentor Graphics

Page 14… Evolving the Coverage-Driven Verification Flow
by Matthew Ballance, Technical Marketing, SLE Division, Mentor Graphics

Page 17… Verification of an Ethernet PHY DUT Using Questa MVC
by Pankaj Goel, Mentor Graphics

Partners’ Corner:
Page 22… Breaking Your Own Code; A Design Engineer’s Perspective on Verification Using Questa and OVM
by Leeja Mathew, Mahasweta Das, Reshma Shetty; Silicon Interfaces

Page 26… SystemVerilog FrameWorks™ Scoreboard: An Open Source Implementation Using OVM
by Dr. Ambar Sarkar, Chief Verification Technologist, Paradigm Works Inc.

Page 31… SEmulation: Use Your Emulation Board as a Hardware Accelerator for ModelSim SE.
by Andreas Schwarztrauber, Gleichmann & Co. Electronics GmbH

Page 35… Coding Concise SystemVerilog Assertions.
by Clifford E. Cummings, Sunburst Design, Inc.

Page 41… Methodology for Board Level Functional Simulation and Hardware/Software Co-Verification Using Seamless.
by John Gryba, Senior Hardware Design Engineer, Alcatel-Lucent

Go check it out and let us know what you think.

-Tom

14 December, 2009

I was recently quoted in an EDA DesignLine blog as saying that “it is a myth that ABV is a mainstream technology.” Actually, the original quote comes from an extended abstract I wrote for an invited tutorial at Computer-Aided Verification (CAV) in 2008 titled Assertion-Based Verification: Industry Myths to Realities. My claim is based on the Farwest Research 2007 study (commissioned and sponsored by Mentor Graphics), which found that approximately 37 percent of the industry had adopted simulation-based ABV techniques, and 19 percent had adopted formal ABV techniques. Now, those of you who know me know that I am an optimist, so to me the statistics from these industry studies reveal a wonderful opportunity for design projects to improve themselves. ;-) However, the problem of adopting advanced functional verification techniques is not limited to ABV.  For example, the study revealed that only 48 percent of the industry performs code coverage. Let me repeat that: I said code coverage, not even something as exotic as functional coverage (which was observed at about 40 percent of the industry)! Furthermore, only about 41 percent of the industry has adopted constrained-random verification.

[Chart: Advanced Functional Verification Adoption]

Now, we could argue about which is the best approach to measuring coverage or achieving functional closure, but that is not the point. The question is, how do we as an industry evolve our verification capabilities beyond 1990 best practices?

In my mind, one of the first steps in the evolutionary process is to define a model for assessing an organization’s existing verification capabilities. For a number of years I’ve studied variants of the Capability Maturity Model Integration (that is, CMM and CMMI) as a possible tool for assessment. After numerous discussions with many industry thought leaders and experts, I’ve concluded that the CMM is really not an ideal model for assessing hardware organizations. Nonetheless, there is certainly a lot we can learn from observing the CMM applied on actual software projects.

For those of you unfamiliar with the CMM, its origins date back to the early 1980s. During this period, the United States Department of Defense established the Software Engineering Institute at Carnegie Mellon University in response to a perceived software development crisis related to escalating development costs and quality problems. One of the key contributions resulting from this effort was the published work titled The Capability Maturity Model: Guidelines for Improving the Software Process. The CMM is a framework for assessing effective software process, which provides an evolutionary path for improving an organization’s processes from ad hoc, immature processes to developed, mature, disciplined ones.

Fundamental to maturing an organization’s process capabilities is an investment in developing skills within the organization. To assist in this effort, we have launched an ambitious project to evolve an organization’s advanced functional verification skills through the Verification Academy. In fact, we have an introductory module titled Evolving Capabilities that provides my first attempt at a simple assessment model. I anticipate that this model will itself evolve over time as I receive valuable feedback on refining and improving it. Nonetheless, the simple Evolving Capabilities model as it exists today provides a wonderful framework for organizing multiple modules focused on evolving an organization’s advanced functional verification capabilities.

I realize that evolving technical skills is obviously only part of the solution to successfully advancing the industry’s functional verification capabilities. Yet, education is an important step. I’d be interested in hearing your thoughts on the subject. Why do you think the industry as a whole has been slow in its adoption of advanced functional verification techniques? What can be done to improve this situation?


14 December, 2009

One element of my IEEE Standards Association (SA) volunteer activities is to represent the SA to the IEEE New Initiatives Committee (NIC).  In December, the group gets together face-to-face to review projects underway and to approve new projects for the next year.  The new initiative program is designed to identify potential new products, programs, and services that will provide significant benefit to IEEE members, customers, the technical community, and the public, and that promise to have a lasting impact on the IEEE.

Initiatives are of strategic importance and must demonstrate a clear benefit to IEEE. The process allows submission of a proposal at any time during the year, not just in December.

The IEEE SA worked with the NIC several years back to help it study and launch a Patent Pools project, expanding possible IEEE standards business to facilitate faster adoption of standards where patented technology is needed to implement a standard and a patent pool could help.

While the NIC has supported programs that facilitate standards use, new initiatives run the gamut of global IEEE interests.  One project with several years of support that has had an impact on humanity is the IEEE Humanitarian Technology Network (HTN). A map of global projects is maintained that shows everything going on around the world.

Any IEEE member, volunteer or IEEE organizational unit can submit a proposal.  Do you have a new initiative idea for the IEEE?  If so, you might want to submit your idea to the NIC for consideration.

Overview of the IEEE New Initiative Program

IEEE new initiative proposals are submitted using one of two submission processes.


10 December, 2009

Back to the Future; Unleash the Past

No, I’m not talking about the Michael J. Fox and Christopher Lloyd movie. Nor am I talking about unleashing zombies from Zombieland to make the present the dead past.  I’ve given more thought to the DTC after reading some questions Brian Bailey asked in his post “I don’t understand this new IEEE EDA User group.”  That post led me to dig deeper, ask questions and issue a Zombie Alert.  (OK, the zombie alert is just humor.)

It appears that after more than a decade, the Design Technology Council (DTC) has announced it has abandoned Si2 in favor of IEEE CEDA (Council on EDA) and changed its name to the Design Technology Committee.  It keeps the DTC acronym, presumably in a move to save on a redo of letterhead. (OK, I can’t resist a bit of sarcasm.)

The announcement made me think we are either going back to the future or unleashing the dead past.  Watch out – the zombies just might just be on the loose.

Given that CEDA is a collection of IEEE Societies that are replete with immense technical talent and brainpower to help address next generation design issues, the DTC may well be in a better home.  They intend to interact with other standards-setting organizations as well.  For these reasons, I think this is a great move!

Yet, when I read the whole of their press release again, I’m left with a skepticism that Brian’s words on the topic only fed.

1.   Captive EDA represents themselves as the users, and Commercial EDA as not?
Are they really users?  The corporate affiliations of the DTC members are impressive! Their corporations are associated with some of the most advanced designs being done today. If anyone knows hard problems, they do.  If anyone wants solutions, EDA should listen to them.

But when I read the titles of the current members I see a majority of them have CAD or EDA in them.  Are they actually designers or Captive EDA representatives?  Is Captive EDA any different from a Commercial EDA company?  (I came from a Captive EDA group to Mentor Graphics, so I have some history here.  Maybe that’s a topic of a future blog.)

Is this action to form inside CEDA being taken due to a lack of technology response by Commercial EDA, or to represent Captive EDA self-interests?  Can Captive EDA be seen as yet another middleman in a vendor/supplier relationship?  Does that promote business efficiency or is it suboptimal?

Can a closed group like this, one that segregates Captive and Commercial EDA, be seen as restraining and hampering trade?  Or could it be that Captive EDA seeks to be a focal point for Commercial EDA business relationships to rationalize its existence?  I do note that Gary Smith has presented his findings on a resurgence of Captive EDA. One can only conclude the resurgence is born of necessity and of the failure of Commercial EDA to address some pressing technical design challenges.

2.   Something “NEW” was announced; but it is the same “OLD” thing
Can a group that announces it is new by concealing its past be trusted?  I’ve seen some blog comments that support a conclusion of confusion.  Questions are being asked: Is the DTC still in Si2?  Does the industry need another DTC?

The truth is this group has been around for a long time, not “newly formed” as they would suggest to all. The simple fact is the DTC moved.  Concealing this information makes no sense to me.  It just heightens my suspicion.

3.   It is a CLOSED group
The press release says they “consists of leaders exclusively from semiconductor and systems companies who use EDA tools.”  And they seek to expand as they say “nominations for new members are actively being solicited.”

Don’t get me wrong. I think it is great that the DTC explores ways to rejuvenate itself and drive greater relevance. It is good they have made a call for nominations for additional members. The DTC needs to expand its ranks, as it is cloistered in its current configuration and out of touch with current design practice in the world.

While no one group may be able to address all problems and solve all issues at once, a group that lacks representatives from IP suppliers, who play such an important role in this age of design reuse, and that does not represent the swelling ranks of programmable logic designers, is not relevant to the majority of design practice today.

Since EDA vendors are not invited to participate, may I offer this thought through this venue: Don’t stay parochial. Don’t go back to the past. Recognize the future encompasses more than who you have been or are today. The challenges of a majority of designers globally should be part of your focus as you encourage interoperable design flows.  To be the voice, you must be the body.

4.   The DTC will communicate, but they will make it hard for anyone to hear what they say.
How can a group that is to be the voice of users be heard when they are closed?

If they want to be the user voice, what have they said over the past decade?  The voice of the user has been central to their theme since their inception under Si2.  Yet there is no easily found record of the users’ voice.  Even Google has a hard time finding any.  It can find the promise to be the voice as far back as 2001, but no record of the actual voice.

Is a long-standing, unfulfilled promise just hollow when recommitted to today?

5.   Are the proposed DTC business models anti-competitive?
The DTC says in their press release they will “communicate with each other … [on] business models through which [they] access design tools.”  Yet such an action in the opposite direction (all EDA vendors getting together to say how they will sell) would be seen as anti-competitive and a restraint of trade.

Collective price negotiation is potentially anti-competitive if it results in the exercise of market power by buyers.  Are the large EDA consumers banding together to exert monopsony power to create a buyer’s monopoly?

The DTC Vice-Chair, Thomas Harms, is looking both to grow DTC membership (users only) and to start a business model discussion on Twitter. Could those membership actions make the DTC a more powerful buyers’ collective? A conversation on business models has already started on Twitter.  Thomas and the Cadence CMO have exchanged thoughts, as have many others.  In one of Thomas’ tweets, there seems to be some thought that the world would be better if there were one of anything from EDA, bringing to a halt the duplication of products and the waste of R&D resources.  Hmm, are we soon to see one CPU architecture and one supplier, one semiconductor fabricator, one IP supplier, one bus interconnect model, one of everything?

[Tweet from Thomas Harms]

Is IEEE CEDA’s goal to foster collective bargaining on the part of buyers?  Can you guess whether this is where I want my IEEE dues, and profits from the Design Automation Conference, to be spent?

My Hope
Go Forward to the Future. Unleash the Present. Don’t call all zombies from the past back to life.  Become an active voice of the users.  Share what you learn with all.  From where I sit, we seek input from a wide body of users.  We hope this group joins a larger community of users who seek more from EDA, give advice to EDA and in turn get more as we listen to you.

You have my ear….

What do you think?


9 December, 2009

DVCon Newsletter

Hi Gang,

As you may know, in addition to my duties here at Mentor, I’m also the General Chair of DVCon 2010. So it is with two hats on that I encourage you to check out the DVCon website.

With my “General Chair hat” on, I’d like to point out that we’ve actually expanded the conference to add an extra set of half-day tutorials on Monday afternoon, Feb. 22. That’s right! While other conferences are shrinking or disbanding altogether, DVCon is growing! You can see the full technical program on the DVCon site as well.

With my “Mentor hat” on, I draw your attention to Mentor’s half-day tutorial, “A Step-by-step Guide to Advanced Verification” on Tuesday afternoon, Feb. 23. In this tutorial, we’re going to walk you through all the steps of assembling a verification environment, from verification planning to partitioning and assembling your OVM environment at the block, subsystem and system levels. We’ll also show how to take advantage of our other advanced technologies within the OVM framework, like our inFact Intelligent Testbench Automation tool and Questa MVCs for protocol verification. We’ll also discuss low-power and clock-domain crossing verification, processor-based verification and emulation. This tutorial will walk through an actual example that you’ll be able to download and play with yourself after the conference.
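
If you’d like a preview of the flavor of code involved, here is a minimal sketch of the OVM pattern at the heart of that assembly process: an env that builds an agent through the factory so tests can override it later. This is my own illustrative example (class and instance names are hypothetical), not material from the tutorial itself:

`include "ovm_macros.svh"
import ovm_pkg::*;

class my_agent extends ovm_agent;
  `ovm_component_utils(my_agent)
  function new(string name, ovm_component parent);
    super.new(name, parent);
  endfunction
endclass

class my_env extends ovm_env;
  `ovm_component_utils(my_env)
  my_agent agent;
  function new(string name, ovm_component parent);
    super.new(name, parent);
  endfunction
  // build() is where OVM components construct their children
  virtual function void build();
    super.build();
    // Factory creation lets a test substitute a derived agent type
    agent = my_agent::type_id::create("agent", this);
  endfunction
endclass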

So, don’t forget to register for DVCon. If you register before January 29, you’ll get a discount on your registration and also be eligible for the special $149 room rate at the DoubleTree.

I look forward to seeing you there!


9 December, 2009

While I have spent a decade sharing the developments of EDA standards with you monthly in the ModelSim Informant, and encouraging partners to share their integration successes with you in our quarterly Verification Horizons publication, today we move to sharing this information with you in a blog format.

When users were asked how they would like to get ongoing information from us, they said they liked the blog format with Twitter updates.  For about a year now I have engaged in Twitter dialog (Twitterlog?).  It has brought many into conversations among our customers, partners and competitors that might otherwise only be observed at traditional conference and tradeshow events.

As developments on the standards front are of importance to me, I will share them with you.  These electronic missives may come with some frequency; at other times, when there is little to say, I will be silent.  (Time will tell if there are periods of silence.)

The blogging is not moderated or subject to a Mentor blog board review and approval process, to allow a timely and efficient stream-of-consciousness discussion.  Your comments are likewise posted immediately. (Yes, I will do my best to prohibit spam.)  I look forward to the conversations we will have!

Social media is not new to most of us.  But for those who may still not be familiar, you may wish to learn how you can subscribe to my feed and those of Harry Foster, Tom Fitzpatrick and Dave Rich, whom I join on the Verification Horizons Blog.  (Just Google RSS to learn how to automate getting our feeds.)  And if you are a fan of the short message format, you might want to learn about Twitter.  You can either follow me outright (Twitter account required) or simply “Twitter-stalk” me (no account required).

Until next time,

Dennis


8 December, 2009

IEEE Charles Proteus Steinmetz Award

I am honored to chair the 2010 IEEE Charles Proteus Steinmetz Award committee, which will select the 2011 recipient.  The IEEE issued a call for nominations in November 2009 that will close on 31 January 2010.

The committee seeks your nominations.

The IEEE Charles Proteus Steinmetz Award was established by the IEEE Board of Directors in 1979.

It may be presented each year to an individual for exceptional contributions to the development and/or advancement of standards in electrical and electronics engineering.

Recipient selection is administered through the Technical Field Awards Council of the IEEE Awards Board, and I am pleased to be the committee’s chair.  The award is presented to an individual only.

In the evaluation process, the following criteria are considered: engineering and administrative accomplishments and responsibilities; publications (books, standards, papers, conferences); honors; supporting letters; IEEE activities and other organizations; and the quality of the nomination.

The award consists of a bronze medal, certificate and honorarium.

Who was Charles Proteus Steinmetz?

He was the mathematician and electrical engineer who fostered the development of alternating current, which made possible the expansion of the electric power industry in the United States.

As an interesting aside, while alternating current is commonplace, it was only a few years back, on 14 November 2007, that the last direct-current electricity service ceased operation.

In an era of multiple standards, it often takes many years for one to triumph over the other.

While many individuals who have continued to advance power and energy standards within the IEEE have been honored with this award, pioneers of new standards like 802, WiFi and others central to important IEEE standards that impact virtually all of humanity have also received this award.

The awards committee is looking for candidates to stand alongside these giants.  Do you have someone to nominate?

