Archive for December, 2009
I’m excited. I’ve had the pleasure of knowing Cliff Cummings for many years, and I was honored a couple of years ago to have him write the foreword in a book that I published on assertions. Now, we have joined forces to do a set of seminars titled: “Assertion-Based Verification for FPGA and IC Design.” The first seminar will take place on January 19, 2010 in Santa Clara, CA, and you can register online by clicking here.
This six-hour seminar is organized into four sessions. My session is titled: “Industry Perspective and Opportunities in Assertion-Based Verification,” and I intend to provide a survey of today’s ABV landscape, ranging from various industry myths to realities. In fact, I specifically plan on addressing the issues raised in my recent blog titled “Evolution is a tinkerer.” In addition, I’ll talk about what characterizes an organization that has successfully adopted ABV, and then contrast it against organizations that are struggling or have failed in their attempt to integrate ABV into their flow.
The second and third sessions will focus on “Advanced Debugging with Assertions” and “Effective Coverage Using Assertions.”
We will conclude the seminar with an extended session covering “Basic SystemVerilog Assertions Training” by our SystemVerilog guru Cliff Cummings! His presentation details practical SystemVerilog assertion tricks that you can apply today to your own work, as well as methodological recommendations to improve efficiency when adopting ABV.
I hope everyone has a peaceful, happy holiday, and I look forward to meeting you on January 19 at the ABV seminar in Santa Clara, CA!
I see that Synopsys has finally released VMM1.2. Congratulations, guys. There will be plenty of opportunity over the coming weeks to discuss the relative merits of OVM vs. the OVM features that have been “borrowed” and jammed into this new version of VMM (factory, phasing, hierarchy…), but I’d like to talk a bit in this post about Synopsys’ unique approach to version numbering.
Let me just say that it’s patently obvious that Synopsys chose to hide the fact that this is a dramatically different VMM by calling it version 1.2 instead of 2.0, which is what they should have called it. Even though the accepted practice in our industry is to increase the major version number for a change of this magnitude, Synopsys is trying to convince everyone that it’s an incremental change to the methodology.
The fact is that the biggest advantage VMM had over OVM was the fact that it had been around longer. OVM has the advantages of being more full-featured, flexible, modular and reusable. Once VMM users understand the extent to which they’re going to have to rewrite their existing VMM code to take advantage of the new 2.0 features, the continuity argument will be gone and they may as well take a look at OVM too. And new users will now choose between a stable, proven OVM and a brand-spankin’-new VMM. The tables have turned, and Synopsys doesn’t want you to know this.
In fact, we’ve already had one customer try to compile their existing VMM1.1 code against the VMM2.0 (I mean 1.2) library without success – not a good sign for backward compatibility. A quick look at the first three lines of the sv/std_lib/vmm.sv file shows why:
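(The original listing did not survive in this archive. The following is an illustrative sketch of the kind of compile-time switch being described; the `VMM_11 macro comes from the text, but the include paths are hypothetical, not the actual file contents.)

```systemverilog
// Illustrative sketch only -- the real vmm.sv contents may differ.
// Compiling with +define+VMM_11 selects the legacy 1.1 library;
// otherwise a completely different library is pulled in.
`ifdef VMM_11
  `include "vmm11/vmm.sv"   // old VMM 1.1 code (hypothetical path)
`else
  `include "vmm12/vmm.sv"   // new VMM 1.2 code (hypothetical path)
`endif
```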
In other words, if you want to use your existing VMM1.1 code, you +define+VMM_11 to get the old library code, otherwise you get a completely different library! Tell me that’s not a major release!
Perhaps an alternate metric would be helpful. I have, sitting on my desk, a copy of the Verification Methodology Manual for SystemVerilog. It is 503 pages long. The VMM1.2 User Guide, which incorporates the original book, along with all the new 2.0 (rats, did it again) features, weighs in at a whopping 1408 pages! That’s nearly a 3x increase in material.
By contrast, the OVM User Guide is only 158 pages, so even when combined with the OVM Reference Guide (384 pages), you’ve still got nearly 2.5x more stuff to go through with VMM. We could even throw in The OVM Cookbook (235 pages) and OVM is still half the size of VMM2.0.
It will be up to you to decide whether to take a chance on the new VMM2.0 or go with the more stable OVM. By the way, don’t be surprised to see an OVM2.1 rather soon that adds some new features to address user requests we’ve gotten. These new enhancements are completely backward-compatible with existing code, unlike VMM2.0.
Come to think of it, the only justification for calling VMM1.2 a minor release is that it doesn’t really advance the state of the art at all. Since all they’re doing is adding functionality to VMM that OVM has had for over a year, I guess it’s OK to call it VMM1.2 after all.
As they say, “Imitation is the sincerest form of flattery.”
Just in time for the holidays!
IEEE Std. 1800™-2009, aka SystemVerilog 2009, is ready for purchase and download from the IEEE. The standard was developed by the SystemVerilog Working Group and recently approved by the IEEE. It is an entity project of the IEEE jointly sponsored by the Corporate Advisory Group (CAG) and the Design Automation Standards Committee (DASC). The working group members represented Accellera, Sun Microsystems Inc, Mentor Graphics Corporation, Cadence Design Systems, Intel Corporation and Synopsys along with numerous other volunteers from around the world.
The publication of the standard culminates the work of representatives from the companies above along with numerous other interested parties and volunteers. Thank you to all who made this happen!
This standard represents a merger of two previous standards: the IEEE Std 1364-2005 Verilog Hardware Description Language (HDL) and the IEEE Std 1800-2005 SystemVerilog Unified Hardware Design, Specification, and Verification Language.
In these previous standards, Verilog was the base language and defined a completely self-contained standard. SystemVerilog defined a number of significant extensions to Verilog, but IEEE Std 1800-2005 was not a self-contained standard; IEEE Std 1800-2005 referred to, and relied on, IEEE Std 1364-2005. These two standards were designed to be used as one language.
Merging the base Verilog language and the SystemVerilog extensions into a single standard enables users to have all information regarding syntax and semantics in a single document. The standard serves as a complete specification of the SystemVerilog language. The standard contains the following:
- The formal syntax and semantics of all SystemVerilog constructs
- Simulation system tasks and system functions, such as text output display commands
- Compiler directives, such as text substitution macros and simulation time scaling
- The Programming Language Interface (PLI) mechanism
- The formal syntax and semantics of the SystemVerilog Verification Procedural Interface (VPI)
- An Application Programming Interface (API) for coverage access not included in VPI
- Direct programming interface (DPI) for interoperation with the C programming language
- VPI, API, and DPI header files
- Concurrent assertion formal semantics
- The formal syntax and semantics of standard delay format (SDF) constructs
- Informative usage examples
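As a small taste of what is now specified in a single document, here is a minimal example (my own illustration, not taken from the standard) combining a compiler directive, a concurrent assertion and a system task:

```systemverilog
`timescale 1ns/1ps  // compiler directive: simulation time scaling

module req_ack_check(input logic clk, req, ack);
  // Concurrent assertion: every request must be acknowledged
  // within 1 to 3 clock cycles.
  property p_req_ack;
    @(posedge clk) req |-> ##[1:3] ack;
  endproperty

  assert property (p_req_ack)
    else $error("ack did not follow req within 3 cycles");  // system task
endmodule
```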
Where to Download & Purchase
For users who have access to IEEE Xplore, free downloads are available here.
Users who wish to purchase a single copy can visit Shop IEEE (here) and search for “1800”. IEEE Member price is $260 and non-member price is $325.
Just wanted to let you know that the latest issue of Verification Horizons is now out. You can find it here. Aside from learning about the extent (or lack thereof) of my carpentry skills (read the Editor’s Letter to see what I’m talking about), you’ll also see these interesting articles:
Page 6… Evolving Your Organization’s ABV Capabilities
by Harry Foster, Chief Verification Scientist, DVT Mentor Graphics
Page 10… Advanced Static Verification is Indispensable
by Ping Yeung Ph.D., DVT Mentor Graphics
Page 14… Evolving the Coverage-Driven Verification Flow
by Matthew Ballance, Technical Marketing, SLE Division, Mentor Graphics
Page 17… Verification of an Ethernet PHY DUT Using Questa MVC
by Pankaj Goel, Mentor Graphics
Page 22… Breaking Your Own Code: A Design Engineer’s Perspective on Verification Using Questa and OVM.
by Leeja Mathew, Mahasweta Das, Reshma Shetty; Silicon Interfaces
Page 26… SystemVerilog FrameWorks™ Scoreboard: An Open Source Implementation Using OVM.
by Dr. Ambar Sarkar, Chief Verification Technologist, Paradigm Works Inc.
Page 31… SEmulation: Use Your Emulation Board as a Hardware Accelerator for ModelSim SE.
by Andreas Schwarztrauber, Gleichmann & Co. Electronics GmbH
Page 35… Coding Concise SystemVerilog Assertions.
by Clifford E. Cummings, Sunburst Design, Inc.
Page 41… Methodology for Board Level Functional Simulation and Hardware/Software Co-Verification Using Seamless.
by John Gryba, Senior Hardware Design Engineer, Alcatel-Lucent
Go check it out and let us know what you think.
I was recently quoted in an EDA DesignLine blog as saying that “it is a myth that ABV is a mainstream technology.” Actually, the original quote comes from an extended abstract I wrote for an invited tutorial at Computer-Aided Verification (CAV) in 2008 titled Assertion-Based Verification: Industry Myths to Realities. My claim is based on the Farwest Research 2007 study (commissioned and sponsored by Mentor Graphics) that found approximately 37 percent of the industry had adopted simulation-based ABV techniques, and 19 percent of the industry had adopted formal ABV techniques. Now, those of you who know me know that I am an optimist—and therefore the statistics from these industry studies reveal a wonderful opportunity for design projects to improve themselves. However, the problem of adopting advanced functional verification techniques is not just limited to ABV. For example, the study revealed that only 48 percent of the industry performs code coverage. Let me repeat that, I said code coverage, not even something as exotic as functional coverage (which was observed to be about 40 percent of the industry)! Furthermore, only about 41 percent of the industry has adopted constrained random verification.
Now, we could argue about which is the best approach to measuring coverage or achieving functional closure, but that is not the point. The question is, how do we as an industry evolve our verification capabilities beyond 1990 best practices?
In my mind, one of the first steps in the evolutionary process is to define a model for assessing an organization’s existing verification capabilities. For a number of years I’ve studied variants of the Capability Maturity Model Integration (that is, CMM and CMMI) as a possible tool for assessment. After numerous discussions with many industry thought leaders and experts, I’ve concluded that the CMM is really not an ideal model for assessing hardware organizations. Nonetheless, there is certainly a lot we can learn from observing CMM applied on actual software projects.
For those of you unfamiliar with the CMM, its origins date back to the early 1980s. During this period, the United States Department of Defense established the Software Engineering Institute at Carnegie Mellon University in response to a perceived software development crisis related to escalating development costs and quality problems. One of the key contributions resulting from this effort was the published work titled The Capability Maturity Model: Guidelines for Improving the Software Process. The CMM is a framework for assessing effective software process, which provides an evolutionary path for improving an organization’s processes from ad hoc, immature processes to developed, mature, disciplined ones.
Fundamental to maturing an organization’s process capabilities is an investment in developing skills within the organization. To assist in this effort, we have launched an ambitious project to evolve an organization’s advanced functional verification skills through the Verification Academy. In fact, we have an introductory module titled Evolving Capabilities that provides my first attempt at a simple assessment model. I anticipate that this model will itself evolve over time as I receive valuable feedback on refining and improving it. Nonetheless, the simple Evolving Capabilities model as it exists today provides a wonderful framework for organizing multiple modules focused on evolving an organization’s advanced functional verification capabilities.
I realize that evolving technical skills is obviously only part of the solution to successfully advancing the industry’s functional verification capabilities. Yet, education is an important step. I’d be interested in hearing your thoughts on the subject. Why do you think the industry as a whole has been slow in its adoption of advanced functional verification techniques? What can be done to improve this situation?
One element of my IEEE Standards Association (SA) volunteer activities is to represent the SA to the IEEE New Initiatives Committee (NIC). In December, the group gets together face-to-face to review projects underway and to approve new projects for the next year. The new initiative program is designed to identify potential new products, programs, and services that will provide significant benefit to IEEE members, customers, the technical community and the public, and that promise to have a lasting impact on the IEEE.
Initiatives are of strategic importance and must demonstrate a clear benefit to IEEE. The process allows submission of a proposal at any time during the year, not just in December.
The IEEE SA worked with the NIC several years back to help it study and launch a Patent Pools project to expand possible IEEE standards business to facilitate faster adoption of standards where patent use is needed to implement a standard and a patent pool could help.
While the NIC has supported programs that facilitate standards use, new initiatives run the gamut of global IEEE interests. One project with several years of support that has had an impact on humanity is the IEEE Humanitarian Technology Network (HTN). A map of its global projects is maintained, showing all that is going on around the world.
Any IEEE member, volunteer or IEEE organizational unit can submit a proposal. Do you have a new initiative idea for the IEEE? If so, you might want to submit your idea to the NIC for consideration.
IEEE new initiative proposals are submitted using one of two submission processes.
- The New Initiative Process (NIP) is for projects that require a minimum of US$100,000 funding for the first year.
Download IEEE New Initiatives Proposal Form (DOC, 229 KB)
- The Seed Grant Process (SGP) is for projects that are experimental and innovative and require US$25,000 of funding or less. Download IEEE New Initiatives Seed Grant Proposal Form (DOC, 71 KB)
Back to the Future; Unleash the Past
No, I’m not talking about the Michael J. Fox and Christopher Lloyd movie. Nor am I talking about unleashing zombies from Zombieland to make the present the dead past. I’ve given more thought to the DTC after reading some questions Brian Bailey asked in his “I don’t understand this new IEEE EDA User group.” His post led me to dig deeper, ask questions and issue a Zombie Alert. (OK, the zombie alert is just humor.)
It appears that after more than a decade the Design Technology Council (DTC) has announced it has abandoned Si2 in favor of IEEE CEDA (Council on EDA) and it has changed its name to the Design Technology Committee. It keeps the DTC acronym, presumably in a move to save on a redo of letterhead. (OK, I can’t resist a bit of sarcasm.)
The announcement made me think we are either going back to the future or unleashing the dead past. Watch out – the zombies just might just be on the loose.
Given that CEDA is a collection of IEEE Societies that are replete with immense technical talent and brainpower to help address next-generation design issues, the DTC may well be in a better home. They also intend to interact with other standards-setting organizations as well. For these reasons, I think this is a great move!
Yet, when I read the whole of their press release again, I’m left with skepticism that was only fed by Brian’s words on the topic as well.
1. Captive EDA represents itself as the user, and Commercial EDA as not?
Are they really users? The corporate affiliations of the DTC members are impressive! Their corporations are associated with some of the most advanced designs being done today. If anyone knows hard problems, they do. If anyone wants solutions, EDA should listen to them.
But when I read the titles of the current members I see a majority of them have CAD or EDA in them. Are they actually designers or Captive EDA representatives? Is Captive EDA any different from a Commercial EDA company? (I came from a Captive EDA group to Mentor Graphics, so I have some history here. Maybe that’s a topic of a future blog.)
Is this action to form inside CEDA being taken due to a lack of technology response by Commercial EDA or to represent Captive EDA self-interests? Can Captive EDA be seen as yet another middleman in a vendor/supplier relationship? Does that promote business efficiency or is it suboptimal?
Can a closed group like this that segregates Captive and Commercial EDA be seen as restraining and hampering trade? Or could it be that Captive EDA seeks to be a focal point for Commercial EDA business relationships to rationalize its existence? I do note that Gary Smith has presented his findings on a resurgence of Captive EDA. One can only conclude the resurgence is borne out of necessity and the failure of Commercial EDA to address some pressing technical design challenges.
2. Something “NEW” was announced; but it is the same “OLD” thing
Can a group that announces it is new by concealing its past be trusted? I’ve seen some blog comments that support a conclusion of confusion. Questions are asked: Is the DTC still in Si2? Does the industry need another DTC?
The truth is this group has been around for a long time, not “newly formed” as they would suggest to all. The simple fact is the DTC moved. Concealing this information makes no sense to me. It just heightens my suspicion.
3. It is a CLOSED group
The press release says they “consists of leaders exclusively from semiconductor and systems companies who use EDA tools.” And they seek to expand as they say “nominations for new members are actively being solicited.”
Don’t get me wrong. I think it is great the DTC explores ways to rejuvenate itself and drive greater relevance. It is good they have made a call for nominations for additional members. The DTC needs to expand its ranks, as it is cloistered in its current configuration and out of touch with current design practice in the world.
While no one group may be able to address all problems and solve all issues at once, a group that lacks representatives who play important roles in an age of design reuse, where IP suppliers are important, and that does not represent the swelling ranks of programmable logic designers, is not relevant to the majority of design practice today.
Since EDA vendors are not invited to participate, may I offer this thought through this venue: Don’t stay parochial. Don’t go back to the past. Recognize the future encompasses more than who you have been or are today. The challenges of a majority of designers globally should be part of your focus as you encourage interoperable design flows. To be the voice, you must be the body.
4. The DTC will communicate; but they will make it hard for anyone to hear what they say.
How can a group that is to be the voice of users be heard when they are closed?
If they want to be the user voice, what have they said over the past decade? The voice of the user has been central to their theme since their inception under Si2. Yet there is no easily found record of the users’ voice. Even Google has a hard time finding any record of it. It can find the promise to be the voice as far back as 2001, but no record of the actual voice.
Is a long-standing promise, still unfulfilled, just hollow when committed to again today?
5. Are the proposed DTC business models anti-competitive?
The DTC says in their press release they will “communicate with each other … [on] business models through which [they] access design tools.” Yet such an action in the opposite direction (all EDA vendors getting together to say how they will sell) would be seen as anti-competitive and a restraint of trade.
Collective price negotiation is potentially anti-competitive if it results in the exercise of market power by buyers. Are the large EDA consumers banding together to exert monopsony power to create a buyer’s monopoly?
The DTC Vice-Chair, Thomas Harms, is looking both to grow DTC membership (users only) and to start a business-model discussion on Twitter. Could those membership actions make the DTC a more powerful buyers’ collective? A conversation on business models has already started on Twitter; Thomas and the Cadence CMO have exchanged thoughts, as have many others. In one of Thomas’ tweets, there seems to be some thought that the world would be better with just one of everything from EDA, to halt the duplication of products and the waste of R&D resources. Hmm, are we soon to see one CPU architecture and one supplier, one semiconductor fabricator, one IP supplier, one bus interconnect model – one of everything?
Is IEEE CEDA’s goal to foster collective bargaining on the part of buyers? Can you guess if this is where I want my IEEE dues and the profits from the Design Automation Conference to be spent?
Go Forward to the Future. Unleash the Present. Don’t call all zombies from the past back to life. Become an active voice of the users. Share what you learn with all. From where I sit, we seek input from a wide body of users. We hope this group joins a larger community of users who seek more from EDA, give advice to EDA and in turn get more as we listen to you.
You have my ear….
What do you think?
As you may know, in addition to my duties here at Mentor, I’m also the General Chair of DVCon 2010. So it is with two hats on that I encourage you to check out the DVCon website.
With my “General Chair hat” on, I’d like to point out that we’ve actually expanded the conference to add an extra set of half-day tutorials on Monday afternoon, Feb. 22. That’s right! While other conferences are shrinking or disbanding altogether, DVCon is growing! You can see the full technical program on the DVCon site as well.
With my “Mentor hat” on, I draw your attention to Mentor’s half-day tutorial, “A Step-by-step Guide to Advanced Verification” on Tuesday afternoon, Feb. 23. In this tutorial, we’re going to walk you through all the steps of assembling a verification environment, from verification planning to partitioning and assembling your OVM environment at the block, subsystem and system levels. We’ll also show how to take advantage of our other advanced technologies within the OVM framework, like our inFact Intelligent Testbench Automation tool and Questa MVCs for protocol verification. We’ll also discuss low-power and clock-domain crossing verification, processor-based verification and emulation. This tutorial will walk through an actual example that you’ll be able to download and play with yourself after the conference.
So, don’t forget to register for DVCon. If you register before January 29, you’ll get a discount on your registration and also be eligible for the special $149 room rate at the DoubleTree.
I look forward to seeing you there!
While I have spent a decade sharing the developments of EDA standards with you monthly in the ModelSim Informant and encouraging partners to share their integration successes with you in our quarterly Verification Horizons publication, today we move to sharing this information with you in a blog format.
When users were asked how they would like to get ongoing information from us, they said they like the blog format with Twitter updates. For about a year now I have engaged in Twitter dialog (Twitterlog?). It has brought many into conversations between our customers, partners and competitors that might otherwise only be observed at traditional conference and tradeshow events.
As developments on the standards front seem important to me, I will share them with you. These electronic missives may come with some frequency; at other times, when there is little to say, I will be silent. (Time will tell if there are periods of silence.)
The blogging is not moderated or subject to a Mentor blog board review and approval process, to allow a timely and efficient stream-of-consciousness discussion. Your comments are likewise posted immediately. (Yes, I will do my best to prohibit spam.) I look forward to the conversations we will have!
Social media is not new to most of us. But for those who may still not be familiar you may wish to learn how you can subscribe to my feed and that of Harry Foster, Tom Fitzpatrick or Dave Rich, whom I join on the Verification Horizons Blog. (Just Google RSS to learn how to automate getting our feeds.) And if you are a fan of the short message format, you might want to learn about Twitter. You can either follow me outright, (Twitter account required) or you can simply “Twitter-stalk” me (no account required).
Until next time,
IEEE Charles Proteus Steinmetz Award
I am honored to chair the IEEE Charles Proteus Steinmetz Award 2010 committee for selecting the 2011 recipient. The IEEE issued a call for nominations in November 2009 that will close on 31 January 2010.
The committee seeks your nominations.
The IEEE Charles Proteus Steinmetz Award was established by the IEEE Board of Directors in 1979.
It may be presented each year to an individual for exceptional contributions to the development and/or advancement of standards in electrical and electronics engineering.
Recipient selection is administered through the Technical Field Awards Council of the IEEE Awards Board, and I am pleased to be the committee’s chair. The award is presented to an individual only.
In the evaluation process, the following criteria are considered: engineering and administrative accomplishments and responsibilities; publications (books, standards, papers, conferences); honors; supporting letters; IEEE activities and other organizations; and the quality of the nomination.
The award consists of a bronze medal, certificate and honorarium.
Who was Charles Proteus Steinmetz?
He was the mathematician and electrical engineer who fostered the development of alternating current, which made possible the expansion of the electric power industry in the United States.
In an era of multiple standards, it often takes many years for one to triumph over the other.
While many individuals who have continued to advance power and energy standards within the IEEE have been honored with this award, pioneers of newer standards central to the IEEE, such as 802 and WiFi, that impact virtually all of humanity have also received it.
The awards committee is looking for candidates to stand alongside these giants. Do you have someone to nominate?
About Verification Horizons BLOG
This blog provides an online forum with weekly updates on concepts, values, standards, methodologies and examples to assist with understanding what advanced functional verification technologies can do and how to most effectively apply them. We're looking forward to your comments and suggestions on the posts to make this a useful tool.