Posts Tagged ‘Low Power’

5 January, 2017

Face facts: power supply nets are now effectively functional nets, yet they are typically not defined in a design’s RTL. Still, the proper connection and behavior of power nets and power-managed logic – power down, retention, recovery, and so on – must be verified like any other element of the DUT. The question, then, is how D&V engineers can link their testbench code to the IEEE 1801 Unified Power Format (UPF) files that describe the design’s low power structures and behaviors, so that verification of all that low power “stuff” can be included in the verification plan.

A real power distribution setup

Fortunately, the answer is relatively straightforward. In a nutshell, the top level UPF supply ports and supply nets provide hooks into the design, libraries, and annotated testbenches through the UPF connect_supply_net and connect_supply_set commands, which define the complete power network connectivity. These top level supply ports and nets are collectively known as supply pads or supply pins (e.g., VDD, VSS), and the UPF low power standard recommends how supply pads may be referenced in testbenches and extended to manipulate power network connectivity during a testbench simulation. It thus becomes possible to switch power ‘On’ and ‘Off’ for any power domain in the design through the supply pads referenced in the testbench.

All the necessary HDL testbench connections are made by importing the UPF packages shipped with the power-aware simulation tool. Even better, the IEEE 1801 LRM defines standard UPF packages for Verilog, SystemVerilog, and VHDL, so a testbench can import the appropriate package and manipulate the supply pads of the design under verification. The following are syntax examples for importing or using the UPF packages in the different HDL variants.
Example of UPF package setup for Verilog or SystemVerilog testbench

import UPF::*;
module testbench;
...
endmodule

Note: UPF packages can be imported within or outside of the module-endmodule declaration.

Example UPF package setup for a VHDL testbench

library ieee;
use ieee.UPF.all;

entity testbench is
...
end entity;

architecture arch of testbench is

begin
...
end arch;

The “import UPF::*” statement and the “use ieee.UPF.all;” clause make available the functions that drive the design’s supply pads directly from the testbench. Once these packages are referenced in the testbench, the simulator automatically locates them in its installation tree and makes their built-in functions available in the simulation environment. The following examples describe two of these functions, supply_on and supply_off, along with their arguments.

Example functions for Verilog and SystemVerilog testbenches to drive supply pads

supply_on( string pad_name, real value = 1.0, string file_info = "");

supply_off( string pad_name, string file_info = "" );

Note: Questa Power Aware Simulator (PA SIM) users do not have to deal with the third argument, string file_info = "" – Questa takes care of it automatically.

Example functions for a VHDL testbench driving supply pads

supply_on ( pad_name : IN string ; value : IN real ) return boolean;

supply_off ( pad_name : IN string ) return boolean;

Regardless of the language used, pad_name must be a string constant, and a valid top level UPF supply port must be passed in this argument. A real voltage value passed to supply_on denotes power “On”, while supply_off, which takes no value argument, denotes power “Off”. Questa PA-SIM obtains the top module design name from the UPF set_scope command shown below.

Now that the basic package binding and initial wiring are set up, how do you actually control the design supply pads from a testbench? This is where the aforementioned UPF connect_supply_net (or connect_supply_set) and set_scope commands come in, as the following code examples show.

Example UPF with connect_supply_net for utilizing supply pads from the testbench

set_scope cpu_top
create_power_domain PD_top
......

# IMPLEMENTATION UPF Snippet
# Create top level power domain supply ports
create_supply_port VDD_A -domain PD_top
create_supply_port VDD_B -domain PD_top
create_supply_port VSS -domain PD_top

# Create supply nets
create_supply_net VDD_A -domain PD_top
create_supply_net VDD_B -domain PD_top
create_supply_net VSS -domain PD_top

# Connect top level power domain supply ports to supply nets
connect_supply_net VDD_A -ports VDD_A
connect_supply_net VDD_B -ports VDD_B
connect_supply_net VSS -ports VSS

Next, the supply ports VDD_A, VDD_B, and VSS connected above via connect_supply_net can be driven directly from the testbench, as shown in the following code example.

import UPF::*;
module testbench;
...
reg ISO_ctrl;
...
initial begin
  #100
  ISO_ctrl = 1'b1;
  // A non-zero real value (1.10) represents the supply voltage
  // and signifies power On
  supply_on ("VDD_A", 1.10);
  // Per the UPF LRM, ground (VSS) is switched On at 0.0
  supply_on ("VSS", 0.0);
  ...
  #200
  supply_on ("VDD_B", 1.10);
  ...
  #400
  supply_off ("VDD_A");   // no voltage argument: power Off
  ...
end
endmodule

That’s all there is to it!

As you can glean from the examples, it is quite easy to model a voltage regulator or a power management unit in the testbench through the functions supply_on and supply_off, mimicking a real chip’s power operations. Of course, there are many more functions available in these UPF packages, but hopefully this article is enough to get you started.

Joe Hupcey III
Progyna Khondkar
for the Questa Low Power Design & Verification product team

Related posts:

Part 11: The 2016 Wilson Research Group Functional Verification Study on ASIC/IC Low Power Trends

3 Things About UPF 3.0 You Need to Know Now

Whitepaper: Advanced Verification of Low Power Designs


21 November, 2016

ASIC/IC Power Trends

This blog is a continuation of a series of blogs related to the 2016 Wilson Research Group Functional Verification Study (click here).  In my previous blog (click here), I presented our study findings on various verification language and library adoption trends. In this blog, I focus on power trends.

Today, we see that about 72 percent of design projects actively manage power with a wide variety of techniques, ranging from simple clock-gating, to complex hypervisor/OS-controlled power management schemes (see Figure 1). This is essentially unchanged from our 2014 study.


Figure 1. ASIC/IC projects working on designs that actively manage power

Figure 2 shows the various aspects of power management that design projects must verify (for those 72 percent of design projects that actively manage power). The data from our study suggest that many projects are moving to more complex power-management schemes that involve software control. This adds a new layer of complexity to a project’s verification challenge, since these more complex power-management schemes often require emulation to fully verify.


Figure 2. Aspects of power-managed design that are verified

Since the power intent cannot be directly described in an RTL model, alternative supporting notations have recently emerged to capture the power intent. In the 2016 study, we wanted to get a sense of where the industry stands in adopting these various notations and what we found was essentially no change from our 2014 study. For projects that actively manage power, Figure 3 shows the various standards used to describe power intent that have been adopted. Some projects are actively using multiple standards (such as different versions of UPF or a combination of CPF and UPF). That’s why the adoption results do not sum to 100 percent.


Figure 3. Notation used to describe power intent

In an earlier blog in this series, I provided data that suggest a significant amount of effort is being applied to ASIC/IC functional verification. An important question the various studies have tried to answer is whether this increasing effort is paying off. In my next blog (click here), I present verification results findings in terms of schedules, number of required spins, and classification of functional bugs.

Quick links to the 2016 Wilson Research Group Study results


6 September, 2016

UPF 3.0 has been an official IEEE standard since January, but its most valuable capabilities have only become clear as EDA vendors and users have begun to incorporate the corresponding design & verification (D&V) features. Looking across our user base, the following three items have come to the forefront of the UPF 3.0 adoption wave. Each of these topics will be covered in the upcoming DVCon India tutorial on “Advanced Validation and Functional Verification Techniques for Complex Low Power SoCs,” but here is a sneak preview so you can be prepared to ask the presenters more in-depth questions.

1 – Power Intent Abstraction enabled by Successive Refinement: this UPF 3.0-enabled methodology eliminates one of the biggest unintentional limitations of prior UPF versions. In the past, designers were stuck defining their power intent in very specific detail, without the ability to perform hierarchical composition of power states or to refine the power states of an IP to fit within the SoC context. While this is fine at the implementation level, it made it impossible to effectively abstract the power intent to higher levels to facilitate early and comprehensive verification. In practice, this meant that IP creators and users had to do a lot of editing and scripting to give some semblance of flexibility to their D&V flows, and verification often had to wait until all the implementation details were known. Now, with UPF 3.0, an IP provider can capture the low power constraints for an IP block without limiting IP consumers and back-end implementers to any particular configuration. As long as the constraints are met, the IP user can configure the IP for their particular application with ease.

2 – Automation of Power-Aware Static & Dynamic Checking: the “completeness” possible with a UPF 3.0 power intent specification, combined with the DUT’s RTL (and attributes from Liberty, if so desired) enables a unified view of the design’s power management architecture. In turn, this enables static checking of the power architecture, dynamic power sequence checking, as well as power-aware clock domain crossing (CDC) analysis. In plain English, EDA tools can now automatically find power management issues from the basic (e.g., Are the isolation cells inserted where required by power state definitions?) to the pretty-much-impossible-to-do-by-hand, exhaustive, power-aware CDC analysis for metastability risks when different asynchronous clocks crisscross different power domains.

3 – Automation of Power-Aware Functional Coverage: another valuable benefit of UPF 3.0 is the way it enables automatic power state coverage. From power state entry, to exit, through all the legal transitions, to cross-state coverage, the verification of all of these behaviors can be automated thanks to parsing of the “add_power_state” specification created by the designer. Plus, debug and analysis is no longer a matter of reading tea leaves from a jumble of waveforms and log files. Instead, graphical views of circuit activity over time can clearly show the flow of power states as each DUT use case is verified, with all the results stored in an IEEE standard Unified Coverage Database (UCDB) file for export to verification planning and management tools.

While there is much more to UPF 3.0 than these three items, it bears repeating that these have been in the vanguard of UPF 3.0 adoption by the companies on the cutting edge of low power D&V. Fortunately, the upcoming DVCon India tutorial will provide much greater detail on these flows and more so you can learn how this new generation of the UPF standard can help you meet your project’s goals faster and more effectively.

Until then, may your coverage be high and your power consumption be low,

Joe Hupcey III


18 August, 2016

A great technical program awaits you at DVCon India 2016!  The DVCon India Steering Committee and Technical Program Committee have put together another outstanding program.  The two-day event splits into two main technical tracks: one for the Design Verification professional [DV Track] and the other for the Electronic System Design professional [ESL Track].  The conference will be held on Thursday & Friday, 15-16 September 2016 at the Leela Palace in Bangalore.  The conference opens with industry keynotes and a round of technical tutorials on the first day.  Wally Rhines, Mentor Graphics CEO, will give the first keynote of the morning on “Design Verification – Challenging Yesterday, Today and Tomorrow.”

Mentor Graphics at DVCon India

In addition to Wally’s keynote, Mentor Graphics has sponsored several tutorials which, combined with the other conference tutorials, share information, techniques, and tips-and-tricks that can be applied to your current design and verification challenges.

The conference’s other technical elements (Posters, Panels & Papers) will likewise feature Mentor Graphics participants.  You should visit the DVCon India website for the full details on the comprehensive and deep program that has been put together.  The breadth of topics makes it an outstanding program.

Accellera Portable Stimulus Standard (PSS)

The hit of the first DVCon India was the early discussion about the emerging standardization activity in Accellera on “Portable Stimulus.”  In fact, at the second DVCon India a follow-up presentation on PSS standardization was requested and given as well (Leveraging Portable Stimulus Across Domains and Disciplines).  This year will be no exception to cover the PSS topic.

The Accellera Tutorial for DVCon India 2016 covers the emerging Portable Stimulus Standard.  The last thing any design and verification team wants to do is rewrite a test as a design progresses along the path from concept to silicon.  The Accellera PSS tutorial will share the concepts being ratified in the standard to bring you the next generation of verification productivity and efficiency, so you can avoid exactly that.  Don’t be surprised if the PSS tutorial is standing room only; if you want a seat, I suggest you come to the room early.

Register

To attend DVCon India, you must register.  Discounted registration rates are available through 30 August 2016.  Click here to register!  I look forward to seeing you at DVCon India 2016! If you can’t join us in person, track the Mentor team on social media or on Twitter with hashtag #DVCon.


8 August, 2016

This is the first in a series of blogs that presents the findings from our new 2016 Wilson Research Group Functional Verification Study. Similar to my previous 2014 Wilson Research Group functional verification study blogs, I plan to begin this set of blogs with an exclusive focus on FPGA trends. Why? For the following reasons:

  1. Some of the more interesting trends in our 2016 study are related to FPGA designs. The 2016 ASIC/IC functional verification trends are overall fairly flat, which is another indication of a mature market.
  2. Unlike the traditional ASIC/IC market, historically very few studies have been published on FPGA functional verification trends. We started studying the FPGA market segment back in the 2010 study, and we have now collected sufficient data to confidently present industry trends related to this market segment.
  3. Today’s FPGA designs have grown in complexity—and many now resemble complete systems. The task of verifying SoC-class designs is daunting, and this rising complexity has forced many FPGA projects to mature their verification processes. The FPGA-focused data I present in this set of blogs will support this claim.

My plan is to release the ASIC/IC functional verification trends through a set of blogs after I finish presenting the FPGA trends.

Introduction

In 2002 and 2004, Collett International Research, Inc. conducted its well-known ASIC/IC functional verification studies, which provided invaluable insight into the state of the electronic industry and its trends in design and verification at that point in time. However, after the 2004 study, no additional Collett studies were conducted, which left a void in identifying industry trends. To address this dearth of knowledge, five functional verification focused studies were commissioned by Mentor Graphics in 2007, 2010, 2012, 2014, and 2016. These were world-wide, double-blind, functional verification studies, covering all electronic industry market segments. To our knowledge, the 2014 and 2016 studies are two of the largest functional verification studies ever conducted. This set of blogs presents the findings from our 2016 study and provides invaluable insight into the state of the electronic industry today in terms of both design and verification trends.

Study Background

Our study was modeled after the original 2002 and 2004 Collett International Research, Inc. studies. In other words, we endeavored to preserve the original wording of the Collett questions whenever possible to facilitate trend analysis. To ensure anonymity, we commissioned Wilson Research Group to execute our study. The purpose of preserving anonymity was to prevent biasing the participants’ responses. Furthermore, to ensure that our study would be executed as a double-blind study, the compilation and analysis of the results did not take into account the identity of the participants.

For the purpose of our study we used a multiple sampling frame approach that was constructed from eight independent lists that we acquired. This enabled us to cover all regions of the world—as well as cover all relevant electronic industry market segments. It is important to note that we decided not to include our own account team’s customer list in the sampling frame. This was done in a deliberate attempt to prevent biasing the final results. My next blog in this series will discuss other potential bias concerns when conducting a large industry study and describe what we did to address these concerns.

After data cleaning to remove inconsistent or random responses (e.g., someone who answered only “a” on all questions), the final sample size consisted of 1703 eligible participants (i.e., n=1703). This is approximately 90% of the size of our 2014 study (i.e., 2014 n=1886). To put this figure in perspective, the famous 2004 Ron Collett International study sample size consisted of 201 eligible participants.

Unlike the 2002 and 2004 Collett IC/ASIC functional verification studies, which focused only on the ASIC/IC market segment, our studies were expanded in 2010 to include the FPGA market segment. We have partitioned the analysis of these two different market segments separately, to provide a clear focus on each. One other difference between our studies and the Collett studies is that our study covered all regions of the world, while the original Collett studies were conducted only in North America (US and Canada). We have the ability to compile the results both globally and regionally, but for the purpose of this set of blogs I am presenting only the globally compiled results.

Confidence Interval

All surveys are subject to sampling errors. To quantify this error in probabilistic terms, we calculate a confidence interval. For example, we determined the “overall” margin of error for our study to be ±2.36% at a 95% confidence interval. In other words, this confidence interval tells us that if we were to take repeated samples of size n=1703 from a population, 95% of the samples would fall inside our margin of error ±2.36%, and only 5% of the samples would fall outside. Obviously, response rate per individual question will impact the margin of error. However, all data presented in this blog has a margin of error of less than ±5%, unless otherwise noted.
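As a sanity check, the quoted figure is consistent with the standard worst-case formula for a sampled proportion, z·sqrt(p(1−p)/n) with p = 0.5. The following is a minimal sketch (assuming the simple formula; the study’s published ±2.36% may also reflect survey-specific adjustments not described here):

```python
import math

def margin_of_error(n, z=1.96, p=0.5):
    """Worst-case margin of error for a sampled proportion of size n.

    z = 1.96 corresponds to a 95% confidence interval; p = 0.5
    maximizes p*(1-p), giving the most conservative estimate.
    """
    return z * math.sqrt(p * (1 - p) / n)

# n = 1703 (the 2016 study) yields roughly +/-2.4%, in line with the
# study's quoted +/-2.36%; the 201-participant 2004 Collett study
# would carry a much wider margin (about +/-7%).
print(f"2016 study: +/-{margin_of_error(1703) * 100:.2f}%")
print(f"2004 study: +/-{margin_of_error(201) * 100:.2f}%")
```

This also makes concrete why per-question response rates matter: any question answered by fewer than all 1703 participants has a smaller effective n and hence a wider margin.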

Study Participants

This section provides background on the makeup of the study.

Figure 1 shows the percentage of overall study FPGA and ASIC/IC participants by market segment. It is important to note that this figure does not represent silicon volume by market segment.


Figure 1: FPGA and ASIC/IC study participants by market segment

Figure 2 shows the percentage of overall study eligible FPGA and ASIC/IC participants by job description. An example of an eligible participant would be a self-identified design or verification engineer, or engineering manager, who is actively working within the electronics industry. Overall, design and verification engineers accounted for 60 percent of the study participants.


Figure 2: FPGA and ASIC/IC study participants job title description

Before I start presenting the findings from our 2016 functional verification study, I plan to discuss in my next blog (click here) general bias concerns associated with all survey-based studies—and what we did to minimize these concerns.

Quick links to the 2016 Wilson Research Group Study results


8 February, 2016

Join Us at DVCon

As an annual conference, DVCon has set itself apart from others.  With its strong focus on the application of design and verification tools and technology, the venue is a prime location for the practicing engineer to exchange best practices and learn about current and emerging standards.  DVCon has also gone global to promote the local sharing of best practices and build a wider worldwide audience.  The flagship event, DVCon U.S., has grown into an event that brackets two days of paper and poster sessions with tutorials on emerging standards from Accellera and how-to practical information from producers and users of design automation technology.  Building knowledge, skill and proficiency that can be applied in one’s design and verification engineering profession is unique.  What can you learn?  How can you share?

The answer to those questions is simple: attend DVCon U.S.  What specifically might apply to your current engineering demands is best found by examining the conference program.  DVCon U.S. runs Monday – Thursday (29 February – 3 March 2016).  Monday is “Accellera Day” and focuses on emerging and popular standards, with content geared for both beginners and advanced users.  Tuesday and Wednesday will feature papers, panels, posters and keynotes.  The topics will take you from system level to gates, from design to verification, from hardware to software, and from portable stimulus to low power design.  These two days are organized as a more traditional technical conference with parallel tracks, complemented by sponsored lunches and an afternoon/evening exhibition where tool and services suppliers share their latest product offerings.  Wednesday concludes with the announcement of the best paper and poster awards.  But that is not the end of the conference.  Thursday is “Tutorial Day.”  Four parallel half-day tutorials will be presented on a variety of topics.  The Mentor Graphics team has sponsored two of the tutorials, one in the morning and one in the afternoon:

  • Tutorial 5: Advanced Validation and Functional Verification Techniques for Complex Low Power System-on-Chips
  • Tutorial 9: Back to Basics: Doing Formal the Right Way

Low Power Tutorial: New IEEE 1801 Standard

In this blog, I would like to focus on Tutorial 5, as it relates to one of the most daunting challenges today: low power design of SoCs.  This tutorial will explore how new constructs in IEEE Std. 1801™-2015 (UPF 3.0) can facilitate power modeling at high levels of abstraction and improve the application of the Successive Refinement methodology.  At the publication of this blog, the IEEE has not yet published the new standard; it was approved at the IEEE Standards Association meeting series in December 2015.  If you were a member of the IEEE ballot group or of the IEEE 1801 Working Group, you have a copy of the last draft of the standard and access to all the information you might need on the newly added constructs.  For everyone else, my recommendation is to attend DVCon U.S. and this tutorial to learn more, in advance of publication, from the experts who created the standard and the Successive Refinement methodology.

You can find full information about DVCon U.S. here, and to join us, register here.  If your time is really limited and you can’t make the full conference, the Exhibition runs into the evening (Monday – Wednesday) for those who are local and might want to visit after work.  Even better news: the Exhibits-only pass is free.  See you there!


19 January, 2016

[SPOILER ALERT] I suspect virtually all Verification Horizons blog readers have seen Star Wars: The Force Awakens by now, but be advised that this post has a minor spoiler.


In the latest installment of the venerable Star Wars series, upon command from another droid, the famous R2-D2 comes out of a multi-year hibernation to share some important data. Clearly, R2-D2’s designers needed to employ low power design and verification techniques much like those faced by systems and SoC designers here in our own galaxy. Indeed, in the Internet of Things (IoT) realm, there is a growing category of devices specified to last for at least 10 years without an external, wired power source or recharge (e.g., remote or hard-to-reach sensors). Hence, it is instructive to consider the challenges behind creating an astromech droid that can remain dormant for 15 or more years until the specified reactivation signal is detected.

* First, given the wide range of a droid’s operating environments, R2’s designers could not assume that the droid would have access to a power source after going into “dormant mode”.

* This requirement drives the selection of an energy supply that decays very slowly over a long time so when the system comes out of dormancy there will be enough power to boot back up and resume operation. Spacecraft designers in our galaxy favor radioisotope thermoelectric generators, and surprisingly “In the past, small ‘plutonium cells’ (very small 238Pu-powered RTGs) were used in implanted heart pacemakers to ensure a very long ‘battery life’. As of 2004, about 90 were still in use.” Hence, it is plausible to assume that R2-D2 is powered by something similar.

* To minimize the power consumed during dormancy, one can further assume R2’s circuits use power shutoff techniques familiar to designers here on Earth. Indeed, in order to endure a really, really long dormancy period, the droid must power down completely. Consequently, it is logical to surmise that the “front end” of R2-D2’s wireless interface is some sort of energy-harvesting, RFID-like receiver — when an authorized person/droid approaches R2-D2, he/she/it could send an energetic activation pulse that would literally jump-start this circuit, which would in turn activate the larger system.


R2-D2’s basic specs and accessories

* In addition to supporting long term dormancy, one can imagine R2-D2’s SoCs employ several other low power design techniques to support the wide range of R2’s accessories and sensors when the droid is in regular use. For example, R2 certainly must employ multiple supply voltages for its higher power features such as the head-rotation and wheel motors, holo projector, and its various manipulator arms. Other methods like clock gating and dynamic voltage frequency scaling surely come into play during “snooze” modes of operation when transiting from base to the battle space, with the system becoming completely awake when the host spacecraft’s pilot begins to approach the target area.

* Finally, verifying the low power behaviors of such a complex system would be substantially simplified by using the “successive refinement” methodology, with the low power architecture captured in the intergalactic IEEE 1801-2015 (a/k/a UPF) standard. Functional verification would proceed using testbenches that follow the Universal Verification Methodology (UVM), driving numerous power-aware simulations orchestrated by sophisticated verification management tools. These simulations would be complemented by exhaustive formal analysis of critical IPs and sub-systems, with the functional coverage results from simulation and formal merged into a single, industry-standard Unified Coverage DataBase (UCDB) by the aforementioned verification management tool. Additionally, R2-D2 must employ numerous independent clock domains for all of its internal circuitry and accessories; hence, power-aware clock-domain crossing (CDC) verification was undoubtedly a design sign-off criterion for R2’s designers.

Do you agree?  What other low power design & verification techniques are required for such long lived systems here on Earth, in our solar system, or in a galaxy far, far away? Please share your thoughts in the comments below, or contact me offline.

Until next time, may the force be with you!

Joe Hupcey III
for the Questa Low Power Design & Verification product team


7 January, 2016

First, if you were brought here by a desperate Google search for “timing closure tricks STA RTL” as your tape out deadline looms, I can empathize — working to achieve timing closure on a deadline is truly a high pressure, thankless job. Even worse from this standpoint, the addition of vital low power design techniques could unintentionally be the reason why your cycle of analyses and fixes are still not converging — let me explain …

A staple of low power design is to break a circuit into multiple power domains so that dynamic power consumption can be managed, and ultimately reduced. On command, these domains can be electrically isolated from each other so they can be safely shut off, run at a lower voltage, or even run at a reduced clock frequency. However, this partitioning – as described by a UPF file and rendered over your design’s RTL – can wreak havoc with your clock and reset networks. Specifically, when clock and reset signals cross power domains they are no longer synchronous, because of the level shifters or isolation cells that have been inserted at the domain interfaces. This can unintentionally give rise to chip-killing clock domain crossing (CDC) bugs that would not otherwise have existed.


Example of how an isolation cell can introduce a CDC path between synchronous registers

Additionally, the low power control signals are often aligned to an independent clock that’s part of the low power controller IP – a clock that’s independent of the other clocks in the design. Again, this is a recipe for CDC disaster whether you have a small IoT-related design or a SoC with 100’s of IPs.

Finally, CDC errors often look like static timing problems (if you are not already doing a CDC analysis) – assuming you are lucky enough to see hints of these issues with gate-level simulations before a silicon prototype is minted.

Now for some good news: there are “power-aware CDC” methodologies and solutions that can get you out of this mess, such as the “successive refinement” features in the IEEE 1801 low power standard. In a nutshell, these capabilities and the related methodology allow designers to begin the design and verification of power distribution networks at the start of the design cycle, and then continue to refine the power networks throughout the project as the circuit implementation comes to life. Alternatively, as in the “tapeout deadline is looming” case, if you are late in the design and implementation cycle, the methodology suggests how to execute a “power-aware” CDC analysis. Specifically, designers can run CDC verification for the power distribution networks at the RTL level, and thus avoid dealing with CDC errors at the gate level, where they are a real pain to analyze. You can find a full article on this topic by CDC and low power methodology expert Kurt Takara at https://goo.gl/dX8Teb

I trust this information will help you get to the real source of your errors, and help you reach your verification goals faster!

Until next time, may all your clock domains be synchronized, and your reset signaling be properly buffered,

Joe Hupcey III
on behalf of the Questa Formal and CDC team

P.S. Kurt’s paper was originally presented at DVCon 2015, March 2015, in San Jose.  If you want to get a feel for it, here is a 4 minute video of Kurt walking through the highlights:  https://youtu.be/VCyhma3DK4g


7 October, 2015

Design and verification flows are multifaceted, and predominantly built by bringing together tools and technology from multiple sources.  The tools from these sources build upon IEEE standards – several IEEE standards.  What started with VHDL (IEEE 1076™) and Verilog/SystemVerilog (IEEE 1800™) and their documented interfaces has grown.  As more IEEE standards emerged, and as tools and technology combined these standards in innovative and differentiated ways, the industry has come to benefit from an ongoing open and public discussion on interoperability.  The IEEE Standards Association (IEEE-SA) continues this tradition, started by my friends at Synopsys, with the IEEE-SA EDA & IP Interoperability Symposium.  And for 2015, I’m pleased to chair the event.

Anyone working on or using design and verification flows that depend on tool interoperability as well as design and verification intellectual property (IP) working together will benefit from attending this symposium.  The symposium will be held Wednesday, 14 October 2015, at the offices of Cadence Design Systems in San Jose, CA USA.  You can find more information about the event at the links below:

  • Register: Click here.
  • Event Information: Click here.
  • Event Program: Click here.

A keynote presentation by Dan Armbrust, CEO of Silicon Catalyst, opens the event with a talk on Realizing the next growth wave for semiconductors – A new approach to enable innovative startups.  If you are one of the Silicon Valley innovators, you might like to hear what Dan shares on this next growth wave.  From my perspective, I suspect it will include being more energy conscious in how we design.  The work on current and emerging IEEE standards that address those energy concerns will follow: we will review the conclusions from the DAC Low Power Workshop, and leadership from the IEEE low power standards groups will discuss what they are doing in the context of those findings.

We then take a lunch break and celebrate 10 years of SystemVerilog.  The first IEEE SystemVerilog standard (IEEE Std. 1800™-2005) was published in November 2005, so it seems fitting that we celebrate this accomplishment.  Joining many of the participants in the IEEE SystemVerilog standardization effort for this celebration will be participants from the Accellera group that incubated it before it became an IEEE standard.  We won’t stop with just celebrating SystemVerilog: we will also share information on standards projects that have leveraged it, like UVM, which has recently become a full-fledged IEEE standards project (IEEE P1800.2).  And since so many people who have worked on completed and successful IEEE standards will be in the room, Accellera offered to bring its Portable Stimulus Working Group members over for the lunch break, in the middle of their 3-day face-to-face Silicon Valley meeting, to mingle with them, learn from them, and hopefully be inspired by them as well.  Maybe some of the lessons of building industry-relevant standards can be shared between the SystemVerilog participants and Accellera’s newer teams.

We then return to energy-related issues, with our first topic area being power modeling for IP.  Chris Rowen, Cadence Fellow, will take us through some recent experiences on issues his teams have faced driving even higher levels of power efficiency from designs using ever more design IP.  Tales from the trenches never get old, and they offer insight on what we might do in the development of better standards to address those issues.  While Chris will point to a lot of issues with the use of design IP, I believe these issues are only compounded when it comes to the Internet of Things (IoT).  We have assembled a great afternoon panel to discuss whether IoT is the “ultimate power challenge.”  I can’t wait to hear what they say.

Lastly, when we pull all these systems together, LSI-package-board issues pose a design interoperability challenge as well.  The IEEE Computer Society’s (CS) Design Automation Standards Committee (DASC) has completed another standard developed primarily outside of North America.  The DASC has a long history of global participation and of significant standards development outside of North America, as is the case for VHDL-AMS (IEEE 1076.1).  We will hear from the IEEE 2401™-2015 leadership on their newly minted IEEE standard and the LSI-package-board issues it addresses.

We don’t have time to highlight all the EDA & IP standards work in the IEEE, but our principal theme – addressing issues of power in modern design and verification – led us to focus on a subset of them.  So, if your favorite standard or topic area does not appear in the program, let me know and we can add it to the list to consider for next year.  And when I say “we,” the work to put together an event like this takes a lot of people, all of whom are interested in what we should do next year and in your input.  For me, in addition to collecting that input, I need to thank those who did the work to make this happen.  I’ve often said that as chair, you let the others do all the work.  It has been great to collaborate with my IEEE-SA friends and my peers at the other two of the Big-3 EDA companies.  It has also been great to get input and advice on the Steering Committee from two of the world’s largest silicon suppliers (Intel and TSMC), and to include, for the first time, support from the standards incubators Accellera Systems Initiative and Si2.


10 August, 2015

ASIC/IC Power Trends

This blog is a continuation of a series of blogs related to the 2014 Wilson Research Group Functional Verification Study (click here).  In my previous blog (click here), I presented our study findings on various verification language and library adoption trends. In this blog, I focus on power trends.

Today, about 73 percent of design projects actively manage power, using a wide variety of techniques ranging from simple clock gating to complex hypervisor/OS-controlled power management schemes. What is interesting from our 2014 study is that the data indicate a 19 percent increase over the last two years in the number of designs that actively manage power (see Figure 1).


Figure 1. ASIC/IC projects working on designs that actively manage power

Figure 2 shows the various aspects of power management that design projects must verify (for the 73 percent of design projects that actively manage power). The data from our study suggest that many projects are moving to more complex power-management schemes that involve software control. This adds a new layer of complexity to a project’s verification challenge, since these more complex power-management schemes often require emulation to fully verify.


Figure 2. Aspects of power-managed design that are verified

Since power intent cannot be directly described in an RTL model, alternative supporting notations have emerged to capture it. In the 2014 study, we wanted to get a sense of where the industry stands in adopting these various notations. For projects that actively manage power, Figure 3 shows the various standards that have been adopted to describe power intent. Some projects use multiple standards (such as different versions of UPF, or a combination of CPF and UPF), which is why the adoption results do not sum to 100 percent.


Figure 3. Notation used to describe power intent
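For readers new to these notations, a minimal power intent file looks something like the following hypothetical UPF sketch (the domain and pad names are invented). Note that none of it lives in the RTL; it is a separate file read alongside the HDL by power-aware tools:

```tcl
# Hypothetical minimal power intent -- names are illustrative only.
create_power_domain PD_top

# Top-level supply pads (ports) and the nets they drive.
create_supply_port VDD
create_supply_port VSS
create_supply_net  VDD -domain PD_top
create_supply_net  VSS -domain PD_top

# Tie each supply net to its pad to complete the power network.
connect_supply_net VDD -ports {VDD}
connect_supply_net VSS -ports {VSS}
```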

In an earlier blog in this series, I provided data that suggest a significant amount of effort is being applied to ASIC/IC functional verification. An important question the various studies have tried to answer is whether this increasing effort is paying off. In my next blog (click here), I present verification results findings in terms of schedules, number of required spins, and classification of functional bugs.

Quick links to the 2014 Wilson Research Group Study results

