Archive for June, 2012

13 June, 2012

Well, of course you do – most engineers are pretty good people.  Actually, as much as I’d like to brag about acing my ethics training course (one of those fun corporate things we do every year), that wasn’t really what I was referring to…

Integrity in engineering means that something is as it is supposed to be.  Kind of like structural integrity – there are no holes in it.  For signal integrity, it means that the 1s and 0s you sent from the driver are the same 1s and 0s at the receiver.  For power integrity, it means that the volts and amps you sent to the IC pretty much all got there, for all frequencies of interest – “from DC to daylight.”
I talk more about what this means in my recent article in Electronic Design:
http://electronicdesign.com/article/eda/whats-difference-signal-integrity-power-integrity-73842

What happens if you don’t have integrity?  Well, in buildings, anyway, it can all come crashing down.  Printed circuit boards aren’t much different… crosstalk, loss, and impedance mismatches are all like termites eating away at your signal integrity.  Using an analysis tool like HyperLynx can help keep your 1s and 0s safe.  Simulating to understand design margins ensures that you are designing a strong PCB that will stand the test of time, just like a good building.


12 June, 2012

Impedance is an important concept in many different realms of engineering.
We often see it in everyday life, especially if you’ve ever hooked up a home entertainment system.  From 8-ohm speakers to 75-ohm coaxial cable, the right impedance is crucial to watching things explode on your TV and making sure they sound good too.

Simply stated, impedance describes a relationship between voltage and current.  For a resistor, that is a pretty simple relationship: V = IR.  For a transmission line, however, the relationship is a bit more complicated, since energy travels in the fields between the signal and return paths, usually a trace and a plane.  The characteristic impedance of a transmission line must be calculated using a field solver, and serves as the basis for signal integrity analysis.  For a signal, trace impedance is targeted to “match” the driver and receiver impedance.
For power, impedance should always be at a minimum.  For DC power delivery, that means low resistance, or as much metal as possible (planes, vias, traces).  For AC power delivery, that means very low-inductance connections to a large number and range of decoupling capacitors.  This is one of the fundamental differences between signal integrity and power integrity analysis. 
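
As a rough sketch of what a trace-impedance calculation looks like, here is the well-known IPC-2141 closed-form approximation for surface microstrip (the geometry numbers are purely illustrative, and a 2D field solver remains the accurate way to get this number):

```python
import math

def microstrip_z0(h_mil, w_mil, t_mil, er):
    """Approximate characteristic impedance (ohms) of a surface microstrip
    using the classic IPC-2141 closed-form formula. Valid only over a
    limited range of geometries; use a field solver for real designs."""
    return (87.0 / math.sqrt(er + 1.41)) * math.log(5.98 * h_mil / (0.8 * w_mil + t_mil))

# Illustrative geometry: 5 mil trace, 1.4 mil (1 oz) copper,
# 3 mil above the plane, FR-4 dielectric (er ~ 4.3)
z0 = microstrip_z0(h_mil=3.0, w_mil=5.0, t_mil=1.4, er=4.3)   # roughly 44 ohms
```

This is how designers land near the common 50-ohm single-ended target: adjust trace width and dielectric height until the solver reports the impedance that matches the driver and receiver.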
Check out the article below to find out more:
http://electronicdesign.com/article/eda/whats-difference-signal-integrity-power-integrity-73842


11 June, 2012

In my previous blog, I talked about how a printed circuit board is nothing more than a path for signals and power to travel to and from ICs.  For a long time, the path was “short” enough to not even matter.  Then signals became fast enough that the board became a significant part of the circuit, and the realm of analysis known as “signal integrity” was born.  Really, signal integrity is just the analysis of the analog characteristics of digital buses.  Which is a little funny, since many “analog” simulations will ignore the board characteristics (although that trend is changing, as the speeds of “analog” buses are ever-increasing).

But the other aspect of design spawning the need for new analysis is the size of the ICs.  There are so many power-hungry transistors on the ICs nowadays that you need to analyze the power feeding them as well.  Just getting the appropriate DC voltage to these ICs is a challenge, as I discussed last week.  But the AC aspect is also a challenge; decoupling analysis can be pretty tricky.  However, the consequence of ignoring these problems is design failure.

I talk more about the unique aspects of these different kinds of analysis, and compare and contrast them, in my recent article in Electronic Design magazine:
http://electronicdesign.com/article/eda/whats-difference-signal-integrity-power-integrity-73842
Take a peek…


6 June, 2012

What?  Isn’t that backwards?  Technically, yes.  The board is merely a pathway through which ICs talk to each other, and receive power to do so.  However, if the power distribution network (PDN) of the board is inadequately designed, it can actually heat up the ICs.

ICs are supposed to be the main source of heat on a PCB.  Heat is conducted from the ICs to the board through their pins.  Sometimes a metal slug or thermal glue is placed at the base of the component to enhance this effect.  This method of cooling is sufficient for most components.  (The really power-hungry components will also require a heat sink to help cool them.)  So this means that the hottest locations on the board are right beneath the components.  What is beneath the component?  In the case of a BGA, there is usually a dense pinfield.  This pinfield creates a web of copper that chokes the current feeding the IC.  The current has to find its way through the narrow pathways created by the “Swiss cheese effect” of all the pinfield antipads in the power and ground planes.  These areas of high current density dissipate power as heat, which means a drop in voltage for the IC as well as additional heat in that area of the board.
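
To put rough numbers on the effect, here is a back-of-the-envelope sketch of the resistance, voltage drop, and heat generated in one narrow copper “neck” between antipads.  All dimensions and currents below are illustrative assumptions, not values from any particular design:

```python
# Rough DC estimate for a constricted copper path in a plane.
RHO_CU = 1.68e-8   # copper resistivity at 20 degC, ohm*m

def neck_resistance(length_m, width_m, thickness_m):
    """DC resistance of a rectangular copper strip: R = rho * L / (w * t)."""
    return RHO_CU * length_m / (width_m * thickness_m)

# Assumed neck: 2 mm long, 0.3 mm wide, in a 35 um (1 oz) plane, carrying 3 A
r = neck_resistance(2e-3, 0.3e-3, 35e-6)   # ~3.2 milliohms
v_drop = 3.0 * r                           # V = I*R, voltage lost to the IC
p_heat = 3.0**2 * r                        # P = I^2*R, dissipated as heat
```

A few milliohms sounds negligible, but dozens of such necks in series and in parallel under a large BGA add up to real voltage droop and real local heating.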

This article discusses the problem in more detail:
http://pcdandf.com/cms/component/content/article/171-current-issue/9056-current-control

Don’t heat up your ICs even more by designing an inadequate PDN. 
How do you avoid that?  Simulate it!


5 June, 2012

For over 50 years, designers have been calculating current-carrying capacity on PCBs using charts created for the Navy in 1956 (specifically, NBS (National Bureau of Standards) Report #4283, “Characterization of Metal-Insulator Laminates,” D.S. Hoynes, May 1, 1956, commissioned by the Navy Bureau of Ships).  These charts are still in use today; many designers have probably seen them in the IPC-2221 spec.  The problem with using these charts is that they are based upon fixed-width conductors in a specific set of environmental conditions.  The newly released standard pertaining to current-carrying capacity, IPC-2152, has expanded the number of charts to include different scenarios.  However, even the IPC-2152 spec advocates simulation as the most accurate means of predicting current-carrying capacity.

The issue has to do with a number of factors, including the co-dependence of current and temperature.  As the current through a conductor increases, so does the temperature in that conductor.  Since temperature affects conductivity, it will also affect the current through a given conductor size.  This co-dependence spawns the need for co-simulation between power integrity and thermal analysis.  Such analysis can also take into account the complicated, non-uniform shapes that are used to carry current in most modern PCBs, including the non-uniformity of current distribution as well as temperature distribution.
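
That co-dependence can be illustrated with a toy fixed-point loop: resistance rises with temperature, dissipated power rises with resistance, and temperature rises with dissipated power, until the two settle.  The resistance, thermal-resistance, and current values below are assumptions chosen for illustration, not a real thermal model:

```python
# Toy illustration of current/temperature co-dependence in a conductor.
ALPHA = 0.004   # copper resistivity temperature coefficient, per degC
R20 = 0.010     # assumed conductor resistance at 20 degC, ohms
K_TH = 50.0     # assumed thermal resistance to ambient, degC per watt
I = 5.0         # assumed drive current, amps

temp = 20.0     # start at ambient
for _ in range(100):
    r = R20 * (1 + ALPHA * (temp - 20.0))   # resistance grows with temperature
    p = I * I * r                           # dissipated power heats the conductor
    temp_new = 20.0 + K_TH * p              # steady-state temperature for that power
    if abs(temp_new - temp) < 1e-6:
        break                               # converged: self-consistent solution
    temp = temp_new
```

A real PI/thermal co-simulation does the same thing, except the “resistance” is a full non-uniform current-density solve across plane shapes and the “thermal resistance” is a full conduction/convection solve across the board.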

To find out more, take a look at my recent article on the subject in Printed Circuit Design and Manufacture Magazine:
http://pcdandf.com/cms/component/content/article/171-current-issue/9056-current-control


4 June, 2012

Simulation is a way of predicting reality.  The more information we put into the simulation, the better our prediction of what is really going to happen.  Certain aspects of electrical simulation, like signal integrity, can be simulated relatively “independently” of other influencing forces.  Sure, there are some temperature dependencies in silicon behavior, and those are typically represented by IBIS models created at different temperatures for fast, typical, and slow silicon.  Vias also tend to complicate things, as they blur the line between traditional signal integrity, power integrity, and 3D electromagnetic simulation.  However, no two disciplines are more closely related than thermal analysis and power integrity, more specifically DC drop.

The voltage drop across a plane is determined by the conductivity of the copper.  Copper conductivity changes about 4% for every 10 degC of temperature change, which equates to a 32% change for an 80 degC temperature rise, pretty significant.  So, in order to find an accurate measure of the voltage drop, temperature needs to be included.  The other interesting aspect is that drops in voltage mean power is being dissipated by the plane, and that power is dissipated as heat.  So the results of each of these analyses affect one another.  Hence the need for PI/thermal co-simulation.
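
A quick check of that arithmetic, using copper’s resistivity temperature coefficient of roughly 0.4% per degC (the source of the 4%-per-10-degC rule of thumb):

```python
# Linearized change in copper resistivity with temperature.
ALPHA = 0.004            # fractional change per degC (~0.4%/degC for copper)
rise = 80.0              # temperature rise in degC
change = ALPHA * rise    # fractional change in resistivity, ~0.32 (a 32% increase)
```

So a plane that runs 80 degC above ambient carries roughly a third more resistance than the room-temperature copper model predicts, and the DC-drop numbers shift accordingly.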

You can read more about this in my recent article in Printed Circuit Design and Manufacturing Magazine:
http://pcdandf.com/cms/component/content/article/171-current-issue/9056-current-control

