Simulation Experiments (Part 2)

In my last blog post, Simulation Experiments (Part 1), I introduced the topic of creating and running simulation experiments. As I mentioned, experiments are a way to ask system performance questions and get corresponding answers – nothing new here, since system modeling and simulation were developed specifically for this purpose. What is new, however, are the methodologies developed to ask the questions, calculate the answers, and analyze the results.

Defining an experiment starts with deciding what you want to know about your system – in other words, the questions you want to ask. For systems, you usually want to quantify one or more performance metrics such as gain, risetime, overshoot, or delay. With questions in hand, the next step is determining which design parameters contribute most to performance variations. You might argue that a key reason for running design experiments is figuring out which parameters to tweak in order to get a desired performance – and you can certainly create experiments to do just that. Knowing the list of key design parameters ahead of time, however, makes your experiments more efficient. So how do you determine which parameters most affect a performance metric? Two words: sensitivity analysis (see my blog post “How Sensitive is Your System?”). Knowing the performance metrics you want to focus on, and the parameters that most affect them, defines both the analysis and the measurements you need to run. You are now ready to set up your experiments.
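To make the sensitivity idea concrete, here is a minimal sketch – plain Python with made-up numbers, not SystemVision output – that estimates normalized sensitivities of a step-response overshoot metric to the parameters of a simple mass-spring-damper model using central differences. The parameter names, nominal values, and the overshoot formula are illustrative assumptions, not anything from the tool.

```python
import math

def overshoot(m, c, k):
    """Percent overshoot of a 2nd-order mass-spring-damper step response."""
    zeta = c / (2.0 * math.sqrt(k * m))              # damping ratio
    if zeta >= 1.0:
        return 0.0                                   # critically/overdamped: no overshoot
    return 100.0 * math.exp(-math.pi * zeta / math.sqrt(1.0 - zeta ** 2))

def normalized_sensitivity(metric, params, name, rel_step=0.01):
    """Central-difference estimate of (dM/dp) * (p0/M0) for one parameter."""
    base = metric(**params)
    p0 = params[name]
    hi = metric(**dict(params, **{name: p0 * (1 + rel_step)}))
    lo = metric(**dict(params, **{name: p0 * (1 - rel_step)}))
    dM_dp = (hi - lo) / (2.0 * p0 * rel_step)
    return dM_dp * p0 / base

nominal = {"m": 0.5, "c": 2.0, "k": 400.0}           # kg, N*s/m, N/m (made-up values)
for p in nominal:
    print(f"overshoot sensitivity to {p}: {normalized_sensitivity(overshoot, nominal, p):+.3f}")
```

In a real flow the overshoot function would be replaced by a simulation run plus a waveform measurement; the point is simply that the parameters with the largest normalized sensitivities are the ones worth carrying into the experiment.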

Experiment Manager, SystemVision’s application for defining and running simulation experiments, is a GUI-driven integration connecting SystemVision and Microsoft Excel. Through this integration, Excel can access the SystemVision design database and send simulation and analysis commands to the simulator. After starting Experiment Manager from the SystemVision menus and naming your experiment, you define the experiment in four easy steps: [1] select the parameters you want to control during the experiment – each parameter defines a unique column in the experiment spreadsheet; [2] choose the analysis command file, which controls the analysis type and parameters; [3] select your performance measurements, including any minimum and maximum limits – each measurement defines a unique spreadsheet column; [4] set parameter values in the experiment spreadsheet. Once your experiment is defined, simply click the Simulate button in Excel to run it. Simulation and analysis commands are sent to SystemVision; experiment results are displayed graphically in the SystemVision waveform analyzer and in tabular form in Excel. In the waveform analyzer you can further analyze experiment results using both graphical measurements and transforms. In Excel you can use the built-in number-crunching capabilities to further analyze the numeric data.
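To illustrate the shape of such an experiment – one row per combination of parameter values, one column per controlled parameter and per limited measurement – here is a small, self-contained Python sketch. It does not use Experiment Manager or any SystemVision API; the simulate_and_measure stub, parameter names, values, and limits are all hypothetical stand-ins for the simulator call and the measurements.

```python
import csv
import itertools

# Hypothetical stand-in for a simulation run plus waveform measurements;
# in the real flow this is where the simulator would be invoked.
def simulate_and_measure(r_sense, gain):
    risetime = 0.8e-3 / gain + r_sense * 1.0e-4      # placeholder response model
    overshoot = 5.0 * gain - r_sense                 # placeholder response model
    return {"risetime_s": risetime, "overshoot_pct": overshoot}

# Step 1: parameters to control -- each one becomes a spreadsheet column.
sweep = {"r_sense": [0.5, 1.0, 2.0], "gain": [2.0, 4.0]}

# Step 3: measurements with optional (min, max) limits -- each one becomes a column.
limits = {"risetime_s": (None, 1.0e-3), "overshoot_pct": (None, 10.0)}

# Step 4: one row per combination of parameter values.
with open("experiment.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(list(sweep) + list(limits))
    for values in itertools.product(*sweep.values()):
        results = simulate_and_measure(**dict(zip(sweep, values)))
        row = list(values)
        for name, (lo, hi) in limits.items():
            ok = (lo is None or results[name] >= lo) and (hi is None or results[name] <= hi)
            row.append(f"{results[name]:.4g} ({'pass' if ok else 'fail'})")
        writer.writerow(row)
```

Opening experiment.csv gives a rough analog of the tabular results view described above, with each measurement checked against its minimum and maximum limits.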

Experiment Manager simplifies the creation, execution, and management of simulation experiments. In my next blog post I will show you an example of using Experiment Manager to define and run experiments on a mechatronic system.
