Simulation Experiments (Part 3)

It’s time to continue my short series on simulation experiments. In Part 1 and Part 2 I discussed the basics of simulation experiments and general features of Experiment Manager, the SystemVision application for simplifying, executing, and managing simulation experiments. In this post I’ll use a simple motor driver example to illustrate the basics of Experiment Manager operation. Here is the circuit:

[Figure: motor driver circuit schematic]

Setting up and running an experiment requires two types of system information: a list of parameters whose values can change during the experiment, and one or more saved measurements to execute for each experiment run. For my motor driver system, I want to look at two performance metrics related to the position of the motor shaft: the risetime as the shaft moves to its final position, and the shaft’s maximum position. For system parameters, I will focus on the motor’s winding resistance, winding inductance, torque constant, damping coefficient, and moment of inertia.
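To make the two ingredients concrete, here is a minimal sketch of an experiment definition as plain data. This is illustrative only, not SystemVision’s API; the parameter values are made-up placeholders, not real motor data.

```python
# Hypothetical sketch: an experiment pairs the parameters that may vary
# with the saved measurements recorded for each run.
from dataclasses import dataclass

@dataclass
class Experiment:
    parameters: dict     # parameter name -> nominal value
    measurements: list   # names of saved measurements to evaluate per run

motor_experiment = Experiment(
    parameters={
        "winding_resistance": 1.5,    # ohms (illustrative value)
        "winding_inductance": 2e-3,   # henries
        "torque_constant": 0.05,      # N*m/A
        "damping_coefficient": 1e-4,  # N*m*s/rad
        "moment_of_inertia": 5e-5,    # kg*m^2
    },
    measurements=["risetime", "max_position"],
)
```

Keeping the definition as data like this is what makes the later spreadsheet view natural: parameters become input columns, measurements become output columns.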

To determine how much effect each motor parameter has on my measurements, I ran a simple sensitivity analysis. Based on my analysis setup, the risetime is most affected by the motor’s winding resistance and damping coefficient, and the damping coefficient has the largest effect on the shaft’s final position. So I set up my experiment to adjust values for the winding resistance and damping coefficient while measuring risetime and position.
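The idea behind a sensitivity analysis can be sketched as a one-at-a-time parameter sweep: perturb each parameter slightly and see how much a measurement moves. This is a generic illustration, not SystemVision’s built-in analysis, and `toy_simulate` is a made-up stand-in for a real circuit simulation.

```python
# One-at-a-time sensitivity: relative change in a measurement per
# relative change in each parameter (a generic sketch, not a tool API).
def sensitivity(simulate, params, measure, delta=0.01):
    """Return {param_name: normalized sensitivity of `measure`}."""
    base = simulate(params)[measure]
    result = {}
    for name, value in params.items():
        perturbed = dict(params)
        perturbed[name] = value * (1 + delta)   # nudge one parameter
        result[name] = (simulate(perturbed)[measure] - base) / (base * delta)
    return result

def toy_simulate(p):
    # Pretend risetime grows linearly with resistance and with damping.
    return {"risetime": 1e-3 * p["R"] * (1 + 10 * p["damping"])}

s = sensitivity(toy_simulate, {"R": 1.5, "damping": 0.01}, "risetime")
# In this toy model s["R"] exceeds s["damping"], so resistance dominates.
```

The parameters with the largest normalized sensitivities are the ones worth sweeping in the experiment; the rest can be held at nominal values.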

Experiment Manager’s GUI-driven integration with Microsoft Excel allows me to quickly set up the experiment spreadsheet. The spreadsheet contains a single column for each parameter and measurement in my experiment. Once defined, here is what the spreadsheet looks like:

[Figure: Experiment Manager spreadsheet with one column per parameter and measurement]

With the parameter and measurement columns defined, each row in the spreadsheet becomes an individual experiment. I define the values in the parameter columns, and SystemVision fills in the measurement columns with calculated information from the experiment. Here is what my spreadsheet looks like after running a series of experiments:
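The row-per-run idea can be sketched in a few lines: parameter columns are the inputs I fill in, and measurement columns are appended from each simulation. Again this is a toy stand-in, not how SystemVision talks to Excel; `toy_simulate` and the column names are invented for illustration.

```python
# Each row is one experiment run: inputs in, measured results appended.
def run_experiments(rows, simulate):
    for row in rows:
        row.update(simulate(row))   # fill in the measurement columns
    return rows

def toy_simulate(row):
    # Invented behavior: risetime grows with resistance,
    # final position shrinks as damping increases.
    return {"risetime": 0.01 * row["resistance"],
            "position": 1.0 / (1.0 + row["damping"])}

table = run_experiments(
    [{"resistance": 1.0, "damping": 0.1},
     {"resistance": 2.0, "damping": 0.2}],
    toy_simulate,
)
```

Each dictionary plays the role of one spreadsheet row: after the run it holds both the parameter values I chose and the measurements the simulator computed.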

[Figure: experiment spreadsheet after a series of runs, with measurement columns filled in and pass/fail cells shaded]

Note the cell shading in the measurement columns. Experiment Manager allows me to set minimum and maximum limits for measurements, and then flags the measured data by coloring each cell based on whether the measurement falls within, or outside of, those limits (e.g. in the above example, green indicates a passed measurement, orange a failed measurement). Now I have two sets of complementary data: waveforms in SystemVision which I can further analyze graphically, and numeric data in Excel that I can analyze using spreadsheet functions.
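The limit-checking behind that shading reduces to a simple range test per measurement. A minimal sketch, with invented limit values and the green/orange convention borrowed from the screenshot above:

```python
# Pass/fail flagging: a measurement cell is "green" when it lies within
# its [min, max] limits, "orange" otherwise (illustrative, not tool code).
def flag(value, lo, hi):
    return "green" if lo <= value <= hi else "orange"

# Hypothetical limits for the two measurements in this experiment.
limits = {"risetime": (0.0, 0.015), "position": (0.8, 1.1)}

def flag_row(row):
    """Return a color per measurement column for one spreadsheet row."""
    return {m: flag(row[m], lo, hi) for m, (lo, hi) in limits.items()}

flags = flag_row({"risetime": 0.012, "position": 0.95})
```

Applying `flag_row` across every row gives the same at-a-glance pass/fail picture the shaded spreadsheet provides.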

System experiments are an essential part of designing and verifying a mechatronic system design. But setting up experiments often requires significant time, particularly if it’s a manual process. Tools like Experiment Manager speed the experiment definition process and help design teams automate experiment setup and execution to make design flows more efficient and improve system quality.

Posted May 9th, 2011
