
I have a BLDC motor drive circuit, and a few things have been difficult to figure out or calculate: MOSFET switching speed, MOSFET switching losses, voltage spikes coming from the BLDC motor, and so on. I have been searching for a while, and to my knowledge there is no single equation or method that lets you calculate values like these exactly; the best you can do is estimate them.

I am wondering: if I put a SPICE model of my MOSFET gate driver, the MOSFETs, and the motor parameters into a simulator such as LTspice, how accurate can the simulation really be in estimating the values listed above? Can I actually trust the values LTspice gives me for a circuit as complicated as this? Or is it easier and simpler (but more expensive) to build the physical circuit and probe it with an oscilloscope?

  • An electric motor is a complicated system. You can probably make (or find online) a model, but it will be a large effort to get to the point where you feel comfortable it is an accurate representation of your target motor. This is an example where the time spent in the simulator might be better spent in the lab. – Commented Jun 24 at 14:08
  • @DaveTweed's answer is saying that in some cases models can be better than reality :-). I say that somewhat flippantly, but also in deadly earnest. Reality will vary with multiple imprecise parameters. Some objects are more precisely generalised than others (e.g. a 1k zero-inductance resistor); motors are among the less precisely modelable items. – Commented Jun 25 at 7:43

6 Answers

Answer (score 11)

Spice is basically as accurate as the device models it uses. The difficult part is likely the motor, as you need to model the dynamic electrical effects of its load. MOSFET Spice models from the device vendor are usually good enough to give reasonable accuracy for switching speed and switching losses.
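As a sanity check on whatever the simulator reports for losses, the classic first-order hand formulas for a hard-switched leg are easy to evaluate. A rough sketch in Python — all component values are purely illustrative, not from any real datasheet:

```python
# First-order MOSFET loss estimate for a hard-switched leg.
# All numbers below are illustrative placeholders.

def mosfet_losses(v_bus, i_load, f_sw, t_rise, t_fall, r_ds_on, duty):
    """Return (switching_loss_W, conduction_loss_W, total_W)."""
    # Energy lost per switching cycle ~ 1/2 * V * I * (rise + fall time)
    e_sw = 0.5 * v_bus * i_load * (t_rise + t_fall)
    p_sw = e_sw * f_sw                       # W, averaged over a cycle
    p_cond = i_load ** 2 * r_ds_on * duty    # W, I^2*R while conducting
    return p_sw, p_cond, p_sw + p_cond

p_sw, p_cond, p_tot = mosfet_losses(
    v_bus=24.0, i_load=10.0, f_sw=20e3,
    t_rise=50e-9, t_fall=50e-9, r_ds_on=10e-3, duty=0.5)
print(f"switching: {p_sw:.3f} W, conduction: {p_cond:.3f} W, total: {p_tot:.3f} W")
```

If the simulated losses are wildly different from this kind of estimate, suspect either the model or the simulation settings before trusting the number.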

Of course, once you have a simulation that gives you the desired results, you still need to breadboard the actual circuit to see whether it works as designed.

Answer (score 8)

There are a few factors that contribute to (in)accuracy.

  • The device models themselves: unless you are a modelling expert, you'll probably be using models supplied by the manufacturers. Usually they're pretty good, at least for the main intended applications. They may fail to model some behaviors in the interest of speed or simplicity.

  • The circuit diagram you feed into the simulation: if you don't model parasitic effects, or don't model them accurately (and they matter), you can get inaccuracies from that. Parasitics become more important as frequencies get higher (and also when di/dt or dv/dt is high at relatively low frequencies).

  • A simple simulation will have the devices all at the default temperature. Typically you'll run simulations at temperature extremes too. However the devices in a power circuit are typically not all at the same temperature. The MOSFETs may run much hotter and therefore have much higher Rds(on). Especially in analog circuits this can cause secondary effects (such as distortion due to temperature changes of transistors self-heating within a waveform) that are hard to model accurately.

  • The simulation settings: accurately simulating switching power supplies is a bit more challenging than some other kinds of simulation, because you're looking for nanosecond effects while operating at a relatively low frequency. That means a lot of computation. I believe Mike Engelhardt had optimizing that behavior as a key goal when he created LTspice for LTC, and it involves some smarts. Sometimes you have to override the settings to get an excellent result.
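The self-heating point above can be sketched numerically: Rds(on) rises with junction temperature, which raises dissipation, which raises temperature again, until the loop settles. A rough Python iteration with illustrative numbers (a linear temperature coefficient and a fixed junction-to-ambient thermal resistance are assumed):

```python
# Rough electro-thermal iteration: Rds(on) rises with junction temperature,
# which raises dissipation, which raises temperature again.
# Temperature coefficient and thermal resistance are illustrative values.

def settle_junction_temp(i_rms, r_ds_25, tc_per_c, r_th_ja, t_ambient,
                         max_iter=100, tol=1e-6):
    """Iterate P = I^2 * R(Tj), Tj = Ta + P * Rth until it converges."""
    t_j = t_ambient
    for _ in range(max_iter):
        r_on = r_ds_25 * (1.0 + tc_per_c * (t_j - 25.0))  # linear R(T) model
        p = i_rms ** 2 * r_on
        t_new = t_ambient + p * r_th_ja
        if abs(t_new - t_j) < tol:
            return t_new, r_on, p
        t_j = t_new
    return t_j, r_on, p

t_j, r_on, p = settle_junction_temp(
    i_rms=10.0, r_ds_25=10e-3, tc_per_c=0.008, r_th_ja=40.0, t_ambient=25.0)
print(f"Tj ~ {t_j:.1f} C, Rds(on) ~ {r_on*1e3:.2f} mOhm, P ~ {p:.2f} W")
```

Note that the settled Rds(on) is well above the 25 °C datasheet value — a default-temperature simulation would miss that entirely.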


Tinkering with a breadboard and tinkering with a simulation are similar in one way: you're typically dealing with a single instance of each device (and similar simulated devices tend to be perfectly matched, which is not the case in reality). If you happen to get particularly good devices for your prototype, and production orders come in that are not as good (but still within spec), your design could be a failure even though the simulation and prototype worked fine.

Models are usually made to represent 'typical' devices, not worst-case ones. It's also possible (if not likely) that devices sold under similar part numbers by different manufacturers have some material difference that your design could uncover. Accounting for these differences in a simulation is often non-trivial (but it's easier than trying to buy parts that sit near the edges of the allowable parameters).

It's better to treat breadboarding and simulation as design verification tools.

Answer (score 7)

The parameters you're looking for tend to have a broad statistical distribution because they are generally difficult to control precisely during the manufacturing process. What you end up estimating is a mean value for some number of real physical devices, none of which will have that exact value.

That's why we design circuits to work correctly over a range of device parameter values, and then do Worst-Case Circuit Analysis (WCCA) to confirm.
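As a toy illustration of the corner-analysis idea, here is a hypothetical resistor divider evaluated at every tolerance extreme rather than only at the nominal values (Python; the values are illustrative):

```python
# Worst-case corner analysis of a resistor divider: evaluate the output at
# every combination of tolerance extremes instead of only nominal values.
from itertools import product

def divider_vout(v_in, r_top, r_bot):
    return v_in * r_bot / (r_top + r_bot)

v_in, r_top_nom, r_bot_nom, tol = 12.0, 10e3, 5e3, 0.01  # 1% resistors

corners = [
    divider_vout(v_in, r_top_nom * (1 + st), r_bot_nom * (1 + sb))
    for st, sb in product((-tol, +tol), repeat=2)
]
nominal = divider_vout(v_in, r_top_nom, r_bot_nom)
print(f"nominal {nominal:.4f} V, worst-case {min(corners):.4f}..{max(corners):.4f} V")
```

For circuits where the output is not monotonic in each parameter, the extremes may fall between the corners, so corner analysis is a floor on the spread, not a guarantee.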

And it bears repeating: All models are wrong. Some models are useful.


Monte Carlo simulation, in which multiple runs are made with device parameters randomly chosen for each run, can give you a sense of the range of performance you can expect, and also to which parameters the circuit is most sensitive.
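A minimal Monte Carlo sketch of that idea, again on a hypothetical resistor divider with 1% parts (Python; the standard deviation is chosen so the tolerance band is roughly ±3 sigma, which is only one possible assumption about the part distribution):

```python
# Monte Carlo sketch: draw divider resistors from a normal distribution
# (sigma = tol/3, so ~99.7% of parts fall inside the 1% tolerance band)
# and look at the spread of the output voltage.
import random
import statistics

random.seed(0)  # reproducible runs

def divider_vout(v_in, r_top, r_bot):
    return v_in * r_bot / (r_top + r_bot)

tol = 0.01
samples = [
    divider_vout(12.0,
                 random.gauss(10e3, 10e3 * tol / 3),
                 random.gauss(5e3, 5e3 * tol / 3))
    for _ in range(10_000)
]
mean = statistics.mean(samples)
stdev = statistics.stdev(samples)
print(f"mean {mean:.4f} V, stdev {stdev*1e3:.2f} mV")
```

Real part distributions are often not Gaussian (vendors may bin out the tightest parts), so the assumed distribution matters as much as the tolerance itself.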

  • Trev347: Dave is saying that in some cases models can be better than reality :-). I say that somewhat flippantly, but also in deadly earnest. Dave: I expanded on that slightly in a comment on the question. You are welcome to comment and/or disagree. – Commented Jun 25 at 7:41
Answer (score 2)

There are a variety of reasons why simulations can fail.

In many cases, circuit simulation tools can be thought of as numerical solvers of differential equations, and solving differential equations numerically means integrating. There are a variety of integration algorithms, each with its own shortcomings in different situations. The shortcomings often come down to how much error accumulates during integration, and to the kinds of situations in which those errors can damage your simulation. I suspect most reputable circuit solvers know this and apply the optimal technique in the optimal circumstances -- but the techniques are often user-selectable.

For a concrete example, it's tougher to make a simulation work well when time scales differ by orders of magnitude. Don't expect a simulation that accurately shows millisecond behavior to also accurately describe behavior over hours.
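The step-size issue can be demonstrated with forward Euler on a fast RC discharge: with a time step larger than twice the time constant, the explicit method is numerically unstable, even though the real circuit simply decays. (Production simulators use better, usually implicit, integrators; this is only a toy illustration of how an integration scheme can fail.)

```python
# Forward Euler on dv/dt = -v/tau (an RC discharge with tau = 1 us).
# With dt > 2*tau the explicit update (1 - dt/tau) has magnitude > 1,
# so the numerical solution grows instead of decaying.

def euler_decay(tau, dt, t_end):
    v, t = 1.0, 0.0
    while t < t_end:
        v += dt * (-v / tau)   # explicit Euler step
        t += dt
    return v

tau = 1e-6
ok = euler_decay(tau, dt=0.1e-6, t_end=10e-6)   # small step: decays to ~0
bad = euler_decay(tau, dt=3e-6, t_end=30e-6)    # oversized step: blows up
print(f"fine step: {ok:.2e}   coarse step: {bad:.1f}")
```

This is why a simulator quietly choosing too coarse a timestep (or the user forcing one) can produce results that look like circuit behavior but are pure numerical artifact.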

Also, garbage in, garbage out. Different models can be used in different situations. Don't expect a simulation to show you the impact of a capacitor's equivalent series resistance (ESR) when the model you've selected for the capacitor is an ideal one.
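To make the ESR point concrete: a back-of-the-envelope buck-converter output-ripple estimate with and without the ESR term (Python; the values are illustrative, and the two contributions are simply summed as a rough upper bound since their peaks don't exactly coincide):

```python
# Output ripple of a buck converter's capacitor, with and without ESR
# in the model. Values are illustrative, not from any real design.

def ripple(delta_i, f_sw, c, esr):
    """Rough peak-to-peak ripple: capacitive term + ESR step, summed."""
    v_cap = delta_i / (8 * f_sw * c)   # classic buck capacitor ripple term
    v_esr = delta_i * esr              # instantaneous ESR contribution
    return v_cap + v_esr

ideal = ripple(delta_i=1.0, f_sw=100e3, c=100e-6, esr=0.0)
real = ripple(delta_i=1.0, f_sw=100e3, c=100e-6, esr=0.05)
print(f"ideal cap: {ideal*1e3:.2f} mVpp, with 50 mOhm ESR: {real*1e3:.2f} mVpp")
```

With these numbers the ESR term dominates the ripple by several times — exactly the effect an ideal-capacitor model hides completely.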

Answer (score 2)

You can trust your simulator to be the tool that it is

To paraphrase something Von Neumann once suggested: the only way to model an infinitely complex system is to use the system itself.

Simulators are tools. Like hammers. They're useful within the scope of their intended use. No more. No less. The reason (and this is important) that we still send our children to school to become engineers and scientists is that the tool will never be as valuable as the human mind.

Frankly, a simulator should be more than adequate to achieve your goals. After all, one of the first things a student learns about engineering is the value of tolerances and conditions. If your goal is to know exactly to the precision of a gnat's eyebrow what's going to happen, then you've failed to understand those basic axioms of engineering. Because...

  • Manufacturing isn't that precise.
  • Conditions of operation/use (e.g. ambient temperature) aren't that precise.
  • The materials used to manufacture aren't that precise.
  • Heck, our understanding of the universe simply isn't that precise.

We therefore use simulators knowing that it's the user's responsibility to understand not just the limits of the tool, but the limits of the world and, above all, the limits of one's self. If the constructed design fails where the simulator said it should have succeeded, then 98% of the time it's the designer's failure to accommodate proper tolerances or to simulate all of the expected use conditions.

And I know that from personal experience. I once designed a power buffer. Thirty-two bits of high-capacitance drivers. I didn't properly simulate what would happen to the on-chip power plane when all thirty-two bits transitioned high-to-low simultaneously. I still remember the fab techs telling me how much fun they had vaporizing my chips. Apparently the destruction of the ground plane caused the distribution of boron in the packaging to change which lowered the vaporization point of the package to the point where the whole thing went Foof! Not Bang!... Foof! Total protonic reversal, if you get my drift. I'm told it was spectacular. And properly simulating the ground plane's response to that condition proved what would happen (the ground plane failure, not the Foof!) — but I was a young engineer and forgot to do it.

The simulator will work fine so long as your expectations include necessary tolerances and conditions. That's good engineering.

Answer (score -1)

In some cases, not accurate at all! I'm reminded of https://cacm.acm.org/research/analysis-of-unconventional-evolved-electronics/ , which describes a great case study: by 'evolving' a circuit within an FPGA, researchers managed to create a circuit that could discern between two different input frequencies, but no one could actually explain how it worked!

"Until the present study, one could only speculate as to the circuit's means of operation, so unusual are its structure and dynamics." (After the study they sort of understood what it was doing, but they could never reproduce the behaviour in a simulator.)

In this case the unexplained behaviours were beneficial or desirable, but surely real circuits can exhibit other, unexpected side effects in the same way. There is always going to be a gap between the simulation and the real thing.

