Hello All,
I have a small query relating to stability and phase margin that has just struck me.
When a compensator is designed for something like a switched-mode power supply, a pole-zero pair is often used to add some phase boost, alleviating a poor phase margin (and hence poor stability) at the crossover frequency, i.e. the point of 0 dB loop gain.
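As an aside on the boost itself: for a single zero-pole pair with the zero at wz below the pole at wp, the added phase is atan(w/wz) - atan(w/wp), which peaks at the geometric mean of the two corners. A quick check, with corner values that are purely illustrative:

```python
import numpy as np

# Phase boost of a single zero-pole pair: boost(w) = atan(w/wz) - atan(w/wp).
# The corner frequencies here are arbitrary illustrative values, a decade apart.
wz, wp = 1.0e3, 1.0e4                      # rad/s

w_peak = np.sqrt(wz * wp)                  # boost peaks at the geometric mean
boost_peak = np.degrees(np.arctan(w_peak / wz) - np.arctan(w_peak / wp))
print(f"peak boost of {boost_peak:.1f} deg at {w_peak:.0f} rad/s")
```

A decade of pole-zero separation buys a peak of roughly 55 degrees of boost; wider separation pushes that toward, but never past, 90 degrees for a single pair.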
I understand this if the original phase curve just rolls off more or less monotonically down past -180 degrees. But suppose that, as frequency increases, the original phase rolls off to -150 degrees at F1, where there is still plenty of gain left, then recovers to -120 degrees at F2, and finally rolls off to -150 degrees again at the unity-gain point F3.
My point is that by concentrating on the unity-gain point F3, adding a compensator can indeed reduce the phase lag there and so increase the phase margin, but the -150 degrees of lag that existed at F1 will be virtually unaffected, and the gain there is >>1.
So, why wouldn't such a compensated system be perfectly OK at the unity-gain frequency, yet still show a tendency to ring at F1? Sorry for not having a picture.
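In lieu of a picture, here is a quick numeric sketch of the kind of loop gain I mean. The transfer function and corner frequencies are invented purely to reproduce that phase shape; they don't come from any real supply:

```python
import numpy as np

# Made-up loop gain reproducing the shape described above:
#   L(s) = K * (1 + s/wz) / ((1 + s/wp1)^2 * (1 + s/wp2))
# The double pole at wp1 drags the phase toward -180 deg (the dip at F1),
# the zero at wz recovers some of it mid-band (the bump at F2), and the
# pole at wp2 pulls it back down again toward the crossover F3.
K, wp1, wz, wp2 = 24000.0, 1.0, 30.0, 300.0

w = np.logspace(-1, 4, 4000)              # rad/s, log-spaced
s = 1j * w
L = K * (1 + s/wz) / ((1 + s/wp1)**2 * (1 + s/wp2))
gain_db = 20 * np.log10(np.abs(L))
phase = np.degrees(np.angle(L))           # stays within (-180, 0] here

i1 = np.argmin(np.where(w < 30, phase, 0.0))                   # F1: phase dip
i2 = np.argmax(np.where((w > 30) & (w < 300), phase, -180.0))  # F2: recovery
i3 = np.argmin(np.abs(gain_db))                                # F3: 0 dB crossover

for name, i in (("F1", i1), ("F2", i2), ("F3", i3)):
    print(f"{name}: w = {w[i]:7.1f} rad/s, gain = {gain_db[i]:6.1f} dB, "
          f"phase = {phase[i]:6.1f} deg")
```

With these particular numbers the dip at F1 sits near -152 degrees with roughly 50 dB of loop gain still in hand, the recovery at F2 reaches about -124 degrees with the gain still well above 0 dB, and the 0 dB crossover at F3 lands near -150 degrees (about 30 degrees of phase margin), close to the figures I quoted.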
Or, put another way: am I wrong in thinking that a well-compensated system should keep its phase 50-60 degrees or more away from -180 degrees all the way from zero frequency out to the unity-gain frequency, to avoid noticeable ringing?
Thanks for any replies,
Andy.