I recently purchased some stepper motors and drives. The drive has an adjustable max current setting of 0.25 to 1.4 A RMS (0.35 to 2.0 A peak) and accepts an input voltage of 12 to 24 VDC. I realize that increasing the supply voltage won't change the max current through the windings; it only shortens the time it takes the current to reach that maximum.
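As a sanity check on my reading of the drive's spec sheet, the two current ranges look like the same limits quoted two ways: peak = RMS × √2 for a sine-commutated drive. A quick sketch (the numbers are straight from the spec above):

```python
import math

# Drive current limits, RMS vs. peak: the peak figures should just be
# the RMS figures times sqrt(2) if the drive rates current sinusoidally.
rms_min, rms_max = 0.25, 1.4

peak_min = rms_min * math.sqrt(2)  # ~0.35 A
peak_max = rms_max * math.sqrt(2)  # ~1.98 A, i.e. the quoted 2.0 A

print(round(peak_min, 2), round(peak_max, 2))  # 0.35 1.98
```

So the "2.0 amp peak" figure is (within rounding) just 1.4 A RMS restated, not extra headroom.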
The motors in question are size 17, bipolar, hybrid stepping motors. The documentation sent with the motors says they were tested at "1.7 amp peak", and there are stickers on the motors that say "1.25A", which I'm assuming means RMS. I'm assuming these are the maximum ratings. All the current figures are per phase. The fastest I could conceivably want to turn the motors is about 2800 pulses/sec, or roughly 7 rev/s; they were tested far past that. For my application I may want to get every bit of torque out of these motors that I can, and I'm already assuming I'll need to supply them with 24 V. This is all very theoretical at this stage, though; everything is still in the box.
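For what it's worth, the numbers above seem internally consistent. Assuming a typical 1.8° size-17 motor (200 full steps/rev) run half-stepped (400 pulses/rev, my assumption, not something the documentation states), 2800 pulses/sec is exactly 7 rev/s, and the sticker's 1.25 A RMS times √2 lands right at the tested peak figure:

```python
import math

# Assumption: 1.8-degree motor (200 full steps/rev) driven half-stepped,
# i.e. 400 pulses per revolution. That makes the speed figure line up:
pulses_per_rev = 400
speed_rev_s = 2800 / pulses_per_rev  # 7.0 rev/s, as stated above

# The sticker and the test report also look like an RMS/peak pair:
sticker_rms = 1.25
implied_peak = sticker_rms * math.sqrt(2)  # ~1.77 A, close to the quoted "1.7 amp peak"

print(speed_rev_s, round(implied_peak, 2))  # 7.0 1.77
```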
I have seen some posts suggesting that running steppers above the rated current is sometimes done. Is this advisable? Is an increase from 1.25 A to 1.4 A/phase (the max available with my drives) a big deal? Am I likely to see a big increase in torque? Am I likely to see it more at low speed, high speed, or both? If I were to slowly increment the current up from 1.25 A to 1.4 A to test, is there any way I could tell when I'm nearing the danger of burning out the motor? How would duty cycle play into this? For instance, if I expect to move the motor for only 1 or 2 seconds at a time and then sit idle for most of a minute between moves, will overheating be an issue (keeping in mind that I may be using partial or full current for holding torque some of the time)?
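To put a rough number on the duty-cycle question, here's a back-of-the-envelope I²R heating estimate. This is only a sketch: it ignores speed-dependent losses (iron/eddy losses) and assumes copper heating dominates, and the hold current of half the rated current is my hypothetical number, not from the datasheet:

```python
import math

# Hypothetical duty cycle from the question: 2 s moves at the drive's
# 1.4 A max, then ~58 s holding at half the rated current.
i_move, t_move = 1.4, 2.0
i_hold, t_hold = 0.625, 58.0  # half of 1.25 A for holding torque (assumed)

# Thermally-equivalent continuous current over the whole cycle,
# i.e. the constant current that would dissipate the same I^2*R heat:
i_equiv = math.sqrt((i_move**2 * t_move + i_hold**2 * t_hold) / (t_move + t_hold))

print(round(i_equiv, 2))  # ~0.67 A, well under the 1.25 A rating
```

By that crude measure the short moves at 1.4 A contribute very little average heating; it's the continuous holding current that dominates. The usual practical check is simply case temperature, since most hybrid steppers are rated for quite hot case temperatures, but I'd welcome correction on that.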
Any help would be greatly appreciated. Thanks,