This has only just occurred to me. It just seems so silly, I never asked.
I have a single-phase MIG that puts out a max of 220 amps. AFAIK this would be at a voltage of around, say, 20+ volts.
I run it on 240 volts at 15 amps = 3600 volt-amps.
Assuming 100% efficiency (unlikely), the welder must draw at least 4400 watts (220 A x 20 V on the output side), and even more volt-amps. It's probably more like 5500. The output is DC, which is _roughly_ equivalent to the RMS of the input sine wave, again at best. And RMS is certainly what measures the power that melts the fuse.
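To put numbers on it (a quick sanity check, not manufacturer data; the 20 V arc voltage and the 15 A circuit are the assumptions above):

```python
# Sanity check: input volt-amps available vs. output watts claimed.
# Assumed figures from the post: 240 V / 15 A supply, 220 A at ~20 V arc.
supply_volts = 240
supply_amps = 15
input_va = supply_volts * supply_amps        # 3600 VA available from the wall

arc_amps = 220
arc_volts = 20                               # assumed arc voltage
output_watts = arc_amps * arc_volts          # 4400 W claimed at the torch

# "Efficiency" the welder would need just to break even:
required_efficiency = output_watts / input_va
print(input_va, output_watts, round(required_efficiency, 2))
```

A required efficiency over 1.0 means the claimed output power exceeds what the circuit can continuously supply, which is the whole puzzle.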
What makes it worse is that MigoMag, who make my welder, have made a triple-voltage (3-phase, 480 V farm supply, 240 V) model that gets a claimed 355 amps on 240 volts (I assumed a 15 amp circuit)! Even an inverter achieving this would be weird.
I realise that stickout length, wire feed speed etc. control the current a lot, but assuming I weld as hot as I can, the only explanation I can think of is that the wire is repeatedly pinching off and breaking the arc, thereby lowering the average current?
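If the arc really is only lit part of the time (as in dip/short-circuit transfer), the average current could sit well below the peak. A toy illustration of that averaging; the 65% arc-on fraction is invented purely for the example:

```python
# Toy duty-cycle model: the peak current only flows while the arc is lit,
# so the time-averaged current is peak * arc-on fraction.
# The 0.65 arc-on fraction is a made-up figure for illustration.
peak_amps = 220
arc_on_fraction = 0.65
average_amps = peak_amps * arc_on_fraction
print(average_amps)
```

On this picture a nominal "220 A" machine would only average around 140 A into the weld, which would reconcile the nameplate with the supply limit.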
Would this explain why a "220 Amp" MIG is not as effective as a 220 amp stick welder?
Waiting!