I'm about to install a Dell SC1425 server with a 450 W power supply at an internet hosting provider.
Reading the provider's terms and conditions, I notice that they charge for power usage above 0.5 ampere.
I seem to recall from high school physics that watts = volts × amperes (P = V × I). I'm connected to a 220 V AC power grid, so with a 450 W power supply I could be drawing a maximum of just over 2 amperes (450 / 220 ≈ 2.05 A), which would result in a high cost (higher than what I would pay for internet traffic).
Is this calculation realistic? I can imagine that the power supply won't be drawing the full 450 W all the time, but even at an average of 200 W that still works out to roughly 0.9 A, which is about 0.4 A above the 0.5 A limit, and I would then get billed for each 0.1 A above that limit. I've sketched the arithmetic below.
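To sanity-check my numbers, here is a minimal Python sketch of the arithmetic I'm doing. It assumes a 220 V supply and my reading of the terms, i.e. that anything above a 0.5 A allowance is billed in 0.1 A increments; both the constants and the rounding-up are my assumptions, not anything from the provider.

```python
import math

# Assumptions based on my reading of the hosting terms, not confirmed values.
SUPPLY_VOLTAGE = 220.0    # volts, AC mains
INCLUDED_CURRENT = 0.5    # amperes included in the base price
BILLING_STEP = 0.1        # amperes per billable increment (assumed)

def current_draw(power_watts: float) -> float:
    """Current in amperes for a given power draw: I = P / V."""
    return power_watts / SUPPLY_VOLTAGE

def billable_steps(power_watts: float) -> int:
    """Number of 0.1 A increments above the included 0.5 A (rounded up)."""
    excess = max(0.0, current_draw(power_watts) - INCLUDED_CURRENT)
    return math.ceil(excess / BILLING_STEP)

for watts in (450, 200):
    print(f"{watts} W -> {current_draw(watts):.2f} A, "
          f"{billable_steps(watts)} billable 0.1 A step(s)")
```

Running this prints about 2.05 A (and 16 billable steps) for the full 450 W, and about 0.91 A (5 billable steps) for a 200 W average, which is where my cost worry comes from.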
Thanks
Miyagi