Wattage for computers + monitors?

I need to figure out how many 20 amp circuits are needed to power up 24 computers. Is there a standard wattage figure for a computer + monitor, since the machines aren't in place yet?

Reply to
Sonco

Sorta crazy, but look on the back of the equipment, get the total for one and multiply by 24. Most electronics have some type of indication of power consumed somewhere.
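For illustration, here's that nameplate arithmetic as a quick Python sketch; the wattage and voltage figures below are placeholders, not readings from any real equipment:

    # Nameplate estimate: one computer + monitor, times 24 stations.
    # Substitute whatever is printed on the back of the actual equipment.
    nameplate_watts_per_station = 300   # assumed computer + monitor nameplate total
    stations = 24
    line_voltage = 120                  # assumed 120V supply

    total_watts = nameplate_watts_per_station * stations
    total_amps = total_watts / line_voltage
    print(f"Total nameplate load: {total_watts} W, about {total_amps:.1f} A at {line_voltage} V")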

Reply to
Jim Douglas

I would figure 500 watts for the power supply and 1.5 amps for the monitor, call it 2 amps per computer. At 2 amps each, 24 computers is 48 amps, and three circuits at 16 amps usable each (80% of 20 amps) covers that: 16 * 3 is 48. Unless of course you're talking about something more than normal workstations.

Three 20 amp circuits would do. Printers or scanners would need to be on other circuits. I would for sure not install any more outlets than needed, so some maroon doesn't plug in the vacuum and shut it all down.
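As a sanity check on that arithmetic, here's a small Python sketch assuming the 2 amps per computer figure above and 16 amps usable (80%) per 20 amp breaker:

    import math

    amps_per_station = 2.0               # assumed: computer + monitor, from the post above
    stations = 24
    usable_amps_per_circuit = 20 * 0.8   # 80% of a 20 amp breaker

    total_amps = amps_per_station * stations                    # 48 A
    circuits = math.ceil(total_amps / usable_amps_per_circuit)
    print(f"{total_amps:.0f} A total -> {circuits} twenty amp circuits")   # 3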
Reply to
SQLit

Unfortunately it depends completely on the computers. On a recent "endeavor" I found I could put 8 AMD 2200+ systems (without monitors) on a 15 amp circuit without blowing the breaker (120V supply).

Newer computers generally need more juice.

A ROUGH guide that usually will keep you safe is to simply add up the wattages as marked. Usually those numbers are much higher than the actual draw, so you SHOULD be OK.

With monitors included, I'd guess that on a 20 amp circuit in a 120V country you can get away with 6-7 computers. In a 240V country that number would obviously be nearly double.
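As a back-of-the-envelope illustration of that guess (the per-system wattage is an assumed nameplate figure, not a measurement):

    # Systems per 20 amp circuit at 120V versus 240V.
    watts_per_system = 350      # assumed nameplate total per computer + monitor
    usable_fraction = 0.8       # stay well under the breaker rating

    for volts in (120, 240):
        usable_watts = 20 * volts * usable_fraction
        systems = int(usable_watts // watts_per_system)
        print(f"{volts}V: roughly {systems} systems per 20 amp circuit")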

That's just a guess, you should definitely do the math. TTYL

Reply to
repatch

In the UK, the limiting factors are often the switch-on surge (if you expect the breaker to survive power recovery when all units come on together), and the total circuit earth leakage from so many Class I appliances. When you have allowed for those, you will generally find the steady state current is a long way below the breaker rating and not an issue.

If you are installing in an educational establishment or similar which requires all circuits to be RCD protected at no more than 30mA, then the number is quite small. Each Class I IT appliance can leak up to 0.75mA, so that's 1.5mA for a monitor and base unit. The maximum circuit design leakage is 25% of the RCD rating (actually, that's gone from the latest Wiring Regs, but it's still a good rule of thumb), which is 7.5mA for a 30mA RCD: only 5 PC systems per circuit at 1.5mA leakage each.
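The leakage arithmetic above, written out as a small Python sketch using the figures quoted in this post (treat them as a rule of thumb, not current Wiring Regs values):

    rcd_rating_ma = 30.0
    design_fraction = 0.25            # design leakage no more than 25% of the RCD rating
    leakage_per_appliance_ma = 0.75   # per Class I IT appliance
    appliances_per_station = 2        # base unit + monitor

    budget_ma = rcd_rating_ma * design_fraction                                 # 7.5 mA
    leakage_per_station_ma = leakage_per_appliance_ma * appliances_per_station  # 1.5 mA
    stations = int(budget_ma // leakage_per_station_ma)                         # 5
    print(f"Leakage budget {budget_ma} mA -> {stations} PC systems per RCD-protected circuit")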
Reply to
Andrew Gabriel

First, there is no answer to your question, and none will be available until you have more specific information about the computers and monitors.

There is no "standard wattage". We don't know what a "computer + monitor" means, and hence no answer is available. If you end up with 22 inch CRT monitors your needs will be very different than if you have 15" LCD monitors. Computers vary over nearly the same range.

Examples: I have a 17" CRT monitor rated at 2.5A, and a 19" LCD monitor rated at 0.4A. I have a 400W power supply rated at 10A input, and a 250W supply rated at 5A. Clearly one computer could easily be rated at 15A input (using two CRT monitors and a 400W supply), while another just as likely system might have one LCD monitor and a 250W supply, which adds up to only about 1/3rd the current of the other system. 3:1 is too big a ratio to make assumptions from.
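To make that spread concrete, here are those two example systems added up (all figures are the nameplate currents quoted above):

    crt_17in_a = 2.5         # 17" CRT monitor
    lcd_19in_a = 0.4         # 19" LCD monitor
    psu_400w_input_a = 10.0  # 400W supply, rated input
    psu_250w_input_a = 5.0   # 250W supply, rated input

    heavy_system = psu_400w_input_a + 2 * crt_17in_a   # two CRTs + 400W supply = 15.0 A
    light_system = psu_250w_input_a + lcd_19in_a       # one LCD + 250W supply  =  5.4 A
    print(f"Heavy: {heavy_system} A, light: {light_system} A, "
          f"ratio {heavy_system / light_system:.1f}:1")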

My second point is more important: do not overload breakers.

Which means don't even come close to a 20A load through a 20A breaker under any circumstance. The wattages listed on various equipment will be the maximum load, and that is what should add up to something comfortably less than 20A.

If you get a clamp-on ammeter and measure the current, you'll find that at any given time the actual load will be far less than those wattage numbers would indicate. Don't let that fool you into thinking it is okay to add another half dozen computers to that breaker! Perhaps you might figure 115V * 15A = 1725 W as the maximum load per breaker.
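If you do go around collecting nameplate numbers, a sketch like this keeps the bookkeeping honest; the list of currents below is made up purely for illustration:

    # Nameplate input currents for the gear proposed for one circuit (assumed values).
    nameplate_amps = [10.0, 2.5, 5.0, 0.4]   # e.g. two computers and two monitors

    conservative_limit_a = 15.0               # stay well under the 20 A breaker
    total_a = sum(nameplate_amps)
    max_watts = 115 * conservative_limit_a    # 1725 W, as above
    verdict = "OK" if total_a <= conservative_limit_a else "too much for this circuit"
    print(f"Nameplate total {total_a} A vs limit {conservative_limit_a} A ({max_watts:.0f} W at 115V): {verdict}")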

(A point worth noting about power supplies is that the wattage listed is the combined DC output. Your computer will never come close to using that much power, but the peak inrush when it is turned on might well approach the rated input current. The rated input is roughly 2.5 to 3 times the rated output power.)

Be conservative, because extra capacity is much less expensive to install initially than to add later.

Reply to
Floyd L. Davidson

I should know this};-)

There is a formula for everything electrical, but you'll need at least the highest & lowest values of the given computer systems to work from [do more research]. Then you can figure out how to divide the 48 loads equally amongst the needed branch circuits. For computers it's slightly different & you should use computer grade receptacles (some don't care to be specific), but be specific.

For a 20 Amp branch circuit the maximum load is... oh shucks, got to go, I'll try & get back to this later. ®oy

Reply to
Roy Q.T.

| I need to figure out how many 20 amp circuits are needed to power up 24
| computers. Is there a standard wattage used for computer + monitors, since
| they don't exist yet?

Build a uniformly sealed, light-tight box. Punch a tiny hole for a thermometer. Put the computer and monitor inside, turn them on, and close up the box. Measure how fast the temperature inside rises through many degrees.

Now repeat the experiment with a 100 watt light bulb running inside. If the temperature rises faster than it did with the computer, try a smaller bulb; if slower, try a larger bulb. When you find the bulb that heats the inside of the box at the same rate, you have your approximate wattage.

Or save yourself the trouble and get a true-RMS ammeter.

Reply to
phil-news-nospam

The new building for a Junior college in Kansas has one circuit for each computer. One circuit for each television. And one circuit for each projection screen. All circuits for electronics are isolated ground.

Reply to
Brian

This is fine for working out power consumption for sizing the aircon (although a true power meter would be much easier, and avoid a number of potential experimental errors;-), but it's useless for estimating the current consumption for sizing conductors, circuits, and fuses/breakers. It fails to take into account power factor, inrush current, etc.

Reply to
Andrew Gabriel

Tue, Mar 8, 2005, 6:07am
From: snipped-for-privacy@comcast.net (Sonco)
I need to figure out how many 20 amp circuits are needed to power up 24 computers. Is there a standard wattage used for computer + monitors, since they don't exist yet?

Okay, someone said four 20A circuits should do.

The rule is to compute not less than 180VA for each single or multiple receptacle on one yoke, AND a single piece of equipment consisting of a multioutlet assembly of 4 or more receptacles shall be computed at not less than 90VA per receptacle.
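A rough sketch of how that receptacle rule adds up for 24 stations; the layout (one duplex receptacle per station, or alternatively a 4-outlet strip) is an assumption for illustration:

    va_per_yoke = 180    # single or duplex receptacle on one yoke
    stations = 24        # assumed: one duplex yoke per station

    total_va = stations * va_per_yoke
    print(f"Duplex per station: {total_va} VA, about {total_va / 120:.0f} A at 120V")

    # If instead each station had a 4-receptacle multioutlet assembly, use 90 VA each:
    total_va_strip = stations * 4 * 90
    print(f"4-outlet strip per station: {total_va_strip} VA, about {total_va_strip / 120:.0f} A at 120V")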

It is hard to help you any further; I don't know if you're using Wiremold®, EMT, or just what area you intend to add these receptacles to. It would be easy if you had the computers set up: you could open a circuit, plug in a station, measure the amperage from a fully loaded station (computer + printer + whatever a single station includes), and go from there with the calcs for the intended total amperage ...

I hate seeing computers on EMT & 1900 boxes without proper receptacles; they look so cheesy. Hope you're using some kind of raceway with all the trimmings };-) BTW, Wiremold and perhaps other companies have a panel mount surge suppressor for computer/sensitive electronics room applications and, if I'm not mistaken, UPS's also, that provide full circuit protection... whew! I'm done. Nice call......

Reply to
Roy Q.T.

"Sonco" wrote in news: snipped-for-privacy@individual.net:

As opposed to one poster, you probably should figure on 700 watts for the computer alone. Newer machines are coming in with power supplies as big as 680 watts, especially those equipped with Serial ATA or SCSI ("scuzzy") set-ups. From a few glances at some actual power supply specs, the AC input required for a 550 watt power supply at 115VAC is 10A (6A @ 230VAC). (See Tiger Direct - Power supplies). This is considerably higher than what 550 watts DC calculates out to, so there must be some serious losses and/or safety margin built in. Most of the time I find that the actual voltage you will get is 110-115VAC, so I always figure on 110VAC when rating a circuit. Most 19" monitors I've seen are about 2A @ 110VAC.

So, from this information:

(10A * 115V) / 110V = 1150W / 110V = 10.45A; add the monitor: 10.45A + 2A = 12.45A per station.

12.45A * 24 = 298.8A; 298.8A * 125% = 373.5A (sounds like a lot, huh? but check the numbers); 373.5A / 20A = 19 branch circuits.
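Here is the same arithmetic as a Python sketch, using only the figures quoted in this post (10A supply input at 115VAC, 2A monitor, 110VAC planning voltage, 125% branch-circuit factor):

    import math

    supply_input_a_at_115v = 10.0
    monitor_a = 2.0
    planning_voltage = 110.0
    stations = 24
    branch_factor = 1.25           # rate circuits for 125% of calculated load

    computer_a = supply_input_a_at_115v * 115.0 / planning_voltage   # about 10.45 A
    station_a = computer_a + monitor_a                               # about 12.45 A
    total_a = station_a * stations * branch_factor                   # about 373.5 A
    circuits_110v = math.ceil(total_a / 20)                          # 19 twenty amp circuits
    circuits_230v = math.ceil((total_a / 2) / 20)                    # roughly half at 230V
    print(f"{total_a:.1f} A -> {circuits_110v} circuits at 110V, {circuits_230v} at 230V")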

My suggestion is to set these up on 230VAC, which halves your Amperage load, bringing the number of circuits to 10.

The problem with all of the above is that you have 48 devices, or 24 pairs of a 10.5A computer plus a 2A monitor. Splitting the load up evenly is going to be a problem based on the above calculations. If you are dead set on 110V power, I would suggest running 24 15A branch circuits, which puts each computer on a dedicated circuit; or, conversely, if you want to run 230V power, 12 15A circuits with two computers/monitors per circuit.

A branch circuit should be rated for 125% of max expected/calculated load.

Reply to
Anthony

There are also NEC requirements for Information Technology Equipment.

Depending on the location [building type] it behooves you to follow them & adhere to them as closely as possible. It's covered in Art. 645.

If it were my job, I'd run twelve 20 amp circuits from a subpanel with a disconnect for 24 computer stations, 2 stations per circuit, in Wiremold metal raceway with computer-grade receptacles, and anything else the budget allowed.

but that's just me, dreamin };) ®

Reply to
Roy Q.T.
