I need to figure out how many 20 amp circuits are needed to power up
24 computers. Is there a standard wattage figure for a computer plus
monitor, since the machines haven't been purchased yet?
Contrary to what one poster said, you should probably figure on 700 watts
for the computer alone. Newer machines are shipping with power supplies
as large as 680 watts, especially those equipped with Serial ATA or SCSI
setups.
From a few glances at some actual power supply specs, the AC input
required for a 550 watt power supply at 115VAC is 10A (6A @ 230VAC). (See
Tiger Direct - Power supplies). This is considerably higher than what 550
watts DC works out to, so there must be some serious conversion losses
and/or a safety margin built in.
Most of the time I find that the actual voltage you will get is 110-
115VAC, so I always figure on 110VAC when rating a circuit.
Most 19" monitors I've seen draw about 2A @ 110VAC.
So, from this information:
(10A x 115V) / 110V = 10.45A per computer
10.45A + 2A = 12.45A per station
12.45A x 24 stations = 298.8A
298.8A x 125% = 373.5A (sounds like a lot, huh? But check the numbers.)
373.5A / 20A = 18.7, so 19 branch circuits.
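If you want to rerun these figures for different hardware, here is a rough sketch of the same calculation. The 10A supply draw, 2A monitor draw, and 125% margin are the assumptions used in this post, not standard values:

```python
import math

SUPPLY_AMPS_115V = 10.0   # AC input rating of a 550 W supply @ 115 VAC
LINE_VOLTS = 110.0        # conservative figure for actual line voltage
MONITOR_AMPS = 2.0        # typical 19" monitor draw @ 110 VAC
N_COMPUTERS = 24
MARGIN = 1.25             # rate the circuit at 125% of calculated load
BREAKER_AMPS = 20.0

# Rescale the supply's 115 V rating to 110 V (same wattage, more amps).
supply_amps = SUPPLY_AMPS_115V * 115.0 / LINE_VOLTS   # ~10.45 A
station_amps = supply_amps + MONITOR_AMPS             # ~12.45 A
total_amps = station_amps * N_COMPUTERS               # ~298.9 A
rated_amps = total_amps * MARGIN                      # ~373.6 A
circuits = math.ceil(rated_amps / BREAKER_AMPS)       # rounds up to 19

print(f"{station_amps:.2f} A per station, {circuits} x 20 A circuits")
```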
My suggestion is to set these up on 230VAC, which halves your amperage
load, bringing the number of circuits down to 10.
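The 230 V option works out like this (keeping the same per-station draw from above, and assuming the load simply halves when the voltage doubles):

```python
import math

STATION_AMPS_110V = 12.45            # computer + monitor @ 110 VAC, from above
station_amps_230v = STATION_AMPS_110V / 2   # same wattage at twice the voltage
rated = station_amps_230v * 24 * 1.25       # 125% margin on total load
print(math.ceil(rated / 20.0))              # rounds up to 10 circuits
```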
The problem with all of the above is that you have 48 devices: 24
computers at roughly 10.5A each plus 24 monitors at 2A each. Splitting
the load up evenly across circuits is going to be a problem based on the
above calculations.
If you are dead set on 110V power, I would suggest running 24 - 15A
branch circuits, which puts each computer on a dedicated circuit. Or,
if you want to run 230V power, 12 - 15A circuits with two
computers/monitors per circuit.
A branch circuit should be rated for 125% of the maximum
expected/calculated load.
--
Anthony
You can't 'idiot proof' anything....every time you try, they just make
better idiots.
Remove sp to reply via email