bytemangler — Posted July 16, 2002
Sorry if this is not the right forum. Does anyone know how to figure out how much wattage a server draws? I have six computers on a dedicated 20 A outlet. I'm going to be adding a Dell server soon that comes with a 900 W power supply. Will the circuit handle the additional server? Thanks
Jerry Atrik — Posted July 16, 2002
I'll try and help. There's a slight difference between the rated watts for DC power out and the max AC load on the front side; your power supply should state its max AC load. Your 110 V circuit has about 2,200 watts available, but since watts = volts × amps, there's nothing to say the total draw won't overload the 20 A breaker. A watt is, very simply put, a rating of power dissipated as heat. My advice is to put it on another breaker.
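A minimal sketch of the arithmetic Jerry describes, assuming a 120 V / 20 A branch circuit and the common 80% continuous-load rule of thumb for breakers; the 900 W figure is the nameplate rating from the original post, which is usually well above the real draw.

```python
# Rough branch-circuit arithmetic for the thread's numbers.
# Assumptions: 120 V nominal US line voltage, 20 A breaker,
# 80% usable for continuous loads (rule of thumb, not measured data).

VOLTS = 120
BREAKER_AMPS = 20
CONTINUOUS_FACTOR = 0.8

usable_watts = VOLTS * BREAKER_AMPS * CONTINUOUS_FACTOR   # 1920 W
nameplate_watts = 900                                     # new Dell PSU rating

print(f"Usable continuous capacity: {usable_watts:.0f} W")
print(f"Current if the PSU actually pulled {nameplate_watts} W: "
      f"{nameplate_watts / VOLTS:.1f} A")
```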
Uykucu — Posted July 20, 2002
I would separate them three and three, so you'll have room to expand with another PC on each circuit. If you have a 5 kVA or larger UPS, don't worry; it will take care of everything, and a 20 A breaker is more than enough for that UPS.
Xiven — Posted July 20, 2002
Quote: "If you have a 5 kVA or larger UPS, don't worry; it will take care of everything, and a 20 A breaker is more than enough for that UPS."
Am I right in thinking that electricity in the US is supplied at 110 V? If so, a 20 A breaker, while sufficient in the UK, could potentially not be sufficient in the US, because:
Power (watts) = PD (volts) × current (amps)
So a 900 W device would draw about 4 amps at 230 V (UK) and about 8 amps at 110 V (US). Added to the other devices you have on the circuit, it might overload. Just an FYI. Anyway, one way to find out is to try it and see if the breaker trips.
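Xiven's arithmetic, worked out as a quick sketch; the only inputs are the 900 W figure and the two nominal voltages from the post, and real current would also depend on power factor and the actual (not nameplate) draw.

```python
# Current drawn by a 900 W load at UK vs US line voltage (I = P / V).
for volts in (230, 110):
    amps = 900 / volts
    print(f"900 W at {volts} V -> {amps:.1f} A")
# 230 V -> ~3.9 A, 110 V -> ~8.2 A
```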
Uykucu — Posted July 20, 2002
Actually, true. I overlooked the fact that they like their 110 V the way Brits like their right-hand drive. But theory isn't always borne out in practice, so I'll withdraw my previous comment, since I haven't plugged in anything bigger than a notebook or a phone charger in the US.
Jerry Atrik — Posted July 20, 2002
Just a theory, but I think the reason we Americans started using 110 is that the electric company charges us on "draw" (amps).
bytemangler — Posted July 22, 2002
Thanks for the replies, everyone. I did my own P = V × I calculation, and having all the servers on one breaker is pushing it. We're going to run another breaker for the new server, and maybe move some of the existing servers onto that breaker too. Thanks
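A sketch of the kind of check bytemangler describes. The per-server wattages here are hypothetical placeholders (the thread never gives them); the point is just summing estimated draws and comparing against the breaker's continuous limit.

```python
# Hypothetical check: do these servers fit on one 120 V / 20 A breaker?
# Replace the watt figures with measured values; nameplates usually overstate.

VOLTS = 120
LIMIT_AMPS = 20 * 0.8   # 80% continuous-load rule of thumb

server_watts = [300, 250, 280, 220, 260, 240, 900]  # six existing + new Dell (made-up numbers)
total_amps = sum(server_watts) / VOLTS

print(f"Estimated total: {total_amps:.1f} A vs {LIMIT_AMPS:.0f} A continuous limit")
if total_amps > LIMIT_AMPS:
    print("Over the limit - split the load across another breaker.")
```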
videobruce — Posted September 6, 2002
Not to be petty, but the standard voltage in the US is 120 V, NOT the often misnamed 110! It does throw the math off a little.

If you have a multimeter with a 20 A current scale, you can get a more accurate measurement by making up a 'Y' adapter between the outlet and the equipment to be measured. Take at least a 16 gauge, or better yet a 14 gauge, line cord or piece of Romex, cut the hot (black) wire in the middle, and splice in a short jumper of the same cable/wire ending in a pair of banana plugs for the meter input. Put a male plug on one end and a female socket on the other, making it an extension cord. Plug the banana plugs into the correct inputs on the meter first, set the meter to the 20 A AC scale, plug the equipment into the socket, then plug into an AC outlet and take a reading. If you have more than one piece of equipment on the circuit, turn each one on one at a time.

I have found the ratings on almost all equipment to overstate the average current draw by about a factor of two. The rating is only the maximum peak or surge starting current, which is nowhere near the average running current.

Of course, if you have a regular clamp-on current meter, all you need to do is split an AC extension cord and clamp around one leg of the supply to read the current. There are also meters that will read regular line cords without separating the hot from the neutral.
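Once you have measured currents the way videobruce describes, converting them to watts and totaling them is just P = V × I. A small sketch with hypothetical meter readings (the names and amp values below are invented for illustration):

```python
# Turn measured currents into approximate watts and a circuit total.
# The readings below are examples, not values from the thread.

VOLTS = 120
measured_amps = {"server_1": 1.8, "server_2": 2.1, "new_dell": 3.5}

for name, amps in measured_amps.items():
    print(f"{name}: {amps:.1f} A  ~ {VOLTS * amps:.0f} W")

print(f"Total: {sum(measured_amps.values()):.1f} A on the circuit")
```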