How Much Heat Does a Server Generate?

How much heat does a server - or any other piece of ICT gear - generate? Here is how you can calculate it.

The basic formula is BTU/hr = 3.412 * watts (often rounded to 3.4).

So a server rated at 400 watts (about 3.3 amps at 120 V) running at maximum load would theoretically generate roughly 1,360 BTU/hr. Most equipment has traditionally run well below maximum utilization, but Green ICT technologies and practices like virtualization are pushing utilization higher. Some manufacturers used to publish "nameplate" amperage and/or wattage ratings that conservatively exceeded real-world maximums, but that practice has declined as products increasingly compete on energy efficiency.
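The arithmetic above can be sketched in a few lines of Python. The conversion constant (3.412 BTU/hr per watt) is standard; the function and variable names are illustrative, not from any particular tool.

```python
# Approximate conversion: 1 watt of electrical power dissipated
# as heat equals about 3.412 BTU/hr (the article rounds to 3.4).
BTU_PER_WATT_HR = 3.412

def heat_output_btu_hr(watts: float) -> float:
    """Estimated heat output in BTU/hr for gear drawing `watts`."""
    return watts * BTU_PER_WATT_HR

def watts_from_amps(amps: float, volts: float = 120.0) -> float:
    """Power draw in watts from current and line voltage (P = V * I)."""
    return amps * volts

# A 400 W server at full load:
print(round(heat_output_btu_hr(400)))  # ~1365 BTU/hr (1,360 with the 3.4 rounding)

# Cross-check the nameplate figures: 3.3 A at 120 V is about 396 W.
print(watts_from_amps(3.3))
```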

Many server manufacturers publish the heat output of their gear. These specs generally follow the formula above, though published BTU figures can run slightly (under 10%) higher to account for power supply inefficiencies.

Will liquid-cooled computers make a comeback?