Why do computers use so much energy?

Microsoft is currently running an interesting set of hardware experiments. The company is taking a souped-up shipping container stuffed full of computer servers and submerging it in the ocean. The most recent round is taking place near Scotland’s Orkney Islands, and involves a total of 864 standard Microsoft data-center servers. Many people have impugned the rationality of the company that put Seattle on the high-tech map, but seriously – why is Microsoft doing this?

There are several reasons, but one of the most important is that it is far cheaper to keep computer servers cool when they’re on the seafloor. This cooling is not a trivial expense. Precise estimates vary, but currently about 5 percent of all energy consumption in the U.S. goes just to running computers – a huge cost to the economy as a whole. Moreover, all the energy used by those computers ultimately gets converted into heat. This results in a second cost: that of keeping the computers from melting.

I use a custom water-cooling loop to keep my processor and video card cool, and aside from size and scale, data centers struggle with the exact same problem – computers generate a ton of heat, and that heat needs to go somewhere.
