Why Do Computers Use So Much Energy?

maurvir Meat popsicle
https://blogs.scientificamerican.com/ob ... ch-energy/

Microsoft is currently running an interesting set of hardware experiments. The company is taking a souped-up shipping container stuffed full of computer servers and submerging it in the ocean. The most recent round is taking place near Scotland’s Orkney Islands, and involves a total of 864 standard Microsoft data-center servers. Many people have impugned the rationality of the company that put Seattle on the high-tech map, but seriously—why is Microsoft doing this?

There are several reasons, but one of the most important is that it is far cheaper to keep computer servers cool when they’re on the seafloor. This cooling is not a trivial expense. Precise estimates vary, but currently about 5 percent of all energy consumption in the U.S. goes just to running computers—a huge cost to the economy as a whole. Moreover, all that energy used by those computers ultimately gets converted into heat. This results in a second cost: that of keeping the computers from melting.

These issues don’t only arise in artificial, digital computers. There are many naturally occurring computers, and they, too, require huge amounts of energy. To give a rather pointed example, the human brain is a computer. This particular computer uses some 10–20 percent of all the calories that a human consumes. Think about it: our ancestors on the African savanna had to find 20 percent more food every single day, just to keep that ungrateful blob of pink jelly imperiously perched on their shoulders from having a hissy fit. That need for 20 percent more food is a massive penalty to the reproductive fitness of our ancestors. Is that penalty why intelligence is so rare in the evolutionary record? Nobody knows—and nobody has even had the mathematical tools to ask the question before.

There are other biological computers besides brains, and they too consume large amounts of energy. To give one example, many cellular systems can be viewed as computers. Indeed, the comparison of thermodynamic costs in artificial and cellular computers can be extremely humbling for modern computer engineers. For example, a large fraction of the energy budget of a cell goes to translating RNA into sequences of amino acids (i.e., proteins), in the cell’s ribosome. But the thermodynamic efficiency of this computation—the amount of energy required by a ribosome per elementary operation—is many orders of magnitude superior to the thermodynamic efficiency of our current artificial computers. Are there “tricks” that cells use that we could exploit in our artificial computers? Going back to the previous biological example, are there tricks that human brains use to do their computations that we can exploit in our artificial computers?

More generally, why do computers use so much energy in the first place? What are the fundamental physical laws governing the relationship between the precise computation a system runs and how much energy it requires? Can we make our computers more energy-efficient by redesigning how they implement their algorithms?
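The article's question about fundamental physical limits does have a partial answer: Landauer's principle, which says that erasing one bit of information must dissipate at least k_B·T·ln 2 of heat. A quick back-of-envelope comparison in Python (the 100 W CPU doing 10^11 32-bit operations per second is my own illustrative assumption, not a measurement):

```python
import math

# Landauer's principle: erasing one bit costs at least k_B * T * ln(2).
k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # room temperature, K

landauer_limit = k_B * T * math.log(2)   # joules per bit erased
print(f"Landauer limit at 300 K: {landauer_limit:.2e} J/bit")   # ~2.87e-21 J

# Hypothetical comparison point: a CPU drawing 100 W while doing
# 1e11 32-bit operations per second (assumed figures).
cpu_energy_per_bit = 100.0 / (1e11 * 32)
print(f"Assumed CPU energy per bit: {cpu_energy_per_bit:.2e} J")
print(f"Factor above the limit: {cpu_energy_per_bit / landauer_limit:.1e}")
```

Even with generous assumptions, today's hardware sits roughly ten orders of magnitude above the theoretical floor, which is exactly the gap the article is gesturing at.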

It's pretty general, but the implications are interesting. What if we could find a way to make computing more thermodynamically efficient? The biggest impediment to higher performance right now is heat generation, particularly in mobile devices. If we figure out how to make these devices use energy more efficiently, could we have true supercomputing capability in our phones one day?
Water cooling taken nearly to the limit.
As parts get smaller, the watts per square inch go up. It's an ever-worsening problem. Pulling the heat out of there safely has always been a challenge; that aspect of design occupied my time for decades.
The problem is ensuring there is sufficient contrast between an on and an off bit to prevent confusion, which becomes more difficult when the bits get smaller.
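To put rough numbers on that contrast problem (this is a toy model of my own, not anything from a datasheet): if the energy barrier separating the two bit states is E_s, thermal noise flips the bit with probability on the order of exp(-E_s / k_B·T), so shrinking the stored energy makes errors explode.

```python
import math

# Toy model: flip probability ~ exp(-E_s / (k_B * T)) for a bit whose
# states are separated by energy E_s, expressed here in multiples of kT.
for multiple in (10, 30, 60, 100):
    p_flip = math.exp(-multiple)
    print(f"E_s = {multiple:3d} kT -> flip probability ~ {p_flip:.1e}")
```

Going from a 100 kT bit to a 10 kT bit takes you from effectively-never to frequent errors, which is why smaller cells need wider margins and error correction.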
iDaemon infinitely loopy
iDaemon posted:
https://grist.org/article/video-games-c ... n-produce/

Found this article and I think it goes here.

So just how big is gaming’s environmental footprint? Globally, PC gamers use about 75 billion kilowatt hours of electricity a year, equivalent to the output of 25 electric power plants. (And that doesn’t include console games.) In the United States, gaming consumes $6 billion worth of electricity annually — more power than electric water heaters, cooking appliances, clothes dryers, dishwashers, or freezers. As the report concludes, “video gaming is among the very most intensive uses of electricity in homes.” And more power means more greenhouse gas emissions: American gamers emit about 12 million tons of carbon dioxide annually — the equivalent of about 2.3 million passenger cars. Games are rated for things like sex and violence, Mills points out, but games and gear are “silent on their carbon footprint.”

What’s more, games’ impact could balloon as their market keeps expanding. “This isn’t the domain of 15-year-old boys anymore,” Mills says. “This is something that two-thirds of American households are engaged in. And what does it mean for the population? It’s a lot of energy and a lot of carbon.” Within five years, the electricity demand for gaming in California could rise by 114 percent, according to the report.
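I ran the article's numbers through a quick script to see whether they hang together (all inputs are taken straight from the quote above):

```python
# Sanity-checking the quoted figures.
pc_gaming_kwh = 75e9            # global PC gaming electricity, kWh/year
plants = 25                     # power plants the article equates this to

kwh_per_plant = pc_gaming_kwh / plants
avg_plant_mw = kwh_per_plant / 8760 / 1000   # continuous output in MW
print(f"Implied plant size: {avg_plant_mw:.0f} MW continuous")   # ~342 MW

co2_tons = 12e6                 # US gaming CO2, tons/year
cars = 2.3e6                    # passenger-car equivalent from the article
print(f"Implied CO2 per car: {co2_tons / cars:.1f} tons/year")   # ~5.2
```

An average plant size of roughly 342 MW and about 5.2 tons of CO2 per car per year are both plausible figures, so at least the article's arithmetic is internally consistent.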

A few years ago I logged the electrical consumption of all my workstations as well as other devices at my business and at home. I used a nifty wall plug meter as well as a heavy duty clamp ammeter for the big stuff. I posted about it at the time. Powerful but less efficient workstations are only turned on when they are required, and then shut down.

I shut off all unneeded devices at their power bars to minimise leakage. The only computer I have to leave on is an AMD K6 based PC (1998 vintage) that maintains my cash register. It runs through its database packing and backup functions after hours.
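For anyone curious what that kind of audit looks like, here is a sketch of the tally. The standby wattages below are made-up placeholder numbers (my real readings are long gone) and the $0.12/kWh rate is an assumption, but the arithmetic is the real thing:

```python
# Roll up measured standby ("leakage") watts into annual kWh and cost.
# All wattages here are hypothetical placeholders, not my actual readings.
standby_watts = {
    "workstation (soft-off)": 6.0,
    "monitor": 2.5,
    "printer": 4.0,
    "cash-register PC (always on)": 45.0,
}
rate_per_kwh = 0.12   # assumed electricity rate, $/kWh

for name, watts in standby_watts.items():
    kwh_year = watts * 24 * 365 / 1000
    print(f"{name}: {kwh_year:.0f} kWh/yr, ${kwh_year * rate_per_kwh:.2f}/yr")
```

Even a few watts of standby draw adds up over a year, which is why the power bars get switched off.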

My measly efforts are just whistling in the wind, but I believe there will be a reckoning.