In case you had not noticed, computers are hot—literally. A laptop can pump out thigh-baking heat, while data centers consume an estimated 200 terawatt-hours each year—comparable to the energy consumption of some medium-sized countries. The carbon footprint of information and communication technologies as a whole is close to that of fuel use in the aviation industry. And as computer circuitry gets ever smaller and more densely packed, it becomes more prone to melting from the energy it dissipates as heat.
Now physicist James Crutchfield of the University of California, Davis, and his graduate student Kyle Ray have proposed a new way to carry out computation that would dissipate only a small fraction of the heat produced by conventional circuits. In fact, their approach, described in a recent preprint paper, could bring heat dissipation below even the theoretical minimum that the laws of physics impose on today’s computers. That could greatly reduce the energy needed to both perform computations and keep circuitry cool. And it could all be done, the researchers say, using microelectronic devices that already exist.
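The theoretical minimum alluded to here is presumably Landauer's limit, which sets a floor on the heat a conventional, irreversible computer must dissipate each time it erases a bit of information. As a rough illustration using standard physical constants (these numbers are not taken from the preprint), the floor at room temperature works out to

$$
E_{\min} = k_B T \ln 2 \approx \left(1.38 \times 10^{-23}\,\mathrm{J/K}\right) \times \left(300\,\mathrm{K}\right) \times 0.693 \approx 2.9 \times 10^{-21}\,\mathrm{J}
$$

per erased bit. Tiny as that figure is, a chip erasing bits billions of times per second across billions of transistors turns it into a real thermal budget, which is why a scheme that dips below this floor would be significant.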