For decades, physicists have wrestled with the thermodynamic cost of manipulating information, the activity we now call computing. How much energy does it take, for example, to erase a single bit from a computer's memory? What about more complicated operations? These are pressing, practical questions: artificial computers are energy hogs, claiming an estimated four percent of the electricity consumed in the United States.
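A standard reference point for the single-bit question, not spelled out in the excerpt above but well established in this literature, is Landauer's principle: erasing one bit in an environment at temperature $T$ must dissipate at least

$$E_{\min} = k_B T \ln 2 \approx 2.9 \times 10^{-21}\,\mathrm{J} \quad \text{at } T = 300\,\mathrm{K},$$

where $k_B$ is Boltzmann's constant (the room-temperature figure is straightforward arithmetic from the formula). Present-day hardware dissipates many orders of magnitude more than this bound per operation, which is part of what makes the thermodynamics of computation a practical question rather than a purely theoretical one.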