Energy Recycling for Sustainability

Many of the ideas for advancing urban sustainability entail “smart” technologies: networks of cameras, detectors, and other sensors that help manage traffic (including public transit), landscaping, public safety, and public security more efficiently and effectively. Implicit in these “smart” initiatives is a reliance on computers to collect and process the volumes of data from the sensor networks, often in real time.

But, while energy efficiency and conservation are integral to urban sustainability, computers themselves waste a great deal of energy: on every clock cycle, which occurs billions of times each second, the information a computer is processing is updated by overwriting digital signals, and the energy in all the affected transistors is dissipated as heat. This is becoming a significant cost of operating (and cooling!) computers, all the more so because many of the new ideas for enhancing public life ultimately rely on more and more computing by large server farms. These may be in a city’s Information Technology (IT) center or in the “cloud”, which is to say in a commercially maintained IT center. The bill for powering and cooling one large server farm easily reaches millions of dollars annually.
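To see why this overwriting adds up, here is a back-of-the-envelope estimate using the standard expression for the heat released when a transistor's load capacitance is charged and discharged. All of the component values and chip parameters below are illustrative assumptions, not measurements of any particular chip.

```python
# Rough estimate of conventional switching loss on a large chip.
# All numbers are illustrative assumptions, not measurements.

C = 1e-15          # effective load capacitance per gate, ~1 femtofarad (assumed)
V = 1.0            # supply voltage in volts (assumed)
f = 3e9            # clock frequency, 3 GHz (assumed)
activity = 0.1     # fraction of gates that switch on a given cycle (assumed)
n_gates = 1e9      # logic gates on the chip (assumed)

# Each switching event dissipates roughly (1/2) * C * V^2 as heat.
energy_per_switch = 0.5 * C * V**2

# Heat generated per second across the whole chip from switching alone:
power = energy_per_switch * activity * n_gates * f
print(f"Dynamic switching power: {power:.1f} W")  # prints 150.0 W
```

Even with these modest assumed numbers, discarding the signal energy on every cycle turns into on the order of a hundred watts of heat per chip, which a server farm must both supply and then pump back out as cooling.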

However, computers can be made to operate with great energy efficiency by recycling signal energy rather than throwing it away. The technical approach is to design computers with adiabatic circuits, which recover almost all of the signal energy for use in subsequent operations. “Adiabatic” is a term from thermodynamics (the theory of heat) for a process that occurs without any heat being transferred. This energy recovery is somewhat akin to what happens when an elevator descends or the brakes are applied in a hybrid car. But while the mechanism for energy recovery in those two cases amounts to running a motor in reverse so that it becomes a generator, the mechanism in adiabatic circuits is different in detail; there are no motors on computer chips! Instead, the strategy is to keep the energy in a circuit flowing, without losing any, so that less new energy needs to be drawn from the computer’s power supply. When the power available for computing is limited, this approach can yield substantial gains in overall computational performance for a given power budget.
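The key quantitative point behind adiabatic operation is that the heat lost in charging a circuit node shrinks as the charging is spread over more time. The sketch below compares the textbook expressions for the two cases; the resistance and capacitance values are illustrative assumptions.

```python
# Heat dissipated charging one capacitive circuit node: conventional vs.
# adiabatic. Formulas are the standard textbook expressions; the component
# values are illustrative assumptions.

C = 1e-15    # node capacitance, 1 fF (assumed)
V = 1.0      # voltage swing in volts (assumed)
R = 1e3      # series resistance in ohms (assumed)

# Conventional switching: half of the supplied energy is lost as heat.
E_conventional = 0.5 * C * V**2

def adiabatic_loss(T):
    """Heat dissipated when the node is ramped up gradually over time T,
    valid when T is much longer than the RC time constant."""
    return (R * C / T) * C * V**2

tau = R * C                       # RC time constant (1 picosecond here)
E_slow = adiabatic_loss(100 * tau)  # ramp 100x slower than tau
print(E_slow / E_conventional)      # ratio ~0.02, i.e. 50x less heat
```

The trade-off is visible in the formula: the slower the ramp, the less heat is generated, which is why adiabatic designs also require rethinking how timing is done inside the computer.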

What will it take to make computers more energy efficient? Unfortunately, it will require several developments, including designing more and better adiabatic circuits, creating new low-level computer languages (the “machine code”), and changing how timing is handled inside a computer. All of these have been demonstrated in research laboratories, so there is no doubt that it is entirely possible (see note below). Then the chip manufacturers and the computer companies must adopt these energy-efficient approaches and build them into computer chips and new computers. It remains to be seen whether these companies will see enough “business opportunity” to be motivated to make the necessary investments. But when this is achieved, the big computers that operate behind the scenes and handle the growing mountains of data will be much less power-hungry and will run cooler.

So this will mean a cost savings, right? Well, no, not likely. Leaving aside the cost of the new computers themselves, which will come down in price over time, there is the Jevons paradox in economics: rather than saving money through lower energy consumption, the efficiency gains are very likely to increase demand for computing in this new, more useful form, so that the total power going into operating even more of these energy-efficient computers actually rises. For an example of this, look at the fast-multiplying uses of energy-efficient LED lights for displays and decoration, day or night.
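The arithmetic of the Jevons paradox is simple enough to put in a toy calculation. The efficiency gain and demand growth factors below are invented purely for illustration; the point is only that when demand grows faster than efficiency improves, total consumption goes up.

```python
# Toy illustration of the Jevons paradox: efficiency improves, but demand
# grows even faster, so total energy use rises. All numbers are assumptions.

energy_per_job_old = 1.0   # energy units per computing job, before (assumed)
jobs_old = 100             # jobs run per day, before (assumed)

efficiency_gain = 10       # each job now costs 10x less energy (assumed)
demand_growth = 20         # cheaper computing attracts 20x the jobs (assumed)

energy_per_job_new = energy_per_job_old / efficiency_gain
jobs_new = jobs_old * demand_growth

total_old = energy_per_job_old * jobs_old   # 100 energy units per day
total_new = energy_per_job_new * jobs_new   # 200 energy units per day
print(total_old, total_new)                 # total consumption doubled
```

A tenfold efficiency gain swallowed by a twentyfold growth in demand leaves total consumption twice as high as before, which is the pattern the LED example above illustrates in the real world.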

In this way we see that even this vision of making urban sustainability more prevalent through much more power-efficient computers may very well carry net financial costs to the public. But it would provide a better, more livable, and more sustainable built environment. Considering the alternative, this could be money well spent.

Note: e.g., Maojiao He, Michael P. Frank, and Huikai Xie, “CMOS-MEMS Resonator as a Signal Generator for Fully-Adiabatic Logic Circuits,” Proceedings of SPIE, vol. 5649, paper 18, 2005.

Additional thoughts from Stacy Sinclair

This weekend I had the opportunity to visit a home connected through the Internet of Things. On the way to the house after a physically taxing day, the owner turned to his phone and turned up the heat on the Jacuzzi so it would be perfectly warmed by the time we arrived. As we let the knots fall from our muscles, our group of engineers and architects talked about what it costs to run this amazing home, and about the benefits and challenges, trade-offs and opportunities of attaining net-zero living. In that space, it was hard to think of a reason not to support such connectivity, yet there are reasons to pause.

California, and Santa Monica in particular, is making a bold move to inspire new ways of thinking that conserve energy without denying residents the benefits of urban living. Yet, as Dr. Aidun states in this week’s blog, new technologies intended to reduce energy consumption can also increase demand in the long run. Is this going to become the story of a dog chasing its tail, or will we find equilibrium and move the carbon needle in the intended direction? Please join the conversation by contacting me directly or leaving your thoughts here.
