Tuesday, May 31, 2011

Code Green: Energy-efficient Programming To Curb Computers' Power Use

Soaring energy consumption by ever more powerful computers, data centers and mobile devices has many experts looking for ways to rein in those devices' power use. Most projects so far focus on more efficient cooling systems or energy-saving power modes. A University of Washington project sees a role for programmers to reduce the energy appetite of the ones and zeroes in the code itself. Researchers have created a system, called EnerJ, that reduced energy consumption in simulations by up to 50 percent and has the potential to cut energy use by as much as 90 percent. They will present the research next week in San Jose at the annual Programming Language Design and Implementation conference.

"We all know that energy consumption is a big problem," said author Luis Ceze, a UW assistant professor of computer science and engineering. "With our system, mobile phone users would notice either a smaller phone, or a longer battery life, or both. Computing centers would notice a lower energy bill."

The basic idea is to take advantage of processes that can tolerate the tiny errors that creep in when, say, voltage is lowered or correctness checks are relaxed. Possible applications include streaming audio and video, games, and real-time image recognition for augmented-reality applications on mobile devices.

"Image recognition already needs to be tolerant of little problems, like a speck of dust on the screen," said co-author Adrian Sampson, a UW doctoral student in computer science and engineering. "If we introduce a few more dots on the image because of errors, the algorithm should still work correctly, and we can save energy."

The UW system is a general framework that creates two interlocking pieces of code. One is the precise part – for instance, the encryption on your bank account's password. The other portion is for all the processes that could survive occasional slipups.
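One way to picture that split is as annotations on the program's data: anything marked approximate may be stored or computed by the cheaper, error-prone machinery, while unmarked data stays exact. The Java sketch below is illustrative only; the @Approx annotation and the example fields are assumptions made for this article, loosely patterned on a type-qualifier approach, not a description of the tool's actual interface.

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Illustrative qualifier: data tagged @Approx is allowed to live in
// low-voltage (occasionally faulty) storage or be computed with
// relaxed, energy-saving arithmetic.
@Retention(RetentionPolicy.CLASS)
@Target({ElementType.FIELD, ElementType.LOCAL_VARIABLE,
         ElementType.PARAMETER, ElementType.TYPE_USE})
@interface Approx {}

class BankingApp {
    // Precise by default: a single flipped bit here would be catastrophic.
    private byte[] encryptedPassword;

    // Approximate: a few wrong pixels just add "dust" to the image,
    // so this buffer can tolerate occasional bit errors.
    private @Approx byte[] cameraFrame;
}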

The software creates an impenetrable barrier between the two pieces.

"We make it impossible to leak data from the approximate part into the precise part," Sampson said. "You're completely guaranteed that can't happen."

While computers' energy use is frustrating and expensive, there is also a more fundamental issue at stake. Some experts believe we are approaching a limit on the number of transistors that can run on a single microchip. The so-called "dark silicon problem" is that as we boost computer speeds by cramming more transistors onto each chip, there may no longer be any way to supply enough power to run all of those transistors at once; the parts that cannot be powered must sit idle, or "dark."

The UW team's approach would work like a dimmer switch, letting some transistors run at a lower voltage. Approximate tasks could run on the dimmer regions of the chip.

"When I started thinking about this, it became more and more obvious that this could be applied, at least a little bit, to almost everything," Sampson said. "It seemed like I was always finding new places where it could be applied, at least in a limited way."
