7/23/2010 8:28:00 AM
Recall the first personal computers, released back in the mid-1980s. The adjectives “clunky” and “slow” may come to mind. Since their debut, the computing industry has raced to release ever-sleeker computers and software applications with ever-faster performance.
But Illinois computer science professor Gul Agha believes it’s time to shift away from an exclusive focus on performance and make energy efficiency a priority as well. Agha said studies have found that computers account for anywhere between 2 and 13 percent of total energy consumption.
“In the past, developers didn’t consider energy-efficiency. They just wanted the best performance, the fastest computers, the fastest algorithms,” Agha said. “Now computers often have the capability to perform faster than needed, which consumes unnecessary energy.”
Agha’s goal is to determine the right number of cores in a multi-core processor to minimize energy consumption while maximizing performance for a given parallel algorithm. In computing, a “core” is a small part of a processor that reads and executes computer instructions. Several cores running in parallel at a lower frequency can match the performance of a single faster core while holding the rate of energy consumption constant.
“Roughly speaking, energy consumed by a computation is proportional to the square of the rate at which a core runs, while performance varies as a straight line,” Agha said. “However, more cores require greater interaction of separately executed parts of a parallel algorithm, which may slow performance as well as increase energy consumption.”
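The tradeoff Agha describes can be sketched in a toy model. Everything below is an illustrative assumption, not the authors’ actual analysis: energy per unit of work is taken as proportional to the square of the clock rate, splitting work across k cores lets each run at rate f/k in the same wall-clock time, and inter-core communication is modeled as a cost that grows quadratically in k.

```python
def energy(work, base_rate, k, comm_cost=0.05):
    """Toy total-energy model for `work` units split across k cores.

    Each core executes work/k units at rate base_rate/k; energy per unit
    of work is rate**2 (constant of proportionality dropped).
    comm_cost is a hypothetical per-core-pair coordination overhead.
    """
    per_core_rate = base_rate / k
    compute = k * (work / k) * per_core_rate ** 2   # = work * base_rate**2 / k**2
    overhead = comm_cost * work * k * (k - 1) / 2   # grows quadratically in k
    return compute + overhead

def best_core_count(work, base_rate, max_cores=48):
    """Pick the core count that minimizes the toy energy model."""
    return min(range(1, max_cores + 1), key=lambda k: energy(work, base_rate, k))

# With these assumed constants, the sweet spot is a few cores:
print(best_core_count(1.0, 1.0))  # → 3
```

The compute term falls as 1/k², capturing the quadratic savings from slower cores, while the overhead term rises with k, so energy is minimized at an intermediate core count rather than at the maximum.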
Agha and Vijay Korthikanti, a Ph.D. student in computer science working with Agha, have developed a methodology for analyzing parallel algorithms to determine how many cores to use, and at what rate to run them, so that energy consumption is minimized without a noticeable drop in performance for the user.
Agha noted that different computer applications, for example Adobe Photoshop or Microsoft Word, require varying amounts of energy, and he takes this into consideration when making modifications.
“We do a theoretical analysis and learn how much energy is needed for specific algorithms within different applications,” he said.
Agha has found that modifying the number of cores within the algorithms can save a considerable amount of energy and believes that new, energy-efficient algorithms will contribute to the future of multi-core processing.
“The number of cores in PCs is expected to double every two years or less. Current experimental personal computers have as many as 48 cores,” Agha said. “We’re in the experimental stage. It will be very interesting to see how this develops in the next three to five years.”
So far, Agha has analyzed and modified only existing algorithms. Next, he hopes to design algorithms that use less energy but don’t lose much performance compared to current algorithms.
Agha and Korthikanti’s papers on the topic have appeared in ACM SPAA 2010, HotPar 2010, ICPP 2009 and in the forthcoming First International Conference on Green Computing.
“It’s the first time in my 27-year career as a computer scientist that I could refer to a Greenpeace whitepaper in a research article of mine,” Agha joked.