Georgia Tech researchers are looking to cut by up to 15 percent the amount of electricity needed to cool data centers, which are becoming increasingly jammed with servers and other network gear boasting more powerful processors, Network World reports. The Georgia Institute of Technology team is using an 1,100-square-foot simulated data center to explore airflow patterns, take temperature readings on running systems, and more. Fog generators, lasers, and infrared sensors are among the tools used to visualize the best setup.

According to the school, a large server cabinet produced 1 to 5 kilowatts of heat five years ago, but today's versions are closer to 28 kilowatts, and new machines could generate twice that. "Some people have called this the Moore’s Law of data centers," said Yogendra Joshi, a professor in Georgia Tech’s Woodruff School of Mechanical Engineering, in a statement. "The growth of cooling requirements parallels the growth of computing power, which doubles roughly every 18 months."

The researchers also are developing algorithms to match dynamically shifting computing loads to the coolest machines available, and they are exploring how best to use the waste heat removed from data centers…
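
The article describes the load-matching idea only at a high level. The sketch below is a minimal illustration of that general concept, not the Georgia Tech researchers' actual algorithm: it assumes a hypothetical setup in which each server reports an inlet temperature and a number of free job slots, and it greedily routes each incoming job to the coolest server with spare capacity. All server names, temperatures, and capacities are made up for the example.

    # Illustrative sketch of thermally aware load placement (hypothetical,
    # not the researchers' method): send each job to the coolest server
    # that still has free capacity.

    from dataclasses import dataclass

    @dataclass
    class Server:
        name: str
        inlet_temp_c: float   # current inlet air temperature reading
        capacity: int         # job slots still available

    def place_jobs(jobs, servers):
        """Assign each job to the coolest server with free capacity."""
        placements = {}
        for job in jobs:
            candidates = [s for s in servers if s.capacity > 0]
            if not candidates:
                raise RuntimeError("no capacity left for job %s" % job)
            coolest = min(candidates, key=lambda s: s.inlet_temp_c)
            placements[job] = coolest.name
            coolest.capacity -= 1
        return placements

    if __name__ == "__main__":
        # Hypothetical servers and jobs for demonstration only.
        servers = [
            Server("rack1-node3", inlet_temp_c=21.5, capacity=2),
            Server("rack2-node1", inlet_temp_c=24.0, capacity=3),
            Server("rack4-node7", inlet_temp_c=19.8, capacity=1),
        ]
        print(place_jobs(["job-a", "job-b", "job-c"], servers))

In practice such a scheduler would also weigh rack-level airflow and hot spots revealed by the kind of sensing the researchers describe, but the greedy rule above captures the basic intent of steering work toward the coolest available hardware.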

