Georgia Tech researchers are looking to cut by up to 15 percent the amount of electricity needed to cool data centers, which are increasingly jammed with servers and other network gear boasting more powerful processors, Network World reports. The researchers are using an 1,100-square-foot simulated data center to explore airflow patterns, take temperature readings on systems, and more, with fog generators, lasers, and infrared sensors among the tools used to visualize the best setup. According to the school, a large server cabinet produced 1 to 5 kilowatts of heat five years ago, but versions today...

About the Author:

eSchool News