Cloud computing can use about 20 times as much hardware as it needs to function properly, and Massachusetts Institute of Technology (MIT) researchers believe an algorithm could be the key to more efficiency.
Many cloud-based services – once taboo in many college and university IT offices – partition servers into virtual machines that are asked to process millions of operations per second on the server’s central processing unit. This has made cloud-based servers easier to manage for schools and companies alike, but the most data-intensive applications usually require much more hardware.
MIT researchers presented a more efficient cloud computing system called DBSeer at the Biennial Conference on Innovative Data Systems Research, proposing solutions that could bring down the cost of cloud computing – which is often passed down to customers, including colleges – and help diagnose application slowdowns.
Led by Barzan Mozafari, a postdoc in the lab of a professor of electrical engineering and computer science at MIT, the team of researchers plans to make public the algorithm central to DBSeer at the Association for Computing Machinery’s Special Interest Group on Management of Data (SIGMOD) conference in June.
The algorithm uses machine-learning techniques to build accurate models of the performance and resource demands of the most data-intensive applications in cloud networks.
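The article does not describe DBSeer’s actual models, which are more sophisticated than this, but the core idea – learning a statistical model that maps workload levels to resource demand so capacity can be planned without over-provisioning – can be sketched with a simple least-squares fit. All function names and numbers below are hypothetical, for illustration only.

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Hypothetical training data: transactions per second observed on a
# server, alongside the CPU utilization (%) each level produced.
tps = [100, 200, 300, 400, 500]
cpu = [12.0, 22.0, 31.0, 42.0, 51.0]

slope, intercept = fit_linear(tps, cpu)

def predict_cpu(transactions_per_sec):
    """Predict CPU utilization for a proposed workload level."""
    return slope * transactions_per_sec + intercept

# A learned model like this lets an operator ask "what happens at
# 800 tps?" before buying extra hardware.
print(round(predict_cpu(800), 1))  # → 80.6
```

In practice a performance model would use many workload features and capture nonlinear effects, but even this one-variable fit shows how measured behavior, rather than worst-case guesswork, can drive provisioning decisions.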
Cloud computing adoption in higher education has increased among schools of all sizes over the past five years. CDW’s “State of the Cloud” annual survey, which highlights “what drives the shift to the cloud, what types of applications organizations are taking to the cloud and what benefits (beyond cost savings) they are achieving,” showed that colleges have gravitated toward the cloud for a variety of reasons.