Cloud computing systems can use about 20 times as much hardware as they actually need to function properly, and Massachusetts Institute of Technology (MIT) researchers believe an algorithm could be the key to greater efficiency.
Many cloud-based services – once taboo in many college and university IT offices – partition servers into virtual machines, each of which may process millions of operations per second on the server’s central processing unit. This has made cloud-based servers easier for schools and companies alike to manage, but applications that perform the most data-intensive functions usually require far more hardware.
MIT researchers presented a more efficient cloud computing system called DBSeer at the Biennial Conference on Innovative Data Systems Research, proposing ways to bring down the cost of cloud computing – which is often passed on to customers, including colleges – and to diagnose application slowdowns.