Tech advances help make supercomputers more accessible

“When supercomputing equipment prices … more affordable, it is almost always true that it is no longer seen as supercomputing, and that other computers have greatly surpassed it in capability,” said Jim Ferguson, director of education, outreach, and training for the National Institute of Computational Sciences at the University of Tennessee, home to one of the world’s premier supercomputers.

Ferguson said “anyone who has a cheap laptop computer today” has a more powerful machine than the supercomputer he first worked on when he entered his field in 1987.

Making supercomputers more available to colleges and universities has largely depended on federal funding, especially for high-performance machines listed on the Top 500, a ranking of the world’s most powerful supercomputers that includes systems from the University of Tennessee and the University of Colorado, among other U.S. campuses.

There were 43 universities on the Top 500 list when it was first published in 1993. As of 2009, more than 100 U.S. universities appeared on the prestigious list.

The National Science Foundation’s TeraGrid program, which allocates more than 1 billion processor hours to researchers annually, helps support advanced research projects—many from higher education—that, for example, predict earthquakes and detect tumors with unprecedented accuracy.

And research published in the Journal of Information Technology Impact shows a strong correlation between consistent supercomputer funding and research competitiveness: colleges and universities that apply for and receive federal supercomputing grants often attract more research proposals than their peers.

TeraGrid last year made 200 million processor hours and nearly 1 petabyte of data storage available to 100 research teams worldwide. A petabyte is a unit of computer storage equal to 1 quadrillion bytes, or 1,000 terabytes; a terabyte, in turn, equals 1,000 gigabytes.

TeraGrid resources include more than a petaflop of computing capability, more than 30 petabytes of online and archival data storage, and access to 100 academic databases.

For some sense of scale, a petaflop is a measure of computing performance equal to one thousand trillion floating-point operations per second.

Shrinking the often-bulky supercomputer has helped one campus make room for more high-performance machines.
