
Researchers vie for supercomputer access

Scientists asked to submit TeraGrid project proposals before National Science Foundation's Jan. 15 deadline


TeraGrid resources include more than a petaflop of computing capability.

Officials who run the most comprehensive cyberinfrastructure dedicated to scientific research are accepting proposals for the next cycle of projects headed by academics who require massive computing power to predict earthquakes, detect tumors, and better understand a myriad of scientific and technological issues.

The National Science Foundation’s (NSF’s) TeraGrid program allocates more than 1 billion processor hours to researchers every year, and program officials are accepting the latest round of submissions until Jan. 15.

Last month, the committee that decides how TeraGrid’s resources will be distributed among applicants doled out about 200 million processor hours and nearly 1 petabyte of data storage to 100 research teams worldwide. A petabyte is a unit of computer storage equal to 1 quadrillion bytes, or 1,000 terabytes. A terabyte is equal to 1,000 gigabytes.

TeraGrid resources include more than a petaflop of computing capability, more than 30 petabytes of online and archival data storage, and access to 100 academic databases. A petaflop is a measure of computer performance equal to one quadrillion (a thousand trillion) floating-point operations per second.
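The units above all follow the same decimal (SI) ladder of factors of 1,000, which a few lines of arithmetic make concrete (this is purely illustrative, not part of the TeraGrid submission process):

```python
# Storage units, using the decimal definitions the article cites (1,000x steps).
gigabyte = 10**9            # bytes
terabyte = 1_000 * gigabyte
petabyte = 1_000 * terabyte

# Performance unit: one petaflop = one quadrillion floating-point ops/second.
petaflop = 10**15

assert petabyte == 10**15             # 1 quadrillion bytes
assert petabyte // terabyte == 1_000  # 1,000 terabytes per petabyte
assert terabyte // gigabyte == 1_000  # 1,000 gigabytes per terabyte
```

Note that disk and memory vendors use these decimal definitions, while some operating systems report binary units (1,024x steps, i.e. tebibytes and pebibytes), so the two figures for the same drive can differ slightly.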

Teams that win TeraGrid resources will have access to the supercomputer network from April 1, 2010 through March 31, 2011, according to an NSF release detailing submission deadlines. Researchers can submit proposals online at https://pops-submit.teragrid.org/.

The TeraGrid committee will select winners of the supercomputer resources during its March meeting.

TeraGrid’s computer systems include Ranger and the Longhorn remote visualization and data analysis system at the Texas Advanced Computing Center (TACC), as well as Kraken and Nautilus—TeraGrid’s newest machine—at the National Institute for Computational Sciences (NICS) in Tennessee.

TeraGrid computers have helped university researchers address scientific issues that require more computing power than is often available at even the largest campuses.

Officials at the Southern California Earthquake Center used supercomputers at the University of Tennessee and the University of California, San Diego to generate 3-D geological structures that give an up-close view of what happens during a magnitude-7.8 Southern California earthquake.

The project, called ShakeOut, used more than 14 billion grid points and ran for about 12 hours—a much shorter time than the three days it took for a smaller simulation to be completed in 2006. The added computing power allowed the Earthquake Center researchers to simulate the shorter buildings prevalent in Southern California; short buildings respond to higher-frequency shaking, which requires more computing resources to simulate than the lower frequencies that affect tall buildings.

ShakeOut is considered the world’s largest seismological simulation.

Shewaferaw Shibeshi, a professor at Alice Lloyd College in Kentucky, used TeraGrid resources to help scientists improve the prognosis and treatment of atherosclerosis, a hardening of the arteries that is the leading cause of death in developed countries.


