Researchers vie for supercomputer access


TeraGrid resources include more than a petaflop of computing capability.

Officials who run the most comprehensive cyberinfrastructure dedicated to scientific research are accepting proposals for the next cycle of projects led by academics who need massive computing power to predict earthquakes, detect tumors, and better understand a host of scientific and technological problems.

The National Science Foundation’s (NSF’s) TeraGrid program allocates more than 1 billion processor hours to researchers every year, and program officials are accepting the latest round of submissions until Jan. 15.

Last month, the committee that decides how TeraGrid’s resources will be distributed among applicants doled out about 200 million processor hours and nearly 1 petabyte of data storage to 100 research teams worldwide. A petabyte is a unit of computer storage equal to 1 quadrillion bytes, or 1,000 terabytes. A terabyte is equal to 1,000 gigabytes.
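For a rough sense of scale, the averages implied by those allocation figures can be checked with a few lines of arithmetic (the per-team average and the 1,000-core job size below are illustrative assumptions, not figures from the allocation committee):

```python
# Back-of-the-envelope arithmetic on the allocation figures above.
total_processor_hours = 200_000_000   # ~200 million hours awarded last month
teams = 100                           # ~100 research teams worldwide

avg_hours_per_team = total_processor_hours / teams
print(avg_hours_per_team)             # 2000000.0 processor hours per team

# A "processor hour" is one processor core running for one hour, so a
# hypothetical job spread across 1,000 cores (an assumption, not a
# figure from the article) would burn through an average award in:
cores = 1_000
wall_clock_hours = avg_hours_per_team / cores
print(wall_clock_hours)               # 2000.0 hours of continuous running
```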

TeraGrid resources include more than a petaflop of computing capability, more than 30 petabytes of online and archival data storage, and access to 100 academic databases. A petaflop consists of a thousand trillion floating point computer operations per second and is a measure of computer performance.
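The decimal prefixes the article defines stack in powers of 1,000, and the same prefix applies to speed as well as storage; a quick sketch makes the relationships concrete:

```python
# Decimal (SI) storage prefixes, as defined in the article:
# 1 terabyte = 1,000 gigabytes; 1 petabyte = 1,000 terabytes.
GIGABYTE = 10**9
TERABYTE = 10**12
PETABYTE = 10**15

print(TERABYTE // GIGABYTE)   # 1000 gigabytes per terabyte
print(PETABYTE // TERABYTE)   # 1000 terabytes per petabyte
print(PETABYTE)               # 1000000000000000 bytes (1 quadrillion)

# A petaflop applies the same prefix to computing speed rather than
# storage: 10**15 floating-point operations per second.
PETAFLOP = 10**15
print(PETAFLOP)               # 1000000000000000 operations per second
```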

Teams that win TeraGrid resources will have access to the supercomputer network from April 1, 2010, through March 31, 2011, according to an NSF release detailing submission deadlines. Researchers can submit proposals online at https://pops-submit.teragrid.org/.

The TeraGrid committee will select winners of the supercomputer resources during its March meeting.

TeraGrid’s computer systems include the Ranger supercomputer and the Longhorn remote visualization and data analysis system at the Texas Advanced Computing Center (TACC), along with Kraken and Nautilus, TeraGrid’s newest machine, at the National Institute for Computational Sciences (NICS) in Tennessee.

TeraGrid computers have helped university researchers address scientific issues that require more computing power than is often available at even the largest campuses.

Officials at the Southern California Earthquake Center used supercomputers at the University of Tennessee and the University of California, San Diego to generate 3-D geological structures that give an up-close view of what happens during a magnitude-7.8 Southern California earthquake.

The project, called Shakeout, used more than 14 billion grid points and ran for about 12 hours—a much shorter time than the three days it took for a smaller simulation to be completed in 2006. The advanced computing power allowed the Earthquake Center researchers to simulate shorter buildings prevalent in Southern California. Short buildings produce higher frequencies and require more computer resources to simulate than tall buildings.
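The runtime improvement reported above works out to roughly a sixfold speedup, even though Shakeout was the larger simulation; a one-line calculation confirms it:

```python
# Runtime comparison from the Shakeout figures above.
hours_2006 = 3 * 24       # the smaller 2006 simulation took about three days
hours_shakeout = 12       # the larger Shakeout run finished in about 12 hours

speedup = hours_2006 / hours_shakeout
print(speedup)            # 6.0
```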

Shakeout is considered the world’s largest seismological simulation.

Shewaferaw Shibeshi, a professor at Alice Lloyd College in Kentucky, used TeraGrid resources to help scientists improve the prognosis and treatment of atherosclerosis, the hardening of the arteries and the leading cause of death in developed countries.

Shibeshi’s TeraGrid computer simulations showed that certain kinds of plaque buildup in the arteries can be detected with blood tests. The supercomputer helped Shibeshi, who began his research at Howard University, show how blood flow and red blood cells react to different heart conditions.

A Canadian college student recently used TeraGrid-supported simulations to test the strength and durability of a range of materials, including those used in airplane landing gear.

Aaron Percival, a graduate student at Queen’s University in Ontario, Canada, used the Neutron Science TeraGrid Gateway at Oak Ridge National Laboratory to create a simulation that tests the structure of a material by showing its reaction to a beam of radiation or X-rays.

The first TeraGrid machines were funded by a $45 million NSF grant that produced computers capable of 11.6 teraflops, disk-storage systems with capacities of more than 450 terabytes, visualization systems, and massive data collections.

NSF created the Office of Cyberinfrastructure in 2005 with $150 million in awards.

Link:

TeraGrid

