
Why research success might depend on a pipe dream


With an upgraded 100G connection to Internet2, research scientists at the University of Connecticut aim to remain competitive with colleagues at other R1 institutions.

In January, the University of Connecticut upgraded its Internet2 connection from 10 gigabits per second to 100G. The connection, made possible through a partnership with the Connecticut Education Network and the state, is intended to keep the university on the cutting edge of research and better positioned to compete for grants and scientists.

“As you can imagine in today’s world of big data and research that depends upon large data sets, many schools are upgrading to 100G to accommodate their researchers’ needs,” said Paul Howell, chief cyberinfrastructure security officer for Internet2. “It’s becoming the normal type of connection for research communities.”

Today, about 280 major research universities are members of Internet2, along with approximately 45 regional networks that serve thousands of other educational organizations, including community colleges and libraries. It’s essentially a private high-capacity, high-speed network, allowing researchers to bypass the commodity internet where traffic can suffer from slowdowns and bottlenecks.

But the value of Internet2 depends to a certain extent on the capacity of a school’s connection to that network. Upgrading a connection from 10G to 100G is essentially like widening the on-ramp to a freeway: It allows more traffic to flow faster. Not surprisingly, the U. Conn upgrade is expected to be a boon for university departments, many of which are seeing rapid growth not only in the amount of data they handle but also in the compute cycles needed to analyze that data.


Among the fastest-growing consumers of network capacity on campus are departments conducting medical research. Much of the work in understanding diseases and drug reactions today involves analysis of genomic data, which is data intensive in a way that biology has never been before.

Furthermore, much of this data is stored in national databases and shared by colleagues at institutions nationwide. “The increase in demand for bandwidth has been rather remarkable over the past five years,” said Howell of the Internet2 membership. “Researchers are depending on high-speed connections in order to be able to collaborate across the country.”

Perhaps no department at U. Conn is more bandwidth hungry than the Physics Department. Associate Professor Richard Jones is using the Thomas Jefferson National Accelerator Facility in Newport News, Va., to study the bonds that bind quarks together inside protons and neutrons. “This is nuclear physics at the particle physics frontier,” said Jones of the collaborative research project known as the GlueX experiment. “The challenge experimentally is to handle all the data.”

By the time the experiment is running at full intensity, the research team estimates that it will generate one petabyte of data each year. And that’s just the beginning, because analyzing that data requires a whole other set of simulated data, known in research circles as Monte Carlo. “We need about three times as much Monte Carlo as we do real data,” explained Jones. “We end up needing something like 10 petabytes of data throughput produced and analyzed per year.”
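Those figures can be put in perspective with some back-of-envelope arithmetic. The sketch below is an illustration, not from the article: the function name and the assumption of a fully dedicated, perfectly utilized link are ours.

```python
# Back-of-envelope: how long it takes to move GlueX-scale data volumes
# over a 10G vs. a 100G link, assuming the link is fully dedicated and
# perfectly utilized (real shared links do far worse).

def transfer_days(petabytes: float, gbps: float) -> float:
    """Days needed to move `petabytes` of data over a `gbps` link."""
    bits = petabytes * 1e15 * 8       # 1 PB = 1e15 bytes, 8 bits per byte
    seconds = bits / (gbps * 1e9)     # link rate expressed in bits/second
    return seconds / 86_400           # 86,400 seconds per day

for pb in (1, 10):                    # raw data vs. total annual throughput
    print(f"{pb:>2} PB: {transfer_days(pb, 10):5.1f} days at 10G, "
          f"{transfer_days(pb, 100):4.1f} days at 100G")
```

Even under these ideal assumptions, a dedicated 10G link needs roughly three months of continuous transfer to move the 10 petabytes of annual throughput Jones describes; at 100G the same volume moves in about nine days, leaving headroom for everyone else sharing the pipe.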

Without the upgraded connection, says Jones, this kind of work would have been impossible given the broader demands for bandwidth on campus. “If I had a 10G link to myself, then I wouldn’t need the 100G connection,” he said. “But this is a shared pipe and there are a number of competing concerns that it has to satisfy, including 15,000 or so undergraduates who want to watch Netflix.”

More at stake than slow connections

The disadvantages of slow connectivity extend far beyond simply delaying data analysis. From a funding standpoint, schools with poor network connectivity may increasingly find themselves relegated to the research backwaters. Jones’s own work, for example, is funded by the National Science Foundation (NSF), while his colleagues are funded by the Department of Energy. Both granting agencies now require data-management and infrastructure statements as part of all physics proposals. “The proposal reviewer is saying, ‘OK, if you plan to handle so much data and produce so much Monte Carlo for a year, show us that you have the infrastructure to support that,’” explained Jones.

If U. Conn had lacked that capability, said Jones, the grant would likely have gone to a different university. For R1 schools, it’s a potentially slippery slope: Shut out of major funding and grant opportunities, top-tier researchers may well seek greener pastures, leaving the school struggling to attract new talent.

The need to stay competitive extends well beyond U. Conn’s campus, however. The 100G upgrade was implemented through a partnership with the Connecticut Education Network, which was tasked by the legislature with creating a roadmap to provide state education institutions with a connection to the nation’s research and data-exchange backbone.

“U. Conn’s connection is part of a larger strategy for the whole state,” said Jones. “Connecticut wants an education network that will give state education institutions the capability to compete in terms of preparation of the workforce for the future.”

Andrew Barbour is an editorial freelancer with eCampus News.