In 1964, Control Data Corporation delivered a mainframe computer, the CDC 6600, to the Lawrence Radiation Laboratory in California for high-energy nuclear physics research. The CDC 6600 was the world's fastest computer at the time, firing the metaphorical starting pistol for the ongoing race to build the most powerful supercomputers.
Since then, universities around the world have embraced high-performance computing (HPC) to conduct research in fields such as genomics, proteomics, computational chemistry, molecular dynamics, and bioinformatics.
To conduct research that leads to new cures for diseases, better safety measures ahead of natural disasters, and more, university researchers need to dig into the troves of data these experiments generate and draw actionable insights.
To date, roughly 20 to 30 academic institutions in the United States have built supercomputers for research. Whether the goal is to stay ahead of peer universities in research or to gain insight into solving the world's future problems, academic institutions large and small are seeing immediate benefits from HPC.