In 1964, the Control Data Corporation delivered a mainframe computer, the CDC 6600, to the Lawrence Radiation Laboratory in California for high-energy nuclear physics research. The CDC 6600 was the world's fastest computer at the time, firing the metaphorical starting pistol for the ongoing race to build the most powerful supercomputers.
Since then, universities around the world have embraced High Performance Computing (HPC) to conduct research in fields such as genomics, proteomics, computational chemistry, molecular dynamics, bioinformatics and more.
To conduct research that leads to new cures for diseases, better safety measures ahead of natural disasters, and more, university researchers need to dig into the troves of data collected during these experiments and draw actionable insights.
To date, 20 to 30 academic institutions in the United States have built supercomputers for research. Whether the goal is to stay ahead of other universities in research or to gain insight into solving the world's future problems, academic institutions large and small are seeing immediate benefits from HPC.