How A Group of Souped-Up Computers is Accelerating Important Research at Stevens
From simulating human brain changes to modeling storm activity, Stevens' high-performance computing cluster is powering research with real-world implications
Simulating complex problems, such as storm surges that can devastate vulnerable coastal areas, requires serious computing power. A regular desktop PC won't cut it. But combine many smaller computers into a cluster, have them work together, and the resulting system becomes powerful enough to generate new insight into basic science questions and engineering applications.
In the basement of the Williams Library on the campus of Stevens Institute of Technology sits the school's first freely accessible high-performance computing cluster, which tackles some of the biggest and most intricate problems in research, from modeling storm surges to simulating the changes a human brain undergoes as it ages.
Nicknamed Dorothy, after Dorothy Vaughan, the African-American mathematician who pioneered advanced computing efforts at NASA, the high-performance computing cluster has quickly become an essential research tool since it arrived on campus in late 2018, according to several faculty at the Charles V. Schaefer, Jr. School of Engineering & Science. The cluster comprises 80 compute nodes with a total of 880 cores, the individual processing units, roughly the equivalent of running 220 laptop computers in parallel. So far, about 20 faculty and lab groups on campus, with more joining, have used the cluster to perform research with important implications in a variety of fields.
“It would be very difficult to perform a lot of research if we didn’t have this computational resource on campus,” said Johannes Weickenmeier, an assistant professor in the Department of Mechanical Engineering. “You would either have to buy a very expensive, yet still significantly smaller, machine for yourself; or you would have to pursue extramural funding to access off-campus resources, which can be challenging if you have only very little preliminary data and are trying to explore a new research idea.”
Mid- to large-scale computing clusters offer many advantages over a single small computer, including large memory and storage, scalability, and faster processing times. They can take on computationally expensive tasks, such as simulating and modeling complex systems, that often involve gargantuan amounts of data. Such tasks arise in weather forecasting, molecular modeling, quantum mechanics, and many other research fields pursued at Stevens.
“For research in the Davidson lab, high performance computing is as important as our wave/towing tank, research vessels, and drones,” said Muhammad Hajj, George Meade Bond Professor, chair of the Department of Civil, Environmental and Ocean Engineering, and director of the Davidson Laboratory. Hajj’s research areas cover natural disasters, air and marine vehicles, and energy. “It involves the simulation of complex physical phenomena, which requires significant computing power and resources,” said Hajj.
Weickenmeier himself uses the cluster to run computational simulations of human brains as they shrink with age or change shape under neurodegenerative disease, work that relies on big datasets.
“And that's just something that's computationally expensive,” said Weickenmeier.
The cluster came about after Weickenmeier joined Stevens in 2018; he lobbied administrators for the machines because he believed it was important to have that computing power on campus.
Yahoo donated the computers in late 2018, Weickenmeier said. Though the hardware was 10-year-old retired equipment at the time, the machines have quickly proven their mettle among researchers.
“We didn’t have this before and now it’s become a popular tool on campus,” said Weickenmeier.
Reza Marsooli, assistant professor in the Department of Civil, Environmental and Ocean Engineering, has been a heavy user of the campus cluster. Marsooli’s lab focuses on hurricane storm surges, flood hazard assessment and mitigation, and other ocean- and coastal-related research. “Our work is in the field of ocean sciences and the scale of the ocean is very large, thousands of miles,” said Marsooli.
For computational simulations, that means his lab must cover a large area, such as the Atlantic Ocean, Marsooli said. His lab must also account for the fact that waves and storm surges begin far from shore and travel all the way to the coast.
Because his numerical models must span such a large domain, they are time-intensive and difficult to construct, making high-performance computing clusters vital to his research, Marsooli said.
In addition, an important part of Marsooli’s research is studying climate change, whose impacts are uncertain.
“Projections of the climate in 30 years contain large uncertainties,” said Marsooli. “Using this high-performance computer cluster on campus gives us clues as to what climate impacts on flood hazards may look like and what steps we can take to prepare for or mitigate their impacts.”
Student researchers who wish to access the computing cluster can reach out to Weickenmeier for details.
Visit the high-performance computing web page for more information about Dorothy.