Posted November 28, 2017

New computer cluster pushes Temple ahead in high-performance computing

Owl’s Nest 2, backed by more than $2 million in funding, has the combined computing power of about 1,500 typical desktop computers and is available to researchers across the university.

Photo by Richard Berger: a close-up view of the wiring in Owl’s Nest 2. The high-performance computing cluster has the power of about 1,500 typical desktop computers combined.
Pushing Temple further forward in the use of big data and high-performance computing for research, Professor Axel Kohlmeyer and Assistant Professor Richard Berger recently unveiled a new and improved Linux cluster for scientific computing: Owl’s Nest 2.
 
Backed by more than $2 million in funding from the National Science Foundation, Pennsylvania’s Commonwealth Universal Research Enhancement Program (CURE), the U.S. Army Research Laboratory and Temple’s College of Science and Technology (CST), Owl’s Nest 2 represents a major step forward in computational research capability, enabling Temple to compete with the world’s leading research institutions.
 
The new system, built over nine months in collaboration with staff and researchers from across the university, is made up of many individual computers, or nodes, that work together. It has more than three times the computing power and more than 10 times the data-storage capacity of the original Owl’s Nest high-performance computing, or HPC, system, which was built six years ago and has been in heavy use since.
 
The new cluster features improved hardware that can process more information faster and run more advanced computations, delivering the combined computing power of about 1,500 typical desktop computers.
 
“There is nothing typical about the computers in the high-performance computing cluster,” Berger, an assistant professor of mathematics, explained. “The average desktop may have 4 to 32 gigabytes of memory (RAM) with dual- or quad-core central processing units (CPUs), while typical machines in the new system have a minimum of 128 gigabytes of RAM and 28 CPU cores. Our new big-data nodes have up to three terabytes of RAM.”
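
To put those numbers in concrete terms, here is a minimal Python sketch that reports the core count and total RAM of the Linux machine it runs on. It is illustrative only, not part of the Owl’s Nest 2 tooling, and assumes a Linux system with the standard /proc/meminfo interface:

```python
import os

# Minimal sketch: report the CPU core count and total RAM of the
# Linux node this script runs on. Illustrative only.
cores = os.cpu_count()  # logical CPU cores visible to the OS

# /proc/meminfo is Linux-specific; the MemTotal value is in KiB.
with open("/proc/meminfo") as f:
    mem_kib = next(
        int(line.split()[1]) for line in f if line.startswith("MemTotal:")
    )

print(f"{cores} cores, {mem_kib / 2**20:.0f} GiB of RAM")
```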
 
Turning all of these machines into a single parallel computer takes multiple networks and nearly 1,000 cables connecting the nodes to one another and to a parallel storage system with 1,500 terabytes of capacity.
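
The article does not say which parallel programming stack Owl’s Nest 2 users rely on, but MPI (the Message Passing Interface) is the de facto standard on Linux clusters of this kind. As a sketch, assuming an MPI installation with the mpi4py Python bindings, a computation can be split across many cooperating processes like this:

```python
from mpi4py import MPI  # assumes an MPI stack with mpi4py installed

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # this process's ID within the job
size = comm.Get_size()   # total number of cooperating processes

# Each process sums its own slice of the range (the integers with
# its residue mod size), so the work is split evenly with no overlap.
local_sum = sum(range(rank, 1_000_000, size))

# Combine the partial results over the network onto process 0.
total = comm.reduce(local_sum, op=MPI.SUM, root=0)

if rank == 0:
    print(f"{size} processes computed total = {total}")
```

Launched with, for example, mpirun -np 28 python partial_sums.py, each process computes its share locally and only the small partial results travel between nodes, which is why the dedicated networks described above matter so much.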
 
“One of the most important initiatives of Dean Michael L. Klein here at Temple is boosting computational science,” said Kohlmeyer, a professor of mathematics. “It is crucial that Temple has suitable high-performance computing resources for advanced research.”
 
Kohlmeyer added, “To do competitive research today, you often have to do something that is more complex and more demanding than a desktop computer can handle. We are actually getting more and more inquiries from people in nontraditional computational research areas who are using increasingly complex algorithms and larger data sets. They need HPC to do their research.”
 
Owl’s Nest 2 will benefit the whole university. Any researcher on campus with compatible HPC needs can apply for access. 
 
Facilitating research in all areas of study, Owl’s Nest 2 gives researchers on campus immediate access to services similar to those of much larger national supercomputing centers, such as the Pittsburgh Supercomputing Center, the Texas Advanced Computing Center in Austin and the San Diego Supercomputer Center. While CST researchers have traditionally been the main users of HPC, the need for this type of computing across Temple is growing rapidly. Researchers from the College of Engineering, College of Liberal Arts, College of Public Health, Fox School of Business, Lewis Katz School of Medicine and the Beasley School of Law are currently using Temple’s HPC resources.
 
“Instead of having to rent computers elsewhere or write a proposal and wait to get a time slot at one of the national supercomputing centers, Temple faculty and students will now have 24/7, free access to the cluster,” Berger said.
 
For more information, or to apply for access, visit the Owl’s Nest 2 website or contact the HPC team at hpc@temple.edu.
Hannah Amadio