
Exascale Computers Set To Perform A Quintillion Calculations Per Second

on February 22, 2018 - 9:17am

Exascale Computing Project Data and Visualization area lead Jim Ahrens is a 20-year veteran of Los Alamos National Laboratory. Photo by Maire O'Neill/

Danny Perez came to Los Alamos National Laboratory 11 years ago as a postdoc and stayed. He is a member of the Exascale Atomistics for Accuracy, Length and Time project team. Photo by Maire O'Neill/


Los Alamos Daily Post

The fastest supercomputers available today solve problems at the petascale, a quadrillion calculations per second, a striking measure of what has been achieved in the world of computer science.

Now comes the Exascale Computing Project (ECP), launched in 2016 as a collaboration between the Department of Energy’s Office of Science and the National Nuclear Security Administration that involves six core national laboratories, including Los Alamos National Laboratory.

Exascale takes supercomputers to the next level, where they will run at a quintillion (10^18) calculations per second, enabling them to deliver far more computational science and data-analysis power than existing systems such as Titan at Oak Ridge National Laboratory or Sequoia at Lawrence Livermore National Laboratory.

While the goal is to launch a U.S. exascale ecosystem by 2021, exascale is part of the normal work day for Jim Ahrens and Danny Perez right here at LANL. Ahrens is the lead for the ECP Data and Visualization area which includes projects in data storage, data management and visualization. Perez is a member of the Exascale Atomistics for Accuracy, Length and Time (EXAALT) project team which specializes in molecular dynamics simulations of materials.

The two men explained their work to the Los Alamos Daily Post last week.

“We try to produce something like a movie of what the atoms are doing in time. What’s limiting when we do these simulations is that we have to take very tiny steps. So even though it might take us a day of human time to do the simulation it might correspond to only a nanosecond of dynamics down at the atomic scale,” Perez said. “When we say simulation time, it’s the physical time that corresponds to how much we simulate.”
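To make that arithmetic concrete, here is a minimal sketch, in Python, of the kind of time loop molecular dynamics codes run. This is a toy example, not EXAALT's actual code: a single one-dimensional vibrating "bond" stands in for real interatomic forces, and the frequency and displacement values are illustrative assumptions. The point it shows is that stability demands steps far shorter than one atomic vibration, so one femtosecond per step means a million force evaluations for a single nanosecond of physical time.

```python
dt = 1.0e-15                  # timestep: 1 femtosecond, in seconds
n_steps = 1_000_000           # 10^6 steps -> 1 nanosecond of simulated time

omega = 1.0e14                # rad/s, a typical atomic vibration rate (assumed)
x, v = 1.0e-10, 0.0           # start displaced by ~1 angstrom, at rest

for _ in range(n_steps):
    a = -omega**2 * x                    # force evaluation (the costly part in real MD)
    x += v * dt + 0.5 * a * dt * dt      # velocity-Verlet position update
    v += 0.5 * (a - omega**2 * x) * dt   # velocity update averaging old and new force

print(f"simulated {n_steps * dt * 1e9:.1f} ns of dynamics")
```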

He said that the way molecular simulation codes work now, you can use a large computer to simulate a large system, but only for a short time. The project wants to overcome this limitation so that small systems can be simulated over very long times.

“There’s a huge gap that we want to be able to completely fill, so that somebody could say, ‘I want to simulate a largish system over a long time’, and we would still be able to do it,” Perez said.

He said that because only very short simulations are possible in his field, it is hard to compare simulation results to experimental results.

“One of the things we want to be able to do at exascale is get into a time scale regime where you could on an equal footing compare a simulation and a measurement and be able to make a direct comparison,” Perez said. “Right now we’re a factor of a million off in time scales so a lot can happen when you try to extrapolate that much. The goal is not to have to extrapolate that much and be able to run the right experimental conditions.”
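One way to attack that gap is to parallelize over time as well as space. The toy Python sketch below is only loosely inspired by trajectory-splicing ideas in accelerated molecular dynamics; it is not the project's actual algorithm, and every name and number in it is made up for illustration. Many workers generate short trajectory segments at once, all speculatively started from the system's current state; segments are spliced end-to-end until one hops to a new state, and speculation restarts from there.

```python
import random
from multiprocessing import Pool

TRANSITION_PROB = 1e-3  # per-step chance of hopping to a neighboring state (assumed)

def run_segment(args):
    """Toy stand-in for one short MD segment: a random walk over
    discrete metastable states. Returns the state it ends in."""
    start_state, n_steps, seed = args
    rng = random.Random(seed)
    state = start_state
    for _ in range(n_steps):
        if rng.random() < TRANSITION_PROB:
            state += rng.choice((-1, 1))
    return state

def splice_trajectory(n_batches=20, batch_size=8, steps_per_segment=100):
    """Generate batches of segments in parallel, all speculatively started
    from the current state, and splice the usable ones into one trajectory."""
    current = 0
    trajectory = [current]
    with Pool(batch_size) as pool:
        for batch in range(n_batches):
            jobs = [(current, steps_per_segment, batch * batch_size + i)
                    for i in range(batch_size)]
            for end_state in pool.map(run_segment, jobs):
                trajectory.append(end_state)
                if end_state != current:
                    current = end_state  # speculation failed; discard the rest
                    break
    return trajectory

if __name__ == "__main__":
    traj = splice_trajectory()
    print(f"spliced {len(traj) - 1} segments, visited states {sorted(set(traj))}")
```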

Ahrens said one of the things that is new about exascale is that all of the software developed under the project will be packaged together and made available as a collection. This “exascale software stack” will be available to scientists to use and create their own exascale applications. He said Perez’s project is one of 25 exascale applications that include projects in wind, accelerators, cosmology, supernovas, DNA and cancer.

“The reality is, when we get to these new exascale machines, we are increasing orders of magnitude both in our computing power as well as the size of data generated, when compared to the supercomputers of today. One significant challenge is storage technology trends are not accelerating as fast as computing trends. We will need to reduce the data generated at exascale for a storage system that is only slightly bigger than the size and speed of a petascale supercomputer,” Ahrens said. “Projects in my portfolio focus on storage systems, visualization and data reduction to help scientists store, retrieve and understand their simulation results.”
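As a back-of-the-envelope illustration of what data reduction buys, here is a toy Python sketch; real ECP tools use far more sophisticated compression, and the array here is just a stand-in for simulation output. Even simple subsampling plus reduced precision shrinks a field dramatically before it ever reaches the storage system.

```python
import numpy as np

field = np.random.rand(1024, 1024)            # stand-in for one output field (float64)
reduced = field[::4, ::4].astype(np.float32)  # keep every 4th point, halve the precision
print(field.nbytes // reduced.nbytes)         # -> 32, i.e. 32x less data to store
```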

Ahrens said an important part of his portfolio in ECP is a checkpoint restart project. “This makes sure that when the simulation is running for weeks at a time, if the computer fails, a scientist can restart the simulation from a recently saved result. This task requires a dependable storage system,” he said.
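The pattern is easy to sketch in miniature. The Python below is a toy example with a hypothetical file name, not any ECP storage system: periodically write the simulation state to disk, using an atomic rename so a crash mid-write cannot corrupt the last good checkpoint, and on startup resume from the newest checkpoint if one exists.

```python
import os
import pickle

CHECKPOINT = "state.ckpt"  # hypothetical checkpoint file name

def save_checkpoint(step, state):
    # Write to a temporary file first, then rename atomically, so a crash
    # mid-write never corrupts the last good checkpoint.
    tmp = CHECKPOINT + ".tmp"
    with open(tmp, "wb") as f:
        pickle.dump({"step": step, "state": state}, f)
    os.replace(tmp, CHECKPOINT)

def load_checkpoint():
    # Resume from the newest checkpoint if one exists, else start fresh.
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT, "rb") as f:
            ckpt = pickle.load(f)
        return ckpt["step"], ckpt["state"]
    return 0, 0.0

step, state = load_checkpoint()
for step in range(step, 1_000_000):
    state += 1e-6                      # stand-in for one simulation step
    if (step + 1) % 10_000 == 0:
        save_checkpoint(step + 1, state)
```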

“People kind of take reading and writing data for granted but think of it at scale. When you save your Word document on your laptop and read it back in the next day, it contains the information you saved and is not corrupted. At exascale, we support the same functionality – except there are thousands and thousands of processors writing data to many storage units over a complicated messaging network. Yet when a scientist comes in the next day, their data is there and they can do their science,” Ahrens said.
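In miniature, that many-writers pattern looks like the following sketch, which uses the mpi4py library; the file name and sizes are made-up assumptions. Every process writes its own slice of the data into a shared file at a non-overlapping byte offset, and the MPI I/O layer coordinates the traffic to the storage system.

```python
# Run as, e.g.: mpiexec -n 4 python write_field.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Each rank owns one equal-sized slice of the global array.
local = np.full(1000, rank, dtype=np.float64)

fh = MPI.File.Open(comm, "field.dat", MPI.MODE_CREATE | MPI.MODE_WRONLY)
fh.Write_at_all(rank * local.nbytes, local)  # collective write at this rank's offset
fh.Close()
```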

What drives someone in a scientific field like this? Perez said his background is in physics.

“I started very early on, even as an undergraduate, doing internships in computational physics and I discovered I really like computers and hanging out with nerds and building stuff out,” he said. “It’s an interesting feeling to be able to build a code from your own hands from scratch and do some real science with it.”

He said what he most likes to do is build tools that enable science.

“Doing science by downloading code from the internet and running it is not the thing that drives me. Being able to build something that will allow people to do simulations that were just impossible to do before, that’s the kind of thing I like to do,” Perez said. “I also want to be able to keep up with the latest, biggest machines out there and have a chance to actually run this kind of hardware, and letting people have access to it is part of the thrill.”

He said at these scales, you can’t just take whatever code you wrote five years ago and hope that you’ll be able to run it on the latest machines. The codes have to be designed from scratch with the understanding that they are going to run on hundreds of millions of cores.

A native of Montreal, Canada, Perez came to LANL 11 years ago as a postdoc and stayed. He met his wife, Ping Yang, a computational chemist at the Lab. Perez received his master’s degree and PhD in physics from the University of Montreal. The couple likes to hike and travel during their time off, but Perez admits that his work stays on his mind.

“It is hard to get these challenges out of your mind. If you’re not in front of the machine typing it’s always something that you have in the back of your mind,” he said.

Ahrens said the thing that gets him out of bed in the morning is helping scientists do science.

“As a visualization and data person, I get to see the whole breadth of science at the laboratory – the magnet laboratory, weapon physics, climate modeling and much more and it’s really cool. There’s a lot of great science going on at the laboratory and being a part of it motivates me,” he said.

Ahrens said his background story is that at the age of 11, he was really lucky because his father worked at a game manufacturer and would bring home early electronic games and computers for him to play with and test.

“I would play with them and write my own games, so I’ve been writing code for a long time. So it wasn’t a huge leap to go from there to visualization, showing pictures of the scientific data,” he said.

While in graduate school, Ahrens came to the Lab for a summer to work in a visualization group, and he has basically been in Los Alamos for the two decades since.

“I started a visualization project to handle large-scale data called ParaView, which is currently used around the world. Now I’m the exascale data and visualization lead,” he said.

Ahrens received his computer science degrees from the University of Massachusetts and the University of Washington.

Early in his career he worked a summer at a tera-scale supercomputer manufacturer called Thinking Machines Corporation.

“That was the best supercomputer ever. A lot of people reminisce about that machine. It was the last supercomputer that came with documentation,” he chuckled.

Ahrens’ wife, Christine Sweeney, is a staff scientist in the Computer, Computational and Statistical Sciences Division. The couple has four sons.

In the future, ECP is expected to provide the computing ecosystem necessary for developing clean energy systems, improving the resilience of our infrastructure, designing new materials that can perform in extreme environments, adapting to changes in the water cycle, developing smaller and more powerful accelerators for use in medicine and industry, and much more.

Exascale systems will be used to accelerate research that leads to innovative products, job creation and increased competitiveness and productivity in the industrial sector. ECP teams are also developing new applications to support the NNSA Stockpile Stewardship Program.