April 04, 2022 - by Simone Ulmer

When University of Basel computer science professor Florina Ciorba talks about her work, her eyes light up and her voice resonates with enthusiasm. Both the optimisation of computer codes and the collaboration with scientists from diverse disciplines excite the passionate computer scientist. Ciorba, who was born and raised in Romania, has been especially fascinated by astrophysics and cosmology for about five years — an interest sparked by a call for new PASC projects.

PASC — the Platform for Advanced Scientific Computing — has existed since 2013 and aims to make computer applications fit for future exascale-class systems through interdisciplinary cooperation between scientists, computer scientists, mathematicians and hardware manufacturers. This is exactly what Ciorba says she has always wanted: collaborating with other disciplines and helping scientists improve their codes so they can get the most out of supercomputers. How else are scientists going to learn to use state-of-the-art methods from the field of high-performance computing (HPC)? "This is where we are really needed, and it takes me and my team out of our computer science bubble," says the computer scientist.

Bypassing approximation methods

The numerical simulations performed in astrophysics and computational fluid dynamics that reproduce the behaviour of a fluid or plasma are among the most computationally intensive calculations in HPC. The reason for this is the complex physics of the non-linear systems as well as the different size scales that need to be modelled. With the computational resources currently available, the computational scale — i.e. the resolution of the simulation and/or the dimension — must be reduced, or the physics involved must be mapped using approximation methods, writes the interdisciplinary team led by Ciorba; Lucio Mayer, Professor of Astrophysics at the University of Zurich; and Rubén Cabezón, Senior Scientist at the University of Basel, in their PASC project proposal. However, how these limitations influence the results is not yet sufficiently understood.

The researchers hope to eventually achieve realistic simulations with up to a trillion particles using optimised new codes and future exascale computer architectures, which will be able to perform 10¹⁸ computational operations per second. In Ciorba, Mayer, and Cabezón's PASC project, the researchers use the Smoothed Particle Hydrodynamics (SPH) technique to solve the hydrodynamic equations. With this method, there is no fixed grid: the coordinates move with the fluid being calculated. Although this simplifies the modelling of the system, it also makes the calculations less efficient. Since the coordinates are not fixed, the system being simulated changes constantly, which makes it difficult to parallelise the calculations on modern computer architectures.
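The grid-free idea behind SPH can be illustrated with a short sketch (illustrative only, not the SPH-EXA code; the cubic spline kernel, particle count, and smoothing length are assumed choices for the example): each particle's density is estimated by summing kernel-weighted contributions from its neighbours, whose positions simply move with the fluid.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    """A standard 3D cubic spline smoothing kernel, commonly used in SPH."""
    q = r / h
    sigma = 1.0 / (np.pi * h**3)  # 3D normalisation constant
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
         np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

def sph_density(positions, masses, h):
    """Estimate the density at each particle by summing kernel-weighted
    contributions from all particles -- no fixed grid is needed."""
    n = len(positions)
    rho = np.zeros(n)
    for i in range(n):
        r = np.linalg.norm(positions - positions[i], axis=1)
        rho[i] = np.sum(masses * cubic_spline_kernel(r, h))
    return rho

# Toy example: 100 equal-mass particles scattered in a unit box
rng = np.random.default_rng(0)
pos = rng.random((100, 3))
rho = sph_density(pos, np.full(100, 1.0 / 100), h=0.2)
```

A production code replaces the all-pairs loop with a neighbour search (e.g. a tree), since each particle only interacts with particles within two smoothing lengths.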

A mini-app for optimisation

To increase efficiency in their first SPH-EXA PASC project (2017 to 2021), the team led by Ciorba, Mayer, and Cabezón developed a mini-app — a slimmed-down version of a real application. Ciorba originally assumed that she and her team would optimise existing codes, but the first meeting brought a surprise: for legacy and code-complexity reasons, the astrophysicists did not want their code to be optimised. "They wanted us to develop a mini-app from scratch, based on the parent codes, that would help optimise their codes. The parents of this mini-app are the simulation codes ChaNGa and SPHYNX," says Ciorba. ChaNGa is used by Lucio Mayer, and the SPHYNX code is used by Rubén Cabezón.

The fact that things turned out differently than expected does not dampen Ciorba's enthusiasm for the project. "It is such a wonderful collaboration and rewarding for both sides," the computer scientist says. "What I personally appreciate is being able to ask questions and learn new things, like what stars are all about, or how the LIGO laser interferometer detects gravitational waves." Sometimes she feels like a little kid in a candy shop, she adds with a laugh.

The team is already working on a follow-up project, the funding for which has been secured through PASC until 2024. With the completed state-of-the-art mini-app, which uses the latest parallel programming and software engineering techniques, it has been possible to accelerate the simulation codes on both CPUs and graphics processors (GPUs) so that the calculations complete more quickly. In the next step, the SPH-EXA mini-app will be transformed into a completely new production code by including additional physics, so that it simulates real systems — such as the formation, growth and merging of supermassive black holes or the formation of planets — realistically and in high resolution.

The biggest challenges in performing these calculations are fault tolerance, adaptive load balancing, self-scheduling, and adaptive time-stepping. Adaptive time-stepping makes it possible to "automatically" zoom in on certain areas to resolve the physical processes at higher resolution. Solving the load-balancing problem in general, across all codes, is a dream of hers, Ciorba admits: the code would automatically distribute the calculations among the compute nodes in such a way that no processor has to wait for another. Fortunately, self-scheduling software developed by her and her team takes care of this problem.
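The core idea of self-scheduling can be sketched in a few lines (a hypothetical toy, not the group's actual software, whose scheduling techniques are considerably more sophisticated): instead of being handed a fixed share of the work up front, idle workers pull the next chunk from a shared queue, so no processor sits waiting while another is still overloaded.

```python
import threading
import queue
import time
import random

def self_schedule(tasks, num_workers=4, chunk_size=8):
    """Dynamic self-scheduling: workers repeatedly grab the next chunk
    of tasks from a shared queue, balancing uneven task costs."""
    work = queue.Queue()
    for i in range(0, len(tasks), chunk_size):
        work.put(tasks[i:i + chunk_size])

    results, lock = [], threading.Lock()

    def worker():
        while True:
            try:
                chunk = work.get_nowait()
            except queue.Empty:
                return  # no work left: this worker is done
            done = [t() for t in chunk]
            with lock:
                results.extend(done)

    threads = [threading.Thread(target=worker) for _ in range(num_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

# Toy workload: 64 tasks with deliberately uneven running times
tasks = [lambda k=k: (time.sleep(random.random() * 0.001), k)[1]
         for k in range(64)]
out = self_schedule(tasks)
```

Because chunks are claimed on demand, a worker that happens to draw cheap tasks simply claims more chunks, which is the essence of keeping all processors busy.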

"But how can we ensure that false physical phenomena are not calculated?" asks Ciorba. During a simulation, alpha particles from cosmic radiation can pass through the computer and cause a bit flip that silently changes the representation of a number used in the calculation. Her postdoctoral researcher Aurelian Cavelan proposed the method of selective particle replication to address this: a certain number of particles are replicated at certain time intervals on a neighbouring processor, so that the computed values can be cross-checked there for correctness.
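A toy sketch of the idea behind selective particle replication (illustrative only; the function names and the single-bit-flip injection are assumptions for the example, not the published method's implementation): a sample of particles is advanced a second time, standing in for the neighbouring processor, and any disagreement between the two copies flags silent data corruption.

```python
import numpy as np

def step(positions, velocities, dt):
    """One toy integration step (drift only)."""
    return positions + velocities * dt

def check_replicated(positions, velocities, dt, sample, flip_bit=False):
    """Advance all particles, and separately re-advance a replicated
    sample (standing in for the neighbouring processor). A mismatch
    indicates a silent error such as a cosmic-ray bit flip."""
    primary = step(positions, velocities, dt)
    replica = step(positions[sample].copy(), velocities[sample].copy(), dt)
    if flip_bit:
        # Inject a single bit flip into one replicated float64 value
        raw = replica.view(np.uint64)
        raw[0, 0] ^= np.uint64(1) << np.uint64(40)
    ok = np.array_equal(primary[sample], replica)
    return primary, ok

rng = np.random.default_rng(1)
pos = rng.random((1000, 3))
vel = rng.random((1000, 3))
sample = rng.choice(1000, size=50, replace=False)  # replicate 5% of particles

_, ok_clean = check_replicated(pos, vel, 0.01, sample)
_, ok_flip = check_replicated(pos, vel, 0.01, sample, flip_bit=True)
```

Replicating only a sample keeps the overhead far below full duplication while still giving a good chance of catching corruption over many time steps.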

A lifelong passion

These topics of fault tolerance and scheduling have occupied Ciorba since her time as a postdoc in Dresden. But it really all started when she finished 8th grade, three years after the revolution in Romania. Computers had suddenly become part of the educational programme. Physics and maths were her passion, and computer science aroused her curiosity — a curiosity that has driven her ever since. She ended up pursuing her education in computer science and, by chance through an Erasmus programme, studying in Greece. She wrote her master's thesis there before returning to complete her degree in Romania.

Back in Greece for her PhD, she became interested in parallelisation, scheduling, and load balancing. During her research, she came across the work of Ioana Banicescu, a professor of computer science at Mississippi State University in the USA. Ciorba was excited by this work, so when the two met at a conference, she seized the opportunity to join the research group of Mark Horstemeyer, a materials science professor at Mississippi State University, to work with Banicescu during her time as a postdoc. This also sealed her fate and diverted her from her original plan of becoming a high-school computer science teacher in Romania.

Ciorba never felt that she was at a disadvantage as a woman in a male-dominated field like computer science — we are all brains, after all, men and women, and should do what we enjoy professionally. She always tells her daughter that the only limits are the ones she sets for herself. "And yes, other people could put more pressure on you and tell you not to do this, to do that. But if you believe them, you are limiting yourself."

Production code made in Switzerland 

This attitude seems to have contributed to Ciorba’s success. At only 43 years old, she has accomplished one of her biggest goals: to work in an interdisciplinary environment and to make contributions to computer science and HPC that support interdisciplinary science. "The PASC project is unique in the sense that we co-developed an SPH framework with the mini-app, which is now evolving to be the first production code made in Switzerland that can run the trillion-particle simulation."

The team is convinced that their new simulation code underpins Switzerland's outstanding position in experimental physics and observational astronomy. For example, Ciorba and Mayer are involved in the Swiss consortium for the Square Kilometre Array Observatory (SKAO): SPH-EXA will be used to simulate what the telescopes will observe. The simulations will then help to analyse the huge amounts of data collected by the telescopes. In addition, the method makes it possible to simulate planet formation with high-resolution models, as Lucio Mayer and his team do, or the core collapse of dying stars and their explosion as supernovae, Rubén Cabezón's field of expertise. Finally, the new production code could also be used more generally in the simulation of fluid-dynamic processes, for example in the engineering sciences.