The Laboratory for Scientific Computing and Modelling (LSM), which you now lead, was newly founded at the beginning of 2018 as a consolidation of several different research groups. What are your hopes for the laboratory?
LSM is PSI's competence centre for scientific problems with a focus on theory, modelling, and high-performance computing. Through synergies, we can now better interconnect and exploit the enormous knowledge that resides in each of the six research groups.
What do you simulate and model at the large-scale research facilities of PSI?
The big particle accelerators at PSI need to be simulated all the way from the particle sources to the experiments. We would like to know about every single particle with the utmost precision, and we are especially interested in the small proportion of particles that get lost in the acceleration process or don't make it to the experiment in the desired quantity. Simulations and modelling play a role not only in the construction of new accelerators, but also in their further development and optimisation. We also use numerical models for simulation of the instruments and evaluation of the data. One big challenge is the volume of data generated: This will increase enormously, especially with SwissFEL and the experiments at the planned SLS 2.0. All the competences of our laboratory's expert groups are in demand.
What else is simulated in your laboratory?
For example, how defects such as cracks in materials propagate. Also, materials that might find application in future quantum computers are being investigated in our laboratory. In another project, we simulate receptors that are coupled to proteins in cells. They function more or less like telephone lines in the body's communication network.
With such insights into phenomena of the materials and biological sciences, we support the experimental research at PSI. This helps researchers develop more efficient experiments, or to find inspiration in model computations. Simulation is, next to theoretical and experimental research, the third pillar of science.
How would you describe the pathway from your model to the simulation result and applications?
The model is a reproduction of reality: the physics, captured in mathematical equations. It sets the boundary conditions for a simulation. When we generate the model, we first decide which factors we want to ignore, so it will not be too complicated to compute later. Ultimately we want to make sure that a simulation will finish running before I retire. (smiling)
For an accelerator, for example, one of the first decisions is whether or not the particles flying around will collide in the model. If I don't factor in the collisions, I leave out part of the reality – I create a simplified version of reality, the model. We reproduce the simplified basic equations obtained in this way on the computer by writing a program.
Then we begin the simulation with certain starting conditions – for example, we might set the starting positions and velocities of particles in the computer program. The end states of the particles, after they have flown through the accelerator, are our result. In the case of proton therapy, such computations contribute to the ability to precisely determine the radiation dose for each individual patient.
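The pathway described above can be sketched in a few lines of code. This is a deliberately simplified toy, not the laboratory's actual accelerator software: particle collisions are left out of the model, exactly as in the example above, and the "accelerator" is reduced to a hypothetical linear focusing force, so each particle simply drifts and receives small kicks.

```python
# Toy particle-tracking sketch (illustrative only, not PSI code):
# start from initial positions and velocities, step each particle
# through a simplified model, and read off the end states.
# Collisions between particles are deliberately omitted from the model.
import random

def track(particles, steps=1000, dt=1e-3, k=1.0):
    """Advance (position, velocity) pairs under a linear focusing force."""
    end_states = []
    for x, v in particles:
        for _ in range(steps):
            v -= k * x * dt   # kick from the (assumed) focusing field
            x += v * dt       # drift
        end_states.append((x, v))
    return end_states

random.seed(0)
# Starting conditions: random initial positions and velocities.
start = [(random.gauss(0, 1e-3), random.gauss(0, 1e-4)) for _ in range(5)]
# End states after the particles have "flown through" the model.
end = track(start)
```

Refining what the model keeps (collisions, field errors, and so on) changes the cost of the loop, which is why the simplification decisions come first.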
So you can never represent the whole reality. What other uncertainties can occur in a simulation?
For example, numerical artifacts can arise because a computer can store numbers with only finitely many digits, so real numbers, which in principle have infinitely many decimal places, have to be rounded off. Also, errors in the computer program can't be ruled out. We are not able to write a program that can thoroughly check the correctness of another program.
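The rounding effect is easy to demonstrate: even a number as simple as 0.1 has no exact binary representation, so sums of such numbers pick up tiny rounding errors.

```python
import math

# Neither 0.1 nor 0.2 is exactly representable in binary floating point,
# so their sum is not exactly 0.3 -- a small numerical artifact.
a = 0.1 + 0.2
print(a)         # 0.30000000000000004
print(a == 0.3)  # False

# In practice, values are therefore compared within a tolerance:
print(math.isclose(a, 0.3))  # True
```

This is why simulation results are judged against a tolerance rather than by exact equality.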
Also, there are many equations that cannot be solved exactly. However, if we specify in advance the error to be tolerated, we can get around this problem. That's the fascinating thing about the field of numerical mathematics.
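The idea of specifying the tolerated error in advance can be illustrated with the classic bisection method, here applied to a hypothetical example equation: x² = 2 has no exact finite-decimal solution, but we can approximate it to any tolerance we choose up front.

```python
# Bisection: solve f(x) = 0 to an error tolerance fixed in advance.
# Illustrative example, not specific to the laboratory's work.
def bisect(f, lo, hi, tol=1e-10):
    """Shrink a sign-changing interval [lo, hi] until it is narrower than tol."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:  # root lies in the left half
            hi = mid
        else:                    # root lies in the right half
            lo = mid
    return (lo + hi) / 2

# Approximate sqrt(2) by solving x**2 - 2 = 0 on [0, 2].
root = bisect(lambda x: x * x - 2, 0.0, 2.0)
print(root)  # approximately 1.41421356
```

Halving the interval each step guarantees the answer lies within the chosen tolerance, which is exactly the trade the text describes: give up exactness, but control the error.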
Interview: Paul Scherrer Institute/Christina Bonanati