AbacusSummit Simulations to Help Extract Secrets of the Universe from Upcoming Surveys


A recently released set of cosmological simulations is arguably the largest ever produced, clocking in at around 60 trillion particles.

The AbacusSummit suite includes hundreds of simulations of how gravity shaped the distribution of dark matter in the universe. Here, a snapshot of one of the simulations is shown at different zoom scales: 10 billion light-years in diameter, 1.2 billion light-years in diameter, and 100 million light-years in diameter. The simulation reproduces the large-scale structures of our universe, such as the cosmic web and colossal clusters of galaxies. Image Credit: The AbacusSummit Team; layout and design by Lucy Reading-Ikkanda / Simons Foundation.

The simulation suite, called AbacusSummit, will be instrumental in mining secrets of the universe from upcoming surveys of the cosmos, according to its creators. The researchers present AbacusSummit in several papers published in the October 25 issue of Monthly Notices of the Royal Astronomical Society.

AbacusSummit was created by scientists at the Flatiron Institute’s Center for Computational Astrophysics (CCA) in New York City and the Center for Astrophysics | Harvard & Smithsonian. Composed of more than 160 simulations, it models how gravitational attraction moves particles through a box-shaped universe. Such models, known as N-body simulations, capture the behavior of dark matter, which makes up the majority of matter in the universe and interacts only through gravity.

This suite is so large that it probably contains more particles than all of the other N-body simulations that have ever been run combined, although this is a difficult claim to confirm.

Lehman Garrison, lead author of one of the new papers and a researcher at the Flatiron Institute’s Center for Computational Astrophysics

Garrison led the creation of the AbacusSummit simulations with graduate student Nina Maksimova and professor of astronomy Daniel Eisenstein, both of the Center for Astrophysics. The simulations were performed on the US Department of Energy’s Summit supercomputer at the Oak Ridge Leadership Computing Facility in Tennessee.

AbacusSummit will soon come in handy: several surveys will produce maps of the cosmos with unprecedented detail in the near future. These include the Nancy Grace Roman Space Telescope, the Dark Energy Spectroscopic Instrument, and the Euclid spacecraft. One goal of these big-budget missions is to improve estimates of the cosmological and astrophysical parameters that determine how the universe behaves and how it looks.

The researchers will arrive at these improved estimates by comparing the new observations with computer simulations of the universe run with different values for various parameters, such as the nature of the dark energy driving the universe apart. Along with the improvements offered by next-generation surveys comes the need for improved simulations, Garrison says.

Galaxy surveys provide extremely detailed maps of the universe, and we need similarly ambitious simulations that cover a wide range of possible universes in which we might live. AbacusSummit is the first suite of such simulations that has the scale and fidelity to compare to these amazing observations.

Lehman Garrison, lead author of one of the new papers and a researcher at the Flatiron Institute’s Center for Computational Astrophysics

The project was daunting. N-body calculations, which attempt to determine the motions of objects, like planets, interacting gravitationally, have been among the foremost challenges in physics since the days of Isaac Newton. They are tricky because each object interacts with every other object, regardless of distance. This means that as more objects are added, the number of interactions grows rapidly.
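That growth can be made concrete: a direct force calculation needs one evaluation per pair of objects, so the work scales quadratically with the number of particles. A minimal illustration (the function name is ours, not from the papers):

```python
def pair_count(n):
    """Number of distinct particle pairs, each requiring one force evaluation."""
    return n * (n - 1) // 2

# Doubling the particle count roughly quadruples the work:
print(pair_count(1_000))   # 499500
print(pair_count(2_000))   # 1999000
```

At AbacusSummit's scale of tens of trillions of particles, this quadratic cost is exactly why direct summation alone is infeasible and approximation schemes are required.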

There is no general solution to the N-body problem for three or more massive bodies; the calculations used are instead approximations. A common approach is to freeze time, calculate the total force acting on each object, and then nudge each one according to the net force it experiences. Time is then advanced slightly and the process repeats.
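The freeze-push-advance loop described above can be sketched in a few lines of NumPy. This is a generic illustration of that time-stepping scheme under simulation units of our own choosing, not Abacus's actual implementation; the function name and softening parameter are assumptions for the sketch:

```python
import numpy as np

G = 1.0  # gravitational constant in arbitrary simulation units (our assumption)

def step(pos, vel, mass, dt, softening=1e-3):
    """One 'freeze, sum forces, push, advance' update for N particles."""
    # Pairwise separations: r[i, j] points from particle i to particle j.
    r = pos[None, :, :] - pos[:, None, :]
    # Softened cube of the distance, to avoid singularities at close range.
    dist3 = (np.sum(r**2, axis=-1) + softening**2) ** 1.5
    np.fill_diagonal(dist3, np.inf)              # no self-interaction
    # Net acceleration on each particle from all the others.
    acc = G * np.sum(mass[None, :, None] * r / dist3[:, :, None], axis=1)
    vel = vel + acc * dt                         # push by the net force
    pos = pos + vel * dt                         # then advance time slightly
    return pos, vel
```

Each call to `step` is one frozen instant; a simulation is simply many such calls in a row, with the cost of the pairwise sum motivating the approximations discussed below.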

Using this method, AbacusSummit handled its massive particle counts through clever code, new numerical methods, and plenty of computing power. The Summit supercomputer was the fastest in the world when the team ran the calculations.

The researchers designed their codebase, called Abacus, to take full advantage of Summit’s parallel processing power, whereby many calculations can run simultaneously. Summit has many graphics processing units (GPUs), which excel at parallel processing.

Performing N-body calculations with parallel processing requires careful algorithm design, because an entire simulation takes a huge amount of memory to store. This means Abacus cannot simply hand copies of the simulation to different supercomputer nodes to work on. Instead, the code divides each simulation into a grid.

A preliminary calculation provides a reasonable approximation of the influence of distant particles at any point in the simulation. (Distant particles play a much smaller role than nearby ones.) Abacus then groups nearby cells and splits them off so that the computer can work on each group independently, combining the approximation of distant particles with precise calculations of nearby particles.
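One way to picture the near/far split is the following much-simplified sketch: each particle is binned into a grid cell, neighbouring cells are summed exactly, and each far cell is lumped into a single centre of mass. This is a crude monopole stand-in for Abacus's actual far-field machinery, with all names and parameters our own, and for clarity it ignores periodic images when computing force vectors:

```python
import numpy as np

def assign_cells(pos, box, ngrid):
    """Bin each particle into a cell of a uniform grid over a periodic box."""
    return np.floor(pos / box * ngrid).astype(int) % ngrid

def forces_split(pos, mass, box, ngrid, G=1.0, soft=1e-2):
    """Acceleration on each particle: exact pairwise sums over 'near' cells
    plus a centre-of-mass approximation for each 'far' cell."""
    cells = assign_cells(pos, box, ngrid)
    acc = np.zeros_like(pos)
    for i in range(len(pos)):
        # A cell is 'near' if it touches particle i's cell (periodic offset <= 1).
        d = (cells - cells[i] + ngrid // 2) % ngrid - ngrid // 2
        near = np.all(np.abs(d) <= 1, axis=1)
        # Exact forces from individual near particles.
        for j in np.nonzero(near)[0]:
            if j != i:
                r = pos[j] - pos[i]
                acc[i] += G * mass[j] * r / (r @ r + soft**2) ** 1.5
        # Far cells: replace each cell's particles by their centre of mass.
        for c in {tuple(k) for k in cells[~near]}:
            sel = ~near & np.all(cells == c, axis=1)
            m = mass[sel].sum()
            com = (mass[sel, None] * pos[sel]).sum(axis=0) / m
            r = com - pos[i]
            acc[i] += G * m * r / (r @ r + soft**2) ** 1.5
    return acc
```

The payoff is that each group of near cells, plus its precomputed far-field contribution, can be processed independently, which is what lets the work spread cleanly across many nodes and GPUs.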

For huge simulations, the team found that the Abacus method provides a substantial improvement over other N-body code bases, which divide simulations irregularly based on particle distribution.

The even divisions used by AbacusSummit make good use of parallel processing, the scientists explain. Additionally, the consistency of the Abacus grid method allows much of the distant particle approximation to be calculated before the simulation even begins.

By design, Abacus can update 70 million particles per second per node of the Summit supercomputer. (Each particle represents a clump of dark matter with 3 billion times the mass of the sun.) The code can even analyze a simulation as it runs, looking for patches of dark matter that indicate the bright star-forming galaxies targeted by upcoming surveys.

Our vision was to create this code to provide the simulations needed for this particular new brand of galaxy survey. We wrote the code to run the simulations much faster and much more accurately than ever before.

Lehman Garrison, lead author of one of the new papers and a researcher at the Flatiron Institute’s Center for Computational Astrophysics

Eisenstein, who is a member of the Dark Energy Spectroscopic Instrument collaboration (which recently began its survey to map an unprecedented fraction of the universe), says he is eager to use Abacus in the future.

Cosmology is taking a leap forward thanks to the multidisciplinary fusion of spectacular observations and advanced computing. The coming decade promises to be a wonderful time in our study of the historical sweep of the universe.

Daniel Eisenstein, Professor of Astronomy, Center for Astrophysics, Harvard & Smithsonian

The other co-creators of Abacus and AbacusSummit are Philip Pinto of the University of Arizona, Sihan Yuan of Stanford University, Sownak Bose of Durham University in England, and Center for Astrophysics researchers Thomas Satterthwaite, Boryana Hadzhiyska and Douglas Ferrer. The simulations were performed on the Summit supercomputer under an Advanced Scientific Computing Research Leadership Computing Challenge allocation.

Journal reference:

Maksimova, N. A., et al. (2021) AbacusSummit: a massive set of high-accuracy, high-resolution N-body simulations. Monthly Notices of the Royal Astronomical Society. doi.org/10.1093/mnras/stab2484.

Source: https://www.simonsfoundation.org
