Last month, a team of researchers put the then-fastest supercomputer in the world to work on a rather grandiose problem: the nature of the universe's atomic and dark matter.

The supercomputer is called Frontier; a team of researchers recently used it to run the largest astrophysical simulation of the universe yet. The simulation's size corresponds to surveys taken by large telescope observatories, which until now had not been possible. The calculations underpinning the simulation offer a new foundation for cosmological simulations of the universe's matter content, from everything we see to the invisible stuff that only interacts with ordinary matter gravitationally.

What exactly did the Frontier supercomputer calculate?

Frontier is an exascale-class supercomputer, capable of running a quintillion (one billion-billion) calculations per second. In other words, a souped-up machine worthy of the immense undertaking that is simulating the physics and evolution of both the known and unknown universe.
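To put that "quintillion" in perspective, the arithmetic can be sketched in a few lines; the laptop comparison below is an illustrative assumption, not a figure from the article:

```python
# A quintillion is a billion billions: 10**9 * 10**9 = 10**18.
billion = 10**9
quintillion = billion * billion  # 10**18 calculations per second (exascale)

# If a fast consumer chip manages roughly one billion calculations per
# second, reproducing one second of Frontier's work would take it about
# a billion seconds, i.e. on the order of three decades.
seconds_per_year = 365.25 * 24 * 3600
years = quintillion / billion / seconds_per_year
print(f"Exascale: {quintillion:.0e} ops/s; ~{years:.1f} laptop-years per second")
```

The exact laptop figure is hypothetical; the point is the eighteen orders of magnitude.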

“If we want to know what the universe is up to, we need to simulate both of these things: gravity as well as all the other physics including hot gas, and the formation of stars, black holes and galaxies,” said Salman Habib, the division director for computational science at Argonne National Laboratory, in an Oak Ridge National Laboratory release. “The astrophysical ‘kitchen sink’ so to speak.”

The matter we know about, the stuff we can see, from black holes, to molecular clouds, to planets and moons, only accounts for about 5% of the universe's content, according to CERN. A more sizable chunk of the universe is only inferred from the gravitational effects it seems to have on the visible (or atomic) matter. That invisible mass is called dark matter, a catch-all term for a number of particles and objects that could be responsible for about 27% of the universe. The remaining 68% of the universe's makeup is attributed to dark energy, which is responsible for the accelerating pace of the universe's expansion.

A sample of simulations showing a model of the expanding universe (left) and a zoomed-in view of tracer particles (right). Image: Argonne National Laboratory, U.S. Dept. of Energy

How does Frontier change our understanding of the universe?

“If we were to simulate a big chunk of the universe surveyed by one of the big telescopes such as the Rubin Observatory in Chile, you're talking about looking at huge chunks of time — billions of years of expansion,” Habib said. “Until recently, we couldn't even imagine doing such a large simulation like that except in the gravity-only approximation.”

In the top graphic, the left image shows the evolution of the expanding universe over billions of years in a region containing a cluster of galaxies, and the right image shows the formation and movement of galaxies over time in one section of that image.

“It's not only the sheer size of the physical domain, which is necessary to make direct comparison to modern survey observations enabled by exascale computing,” said Bronson Messer, the director of science for the Oak Ridge Leadership Computing Facility, in a laboratory release. “It's also the added physical realism of including the baryons and all the other dynamic physics that makes this simulation a true tour de force for Frontier.”


Frontier is no longer the fastest supercomputer in the world

Frontier is one of several exascale supercomputers used by the Department of Energy, and comprises more than 9,400 CPUs and over 37,000 GPUs. It resides at Oak Ridge National Laboratory, though the recent simulations were run by Argonne researchers.

The Frontier results were possible thanks to the supercomputer's code, the Hardware/Hybrid Accelerated Cosmology Code (or HACC). The fifteen-year-old code was updated as part of the DOE's $1.8 billion, eight-year Exascale Computing Project, which concluded this year.

The simulations' results were announced last month, when Frontier was still the fastest supercomputer in the world. But shortly after, Frontier was overtaken by the El Capitan supercomputer as the world's fastest. El Capitan is clocked at 1.742 quintillion calculations per second, with a total peak performance of 2.79 quintillion calculations per second, according to a Lawrence Livermore National Laboratory release.
