Physics recognizes four fundamental forces: electromagnetism, the weak nuclear force, the strong nuclear force, and gravity. While the first three are described by quantum field theory (the Standard Model of particle physics), gravity is described by Einstein's theory of general relativity. Surprisingly, gravity is the one that presents the greatest challenges to physicists. Although the theory accurately describes how gravity works for planets, stars, galaxies, and galaxy clusters, it does not appear to apply perfectly at all scales.
While general relativity has been validated many times over the past century (starting with the Eddington eclipse expedition in 1919), gaps still appear when scientists attempt to apply it at the quantum scale and to the Universe as a whole. In a new study led by Simon Fraser University, an international team of researchers tested general relativity on the largest of scales and concluded that it might need a tweak or two. This method could help scientists resolve some of the biggest mysteries facing astrophysicists and cosmologists today.
The team included researchers from Simon Fraser University, the Institute of Cosmology and Gravitation at the University of Portsmouth, the Center for Particle Cosmology at the University of Pennsylvania, the Osservatorio Astronomico di Roma, the UAM-CSIC Institute of Theoretical Physics, the Lorentz Institute at Leiden University, and the Chinese Academy of Sciences (CAS). Their findings appeared in a paper titled "Imprints of Cosmological Tensions in Reconstructed Gravity," recently published in Nature Astronomy.

According to Einstein's field equations of general relativity, the Universe could not be static and had to be expanding (otherwise the force of gravity would cause it to contract). While Einstein initially resisted this idea and proposed a term that would keep the Universe in balance (his "cosmological constant"), Edwin Hubble's observations in the 1920s showed that the Universe is indeed expanding. Quantum theory, meanwhile, predicts that the vacuum of space is filled with energy that goes unnoticed because conventional methods can only measure changes in energy (rather than its total amount).
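For reference, Einstein's field equations, including the cosmological constant term he introduced, can be written as:

G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^4} T_{\mu\nu}

where the left-hand side describes the curvature of spacetime, T_{\mu\nu} describes its matter-energy content, and a positive \Lambda acts as a repulsive contribution that can balance, or even outpace, gravitational attraction.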
In the 1990s, new observatories such as the Hubble Space Telescope (HST) pushed the boundaries of astronomy and cosmology. Thanks to surveys like the Hubble Deep Fields (HDF), astronomers were able to see objects more than 13 billion light-years away as they appeared less than a billion years after the Big Bang. To their surprise, they discovered that for the past four billion years the rate of expansion has been accelerating. This led to what is known as the "old cosmological constant problem": either gravity is weaker on cosmological scales, or some mysterious force is driving cosmic expansion.
Lead author Levon Pogosian (Professor of Physics at Simon Fraser University) and co-author Kazuya Koyama (Professor of Cosmology at the University of Portsmouth) summarized the matter in a recent article in The Conversation. As they explained, the cosmological constant problem boils down to a single question with drastic implications:
"[W]hether the vacuum energy actually gravitates – exerting a gravitational force and altering the expansion of the universe. If so, then why is its gravity so much weaker than expected? If the vacuum does not gravitate at all, what is causing the cosmic acceleration? We don't know what dark energy is, but we have to assume it exists to explain the expansion of the universe. Similarly, we must also assume that a type of invisible matter is present, called dark matter, to explain how galaxies and clusters evolved into what we observe them to be today."

The existence of dark energy is part of the standard cosmological theory known as the Lambda Cold Dark Matter (LCDM) model, where Lambda represents the cosmological constant/dark energy. According to this model, the mass-energy density of the Universe consists of roughly 70% dark energy, 25% dark matter, and 5% normal (visible or "luminous") matter. Although this model has successfully matched the observations collected by cosmologists over the past 20 years, it assumes that most of the Universe is made up of components we cannot detect directly.
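In the standard notation (round numbers commonly quoted in the literature, not figures specific to this study), those fractions correspond to dimensionless density parameters that sum to unity in a spatially flat universe:

\Omega_\Lambda \approx 0.7, \qquad \Omega_{\rm DM} \approx 0.25, \qquad \Omega_b \approx 0.05, \qquad \Omega_\Lambda + \Omega_{\rm DM} + \Omega_b \approx 1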
This is why some physicists have ventured that GR might need some modification to explain the Universe as a whole. In addition, a few years ago astronomers noted that measuring the rate of cosmic expansion in different ways produces different values. This problem, as Pogosian and Koyama explain, is known as the Hubble tension:
"The disagreement, or tension, is between two values of the Hubble constant. One is the number predicted by the LCDM cosmological model, which has been developed to match the light left over from the Big Bang (the cosmic microwave background radiation). The other is the expansion rate measured by observing exploding stars known as supernovae in distant galaxies."
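For context (widely cited values from the broader literature, not results of this study), the two determinations disagree at roughly the five-sigma level:

H_0^{\rm CMB} \approx 67.4 \pm 0.5 \ {\rm km\,s^{-1}\,Mpc^{-1}} \qquad {\rm vs.} \qquad H_0^{\rm SNe} \approx 73.0 \pm 1.0 \ {\rm km\,s^{-1}\,Mpc^{-1}}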
Many theoretical ideas have been proposed for modifying the LCDM model to explain the Hubble tension. Among them are alternative theories of gravity, such as Modified Newtonian Dynamics (MOND), a modified version of Newton's Law of Universal Gravitation that does away with the need for dark matter. For more than a century, astronomers have tested GR by observing how the curvature of spacetime is altered in the presence of gravitational fields. These tests have become particularly extreme in recent decades, and include observing how supermassive black holes (SMBHs) affect orbiting stars or how gravitational lenses amplify and alter the passage of light.

For the purposes of their study, Pogosian and his colleagues used a statistical technique known as Bayesian inference, which updates the probability of a hypothesis as more data becomes available. From there, the team simulated cosmic expansion based on three inputs: CMB data from the ESA's Planck satellite, supernova and galaxy catalogs such as the Sloan Digital Sky Survey (SDSS) and the Dark Energy Survey (DES), and the predictions of the LCDM model.
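To illustrate the general idea (a minimal sketch only, not the team's actual analysis pipeline; the grid and measurement values below are hypothetical placeholders), here is how a Bayesian update narrows in on a parameter such as the Hubble constant as new data are folded in:

```python
# Minimal sketch of Bayesian inference: the posterior probability of a
# parameter (here a hypothetical H0) is updated as each new measurement
# is introduced. Values below are illustrative placeholders only.
import numpy as np

h0_grid = np.linspace(60.0, 80.0, 401)   # candidate H0 values (km/s/Mpc)
posterior = np.ones_like(h0_grid)        # start from a flat prior
posterior /= posterior.sum()

# Hypothetical measurements as (value, 1-sigma uncertainty) pairs
measurements = [(67.4, 0.5), (73.0, 1.0), (69.8, 1.9)]

for value, sigma in measurements:
    # Gaussian likelihood of the measurement given each candidate H0
    likelihood = np.exp(-0.5 * ((h0_grid - value) / sigma) ** 2)
    posterior *= likelihood              # Bayes' rule: prior times likelihood
    posterior /= posterior.sum()         # renormalize to a probability

best = h0_grid[np.argmax(posterior)]
print(f"Posterior peaks at H0 ~ {best:.1f} km/s/Mpc")
```

Each new dataset reshapes the posterior, so tensions between datasets show up as a posterior that no single parameter value fits comfortably, which is the spirit of how such analyses flag possible cracks in a model.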
"With a team of cosmologists, we put the basic laws of general relativity to the test," Pogosian and Koyama said. "We also explored whether modifying Einstein's theory could help solve some of the open problems of cosmology, such as the Hubble tension. To determine whether GR is correct on the largest of scales, we set out, for the first time, to study three of its aspects simultaneously: the expansion of the Universe, the effects of gravity on light, and the effects of gravity on matter."
Their results showed some inconsistencies with Einstein's predictions, albeit with rather low statistical significance. They also found that the Hubble tension is difficult to resolve by merely modifying the theory of gravity, suggesting that either a new force might be needed or that there are errors in the data. If the former is true, said Pogosian and Koyama, then it is possible that this force was present in the early Universe (about 370,000 years after the Big Bang), when protons and electrons first combined to create hydrogen.
Several possibilities have been advanced in recent years, ranging from a special form of dark matter to an early type of dark energy or primordial magnetic fields. In any case, this latest study indicates that there is future research to be done that could lead to a revision of the most widely accepted cosmological model. As Pogosian and Koyama put it:
"[O]ur study demonstrated that it is possible to test the validity of general relativity over cosmological distances using observational data. While we haven't yet solved the Hubble problem, we will have a lot more data from new probes in a few years. This means that we will be able to use these statistical methods to continue refining general relativity, exploring the limits of modifications, and paving the way to solving some of the open challenges in cosmology."
Further Reading: The Conversation, Nature Astronomy