Is Reality a Simulation? New Claim Suggests Yes
“To me, ascending from the ant’s-eye view to the God’s-eye view of physical reality is the most profound challenge for fundamental physics in the next 100 years,” physicist Frank Wilczek has said. This provocation captures a tension that has quietly intensified across modern physics: our most successful theories explain how the universe behaves yet increasingly struggle to explain why it behaves as if it were optimized, compressed, and computationally efficient.
It is within this gap that physicist Melvin Vopson has advanced a controversial but mathematically grounded claim that information itself behaves like a physical quantity, governed by laws that resemble data management rather than randomness. Drawing on recent work in information dynamics, mass–energy equivalence of data, and entropy behavior in complex systems, Vopson argues that the universe consistently favors states that reduce informational load.
If his interpretation is correct, the implications are difficult to ignore: reality may not just look computable, it may be structured that way at its most fundamental level. And that raises a question physics has rarely had to confront so directly: is reality itself a simulation, or something uncannily close to one?
The Glitch in the Thermodynamic Matrix

In 2023, physicists Melvin M. Vopson and Serban Lepadatu published “Second law of information dynamics” in AIP Advances, showing that while the traditional second law of thermodynamics (entropy rises) holds for physical states, information entropy as defined in Shannon’s information theory can remain constant or even decrease over time in information‑bearing systems such as digital storage or RNA genomes.
Claude Shannon’s 1948 formulation of information entropy quantifies uncertainty in a set of outcomes, effectively measuring how much surprise a message contains, and this is where the new law changes how we might view physical systems. Data from SARS‑CoV‑2 RNA, for example, showed a decline in calculated information entropy over its mutation history, a finding the authors interpret as non‑random patterning rather than pure disorder.
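Shannon’s measure is simple enough to compute directly. As a minimal illustration (not Vopson’s actual analysis pipeline, just the textbook formula applied to a toy nucleotide string):

```python
from collections import Counter
from math import log2

def shannon_entropy(sequence: str) -> float:
    """Shannon entropy H = -sum(p_i * log2(p_i)), in bits per symbol."""
    counts = Counter(sequence)
    total = len(sequence)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A uniform mix of the 4 nucleotides hits the maximum of 2 bits/symbol;
# any statistical bias (patterning) pulls the entropy below that ceiling.
print(shannon_entropy("ACGT" * 10))   # 2.0 bits/symbol (maximum)
print(shannon_entropy("AAAACGTA"))    # lower: biased toward A
```

A declining value of this quantity across a genome’s mutation history is the kind of trend the paper reports; the controversy is over what such a decline means, not over how the number is computed.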
According to Vopson, systems minimizing information entropy resemble data compression algorithms, a hallmark of computational efficiency rather than chaotic randomness.
That resemblance doesn’t prove a simulation, but it suggests an underlying informational regularity that classical thermodynamics alone hasn’t predicted.
The Weight of a Digital Soul
In a 2023 interview with Reuters, Vopson described information not just as an abstract structure but as a physical quantity that may carry mass, an idea he calls the mass‑energy‑information equivalence principle. He even provided a rough calculation: to account for dark matter via informational mass, you’d need about 10⁹³ bits of information at 2.73 K, enough to match the missing ~95 % of the universe’s mass‑energy budget.
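The arithmetic behind that figure can be sanity-checked. Under Vopson’s equivalence principle, a bit stored at temperature T carries a minimum mass of k_B·T·ln 2 / c². The sketch below is an order-of-magnitude check using the constants and bit count quoted above, not a derivation from his paper:

```python
from math import log

K_B = 1.380649e-23   # Boltzmann constant, J/K
C = 2.998e8          # speed of light, m/s
T = 2.73             # CMB temperature quoted in the article, K
N_BITS = 1e93        # bit count quoted in the article

# Vopson's mass-energy-information equivalence: minimum mass per stored bit.
m_bit = K_B * T * log(2) / C**2

print(f"mass per bit: {m_bit:.2e} kg")                      # ~2.9e-40 kg
print(f"total informational mass: {m_bit * N_BITS:.2e} kg")  # ~1e53 kg
```

The total lands in the 10⁵³ kg range, the same order of magnitude as estimates of the observable universe’s mass, which is why the 10⁹³-bit figure is quoted.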
That’s an extraordinary claim, one that merges the physics of matter with the physics of information, and it parallels older ideas like Seth Lloyd’s Programming the Universe, which argues the cosmos may behave like a quantum computer. If bits have mass, then matter itself might be a dense repository of information, blurring classical lines between information and material substance.
This remains a hypothesis, unconfirmed experimentally, but the boldness of the claim has attracted attention because it ties cosmology, particle physics, and information theory into a single framework. Such thinking shifts the question from “Is information important?” to “Might information be the substrate of reality?”
DNA: The Universe’s Most Efficient Zip File

Vopson’s research extends into biology too, notably his analysis of SARS‑CoV‑2 genetic sequences, where he argues that observed mutation patterns show information entropy decreasing over time. This runs against the classical Darwinian view, long accepted in evolutionary biology, in which mutations are random and selection is the filter.
Instead, according to Vopson, genetic mutations may preferentially steer toward information states of lower entropy, a claim presented in AIP Advances and discussed in multiple science news outlets. In this view, the genetic code might resemble not just random variation but optimized coding, akin to how a programmer uses loops and redundancy elimination to streamline software.
If true, this would imply that biological complexity arises not just from chance plus selection but also from a deeper information‑centric organizing principle. While mainstream biology hasn’t adopted this interpretation, it does add a layer of pattern to the discussion that goes beyond mere analogy. At minimum, it highlights that information patterns are ubiquitous and potentially fundamental across domains.
The Pixelated Horizon of Reality
Computational metaphors have a long history in simulation theory: Konrad Zuse’s 1969 book Calculating Space proposed that the universe might operate like a discrete cellular automaton, an early philosophical model of a computational cosmos. Modern physics also identifies the Planck scale as the smallest meaningful unit of space‑time, hinting that continuous geometry may break down into discrete “pixels.”
While not every physicist agrees that discrete units imply computation, this idea resonates with information‑theoretic interpretations of quantum gravity. For proponents of the simulation idea, such discreteness mirrors how graphics engines only render what’s needed, a concept explored in Rizwan Virk’s The Simulated Multiverse, where he notes that quantum indeterminacy could be a form of rendering optimization.
In that work, Virk writes that a computation should “only render that which is being observed so that not every particle in the whole universe has to be rendered at one time.” This doesn’t mean physics is a computer screen, but it does highlight a framework where discreteness plus observer dependence equals optimization, a notion that dovetails with information entropy trends.
Why the Universe Loves a Shortcut

One of Vopson’s most provocative claims is that symmetry, the recurring regularity in structures from snowflakes to atomic orbitals, corresponds to states of lowest information entropy.
In his AIP Advances paper “The second law of infodynamics and its implications for the simulated universe hypothesis”, he argued that nature’s preference for symmetry mirrors the way software designers use loops and pattern reuse to minimize code.
“In physics,” Vopson explained, “we are observing excess information removed, resembling the process of a computer deleting or compressing waste code to save storage space and optimize power consumption.” This explicit comparison of natural efficiency to computational compression is what supporters highlight as a scientific indicator rather than mere metaphor.
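The compression analogy is easy to demonstrate concretely: patterned, repetitive data compresses far better than random data. A small illustration using Python’s standard zlib compressor (chosen here purely for illustration, not anything from Vopson’s work):

```python
import random
import zlib

random.seed(0)
repetitive = b"ABCD" * 2500                                   # 10 kB of pure pattern
noisy = bytes(random.getrandbits(8) for _ in range(10_000))   # 10 kB of noise

# The patterned data collapses to a tiny fraction of its size;
# the random data barely shrinks at all.
print(len(zlib.compress(repetitive)))   # far below 10,000 bytes
print(len(zlib.compress(noisy)))        # close to 10,000 bytes
```

In information-theoretic terms, the repetitive input has low entropy per symbol and the noisy input is near the maximum, which is exactly why compressors treat them so differently.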
However, critics note that symmetry can also arise from fundamental constraints such as conservation laws and group theory, which don’t require a simulator to explain. Even so, the observation is intriguing: a universe where patterns optimize themselves rather than fragment randomly is unexpectedly tidy.
The Minimum Energy Mandate
Vopson hasn’t limited his thinking to entropy alone. In a 2025 IFLScience discussion of his more recent paper, he proposed that gravity itself might be an entropic force, a phenomenon arising from the universe’s tendency to reduce information entropy, not a fundamental interaction. He suggests that when masses aggregate, the system’s informational description becomes simpler, thereby conserving computational load.
“Put simply, it is far more computationally effective to track and compute the location and momentum of a single object in space,” Vopson said, “than numerous objects.” This aligns with other proposals in theoretical physics that see gravity as emergent (e.g., Erik Verlinde’s entropic gravity models), though Vopson couches it in information‑centric terms. If gravity is a mechanism for minimizing informational cost, then even large‑scale structure (galaxies, clusters) may reflect deep optimization rather than just energetic dynamics.
Erasing the Evidence
One of the boldest suggestions from infodynamics research is that information erasure might have detectable physical consequences. Drawing on ideas related to Landauer’s principle (which connects information erasure with heat), Vopson has proposed experiments involving particle‑antiparticle collisions to detect energy releases tied to information loss. This hypothetical experiment, if realized, would attempt to measure a direct connection between information dynamics and energy, not just infer it from patterns.
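Landauer’s principle itself gives a concrete number: erasing one bit must dissipate at least k_B·T·ln 2 of heat. A quick sketch of that bound (300 K assumed here for illustration; it is not a figure from Vopson’s proposal):

```python
from math import log

K_B = 1.380649e-23   # Boltzmann constant, J/K
T_ROOM = 300.0       # assumed room temperature, K

# Landauer bound: minimum heat dissipated when one bit is erased.
e_bit = K_B * T_ROOM * log(2)
print(f"{e_bit:.2e} J per erased bit at room temperature")  # ~2.9e-21 J
```

The tiny size of that number, around 10⁻²¹ joules per bit, is precisely why any proposed detection of information-erasure energy is experimentally ambitious.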
As Vopson put it in a press release, “This approach resembles the process of a computer deleting or compressing waste code to save storage space and optimize power consumption.” Finding such energy signatures would be remarkable, but so far, it remains a proposal, not a result.
Skeptics caution that no experiment to date has detected a simulation signature, and that mapping computation metaphors onto physical phenomena isn’t the same as empirical evidence.
The Cosmic Compression Algorithm

Information optimization isn’t just a biological or atomic curiosity; it’s proposed as a cosmological principle. Vopson’s 2023 AIP Advances article theorizes that the universe’s expansion, particle behavior, and structural regularities could be understood as processes that compress excess information, much like a data algorithm trimming redundancy.
If true, this would mean the laws governing everything from electrons to galaxies might be interpretable as patterns of informational refinement rather than arbitrary rules. Other researchers, such as Seth Lloyd in Programming the Universe, have similarly argued that viewing the cosmos as a quantum computer yields new insights into entropy, computation, and complexity.
Symmetry: The Programmer’s Signature
Symmetry remains a central theme in both physics and simulation speculation. In Vopson’s work, high symmetry states are argued to correspond to minimal information entropy, which could explain why symmetry is so prevalent in nature. This insight resonates with the broader scientific literature, which shows that symmetrical configurations often represent energy minima or stable solutions in physical systems.
In “The second law of infodynamics and its implications for the simulated universe hypothesis,” the analogy between symmetry and code efficiency is explicit: removing unnecessary information, like redundant code, saves storage space and computational effort. But experts like physicist Frank Wilczek criticize simulation accounts for assuming computation without evidence, describing such reasoning as begging the question.
Still, symmetry’s persistence across scales, from crystallography to cosmic microwave background patterns, keeps it at the heart of debates about informational underpinnings. Whether it signals programmatic structure or deep physical constraint, symmetry undeniably shapes the fabric of reality.
Beyond the Hardware

Looking back over these threads (information entropy, biological patterning, gravity as optimization, and symmetry as minimal structure), we are left with a provocative picture: reality exhibits efficiencies that mirror computational processes. Simulation proponents like Richard J. Terrile, a NASA astronomer, have argued that the idea of a computed universe is worth taking seriously because of its explanatory reach.
But mainstream science remains cautious: philosophical critiques warn that the simulation hypothesis begs the question by presuming a simulator without evidence of the simulator’s nature or existence. Moreover, recent astrophysical work suggests that the energy required to simulate an entire visible universe far exceeds plausible limits, even for advanced civilizations.
Still, the fact that information theory, from Shannon entropy to infodynamics, plays such a central role across disciplines suggests we haven’t yet exhausted its implications. Whether this leads to a simulation interpretation, a new physics framework, or a deeper understanding of how information and reality intertwine, the question remains open. And that, perhaps, is the most intriguing outcome of all.
Key Takeaway:
Recent work in information physics, led by Melvin Vopson, suggests the universe may be governed by laws that treat information as a physical quantity, not an abstraction. When examined through entropy, symmetry, and energy efficiency, reality appears to behave less like a chaotic system and more like one optimized for data management. This does not prove we live in a simulation, but it strengthens the case that the universe operates as if it were computationally structured, pushing the simulation question from speculation toward testable science.
Disclosure line: This article was written with the assistance of AI and was subsequently reviewed, revised, and approved by our editorial team.
