I think their point was that there is no empirical definition of information as it relates to the observer. The experiment you cite operated on a physical system that already had a state prior to the experiment.
If everything is information, then nothing is.
A disordered system still has state. You just don't know what it is.
Fair point on the semantics. But I'm not talking about subjective 'knowledge.' I'm talking about the thermodynamic cost to maintain a state.
Landauer showed that information processing is physical; erasing a bit has an irreducible heat cost. I'm just extending that logic: if the universe has to process too much state data in one spot, the cost isn't just heat, it's lag (time dilation).
It doesn't matter if we observe the mess; the system still has to render it.
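For concreteness (a standard textbook figure, not something new in my paper): Landauer's bound for erasing one bit is E_{min} = k_B T \ln 2 \approx 3 \times 10^{-21} J at room temperature. My claim is that when the state bookkeeping in a region gets dense enough, the cost shows up not only as dissipated heat but as a processing delay.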
Valid point. I'm reaching out to academia too. I posted here because my theory treats spacetime as a computational substrate, and HN has the best mix of physicists and engineers to critique that specific angle.
Spot on. An event horizon indeed represents the theoretical limit of information density (the Bekenstein bound).
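For reference (standard result, not specific to my paper): the bound is S \leq 2\pi k_B R E / (\hbar c) for a system of energy E confined within radius R, and a black hole saturates it.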
Since I can't create one in the lab, I'm betting on GHZ states to generate a steep enough local information gradient to yield a measurable effect. It's a scale-down, but unlike a black hole, we can test it today.
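To make "GHZ state" concrete, here's a minimal preparation-circuit sketch (Qiskit; the qubit count is hypothetical, not the register size argued for in the preprint):

    from qiskit import QuantumCircuit

    n = 8  # hypothetical qubit count, for illustration only
    qc = QuantumCircuit(n)
    qc.h(0)                  # put qubit 0 into an equal superposition
    for i in range(1, n):
        qc.cx(0, i)          # fan the entanglement out to every other qubit
    # result: (|00...0> + |11...1>)/sqrt(2), the n-qubit GHZ state

The point is that all n qubits share one globally correlated state, which is what I mean by a steep local information gradient.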
Thanks for the sharp question. You hit the core challenge.
I am targeting a fractional frequency sensitivity at the 10^{-18} level, which is within reach of modern Sr-87 optical lattice clocks (current stability \approx 10^{-19}).
While the effect of information density (\Delta S_{info}) is expected to be extremely subtle compared to that of mass, the differential measurement (entangled vs. non-entangled) lets me filter out common-mode noise. Even if I get a null result, establishing an upper bound on the coupling constant \alpha would be a significant contribution.
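Roughly, the differential analysis looks like this (illustrative Python with invented noise levels, not the preprint's actual error budget):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000                          # number of clock comparisons (illustrative)
    common = rng.normal(0, 1e-18, n)     # drift shared by both arms
    indep = rng.normal(0, 1e-19, (2, n)) # per-clock noise
    itd = 5e-20                          # hypothetical ITD shift on the entangled arm

    entangled = common + indep[0] + itd
    reference = common + indep[1]
    diff = entangled - reference         # common-mode drift cancels here
    print(f"shift ~ {diff.mean():.1e}, sigma ~ {diff.std() / np.sqrt(n):.1e}")

Averaging the differential trace is what buys back the sensitivity; a null result still yields an upper bound on \alpha.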
I'm putting my bet on the high complexity of the GHZ state.
English isn’t my first language, and I’m an independent researcher.
I supplied the core intuition and overall architecture, and used an LLM as a research assistant for formal derivations and calculations.
Think of it as a human architect using modern tools to draft blueprints.
I've been exploring a speculative idea: what if some of the "strange" phenomena in physics, like gravitational time dilation, aren't fundamental, but emergent effects of a universe with finite computational resources?
In this preprint, I model the universe as a Universal Computing System (UCS). The core hypothesis is what I call Information-Induced Time Dilation (ITD): regions with high information density may experience a local "processing lag," which we observe physically as time dilation.
Rather than replacing General Relativity, the idea is to extend it by adding an information entropy term to the stress-energy tensor. Importantly, the paper also outlines a concrete experimental test using Sr-87 optical lattice clocks that could, in principle, distinguish this effect from standard GR predictions.
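Schematically (my shorthand here, not a verbatim quote of the paper's equations): the field equations pick up an extra source term, something like G_{\mu\nu} = (8\pi G / c^4)(T_{\mu\nu} + \alpha S^{info}_{\mu\nu}), where S^{info}_{\mu\nu} encodes the local information-entropy density and \alpha is the coupling constant the clock experiment would bound.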
I'd really appreciate feedback from people in systems, distributed computing, and physics:
Does it make sense to think of spacetime as having computational bottlenecks, latency, or throughput limits?