Time is irrelevant. What matters are units of computation.
When things are predictable, they can be simulated fast: a ball spinning in the void can be simulated for 10^78 years in O(1).
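A toy sketch of what "O(1)" means here: when nothing acts on the ball, its state has a closed form, so evaluating it at t = 10^78 costs the same handful of arithmetic operations as evaluating it at t = 1. The specific initial conditions below are made up for illustration.

```python
import math

def ball_state(t, pos0=(0.0, 0.0), vel=(1.0, 2.0), omega=0.5):
    """Closed-form state of a ball drifting and spinning in a void.

    No forces act, so position is linear in t and the rotation angle
    is omega * t (mod 2*pi). No timestepping loop is needed: cost is
    constant regardless of how far in the future t lies.
    """
    x = pos0[0] + vel[0] * t
    y = pos0[1] + vel[1] * t
    angle = math.fmod(omega * t, 2 * math.pi)
    return x, y, angle

# The state absurdly far in the future, computed in constant time
# (floating-point precision degrades long before the math does):
print(ball_state(1e78))
```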
When things are fuzzy, they can also be simulated fast: a star made of a huge number of atoms is not so different from another star made of a huge number of atoms. When processes get complex enough, they tend to follow the law of large numbers, which makes the computations memoizable.
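A minimal illustration of that memoization argument: the average of many fair coin flips pins itself to 0.5, so past a certain scale there is no point simulating individual flips, you can cache the expectation and move on. The function and seed here are illustrative, not anyone's actual model.

```python
import random

def average_of_flips(n, seed=0):
    """Mean of n simulated fair coin flips, seeded for reproducibility."""
    rng = random.Random(seed)
    return sum(rng.random() < 0.5 for _ in range(n)) / n

# As n grows, the sample mean converges on the expectation 0.5:
# the aggregate is predictable even though each flip is not.
for n in (10, 1_000, 100_000):
    print(n, average_of_flips(n))
```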
What you want is a way to prevent the universe from taking shortcuts in its computations. Luckily, it's quite easy: you have to make the details important. That's where chaos theory comes to the rescue. Small perturbations can have big impacts. Bifurcations, like tossing a coin in the air, create pockets of complexity. But throw too many coins in the air and it's just random and boring. Life exists on the edge where enough structure is preserved to allow enough richness to exist.
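The "small perturbations, big impacts" point can be seen in the standard textbook example, the logistic map at r = 4, where nearby trajectories separate exponentially fast; the starting values below are arbitrary.

```python
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the chaotic logistic map x -> r * x * (1 - x)."""
    x = x0
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

# Two starting points that differ only in the 12th decimal place.
# Each step roughly doubles their separation, so after ~50 steps
# the trajectories bear no resemblance to each other: the detail
# mattered, and no shortcut around simulating every step exists.
a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-12)
print(a, b, abs(a - b))
```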
One way humans have found of increasing precision is the lathe, which led to building computers. Build a big enough, fast enough computer and you will run out of flops before reaching the 10^78 endgame.
But you have to be smart: because computation is universal, if you are just building a big computer, what matters is what runs on it. Then your universe can be reduced to the recursive endgame state of "universe becoming a computer running a universe simulation of a specific type", which either doesn't need to be computed more than once and already was, or isn't interesting enough to deserve being computed.
That's why we live on the exciting edge just before the Armageddon, the boring universes having already been simulated. The upside is that the universe hasn't yet decided which endgame we may reach, because the phytoplankton aliens of K2-18b have not yet turned on their supercomputer.