I’m going through the multi-year phase of thinking we live in a simulation (and I’m not sure what phase comes afterwards). From an engineering perspective, the sheer hugeness of this simulation, encompassing at least the visible universe, is quite majestic. Just in terms of compute, there is a hell of a lot being spent on everything from simulating all the atoms in the Pacific Ocean right now to simulating my mind as it types this sentence. So much compute that it seems (to this puny human mind) borderline impossible that it’s ‘just a simulation’.
They must be using tricks, like we use KV caching and speculative decoding in transformer models, to make the simulation more efficient. One way I’ve been thinking about these tricks is through the question of a cube of space at any given moment: does simulating 1m^3 of entirely empty space use the same amount of compute as simulating 1m^3 at the centre of a star? Intuitively, from an information-theory stance, the star has way more going on inside. Although, to get an accurate simulation of that star, perhaps you need that light-years-away cube of empty space to be simulated accurately too, butterfly-effect style.
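To make the cube question a bit more concrete, here’s a toy sketch of the kind of sparse-simulation shortcut I’m imagining (the “physics” and array sizes are entirely made up, purely for intuition): the update for a cube short-circuits when nothing is inside it, the same way sparse data structures skip empty regions, so the vacuum costs almost nothing while the stellar plasma pays the full price.

```python
import numpy as np

def step_cube(cube: np.ndarray) -> np.ndarray:
    """Advance one 'cube' of a toy universe by one tick.

    Sparse-simulation trick: an entirely empty cube is returned
    unchanged at near-zero cost instead of running the full update.
    """
    if not cube.any():  # empty space: constant-time shortcut
        return cube
    # Dense region (e.g. centre of a star): pay for a full update.
    # The "physics" here is just a placeholder diffusion-like smoothing.
    padded = np.pad(cube, 1, mode="wrap")
    neighbours = (
        padded[:-2, 1:-1, 1:-1] + padded[2:, 1:-1, 1:-1] +
        padded[1:-1, :-2, 1:-1] + padded[1:-1, 2:, 1:-1] +
        padded[1:-1, 1:-1, :-2] + padded[1:-1, 1:-1, 2:]
    )
    return 0.5 * cube + 0.5 * neighbours / 6.0

empty_cube = np.zeros((64, 64, 64))      # 1m^3 of vacuum
star_cube = np.random.rand(64, 64, 64)   # 1m^3 of stellar plasma

step_cube(empty_cube)  # effectively free
step_cube(star_cube)   # pays the full per-voxel cost
```

Of course, the butterfly-effect caveat is exactly what makes this shortcut suspect: the skip is only valid if nothing elsewhere in the simulation ever needs that empty cube resolved at full fidelity.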