Not talking about a training set or a present-day LLM. I mean a truly random binary generator left to its own devices. Let's evaluate what it spits out when iterated several trillion times with the massive compute capability we will have. I am not thinking of this happening in the next couple of years, but in the next couple of centuries.
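To make the idea concrete, here is a minimal sketch of that kind of blind search: random bytes are treated as programs for a toy stack machine, and we keep any program whose output happens to match a target function. Everything here (the opcodes, the machine, the names `run` and `random_search`) is an illustrative assumption, not a real system; real random binaries would be far less likely to do anything useful.

```python
import random

def run(program, x, max_steps=100):
    """Interpret random bytes as opcodes for a toy stack machine.

    Opcodes (byte % 4): 0 = PUSH 1, 1 = DUP, 2 = ADD, 3 = MUL.
    Ops whose operands are missing are skipped. Returns the top of
    the stack, or None if the stack is empty.
    """
    stack = [x]
    for op in program[:max_steps]:
        op %= 4
        if op == 0:                          # PUSH 1
            stack.append(1)
        elif op == 1 and stack:              # DUP
            stack.append(stack[-1])
        elif op == 2 and len(stack) >= 2:    # ADD
            stack.append(stack.pop() + stack.pop())
        elif op == 3 and len(stack) >= 2:    # MUL
            stack.append(stack.pop() * stack.pop())
    return stack[-1] if stack else None

def random_search(target, trials=200_000, length=8, seed=0):
    """Generate purely random programs; return the first one that
    reproduces `target` on a few test inputs, else None."""
    rng = random.Random(seed)
    tests = [1, 2, 5]
    for _ in range(trials):
        prog = bytes(rng.randrange(256) for _ in range(length))
        if all(run(prog, t) == target(t) for t in tests):
            return prog
    return None

# Handcrafted example: DUP then ADD doubles the input.
doubler = bytes([1, 2])
print(run(doubler, 3))  # → 6

# Blind search for a program that doubles its input.
hit = random_search(lambda n: n * 2)
print(hit)
```

Even in this tiny search space a hit is cheap to find; the point of the original argument is that, with trillions of iterations and vastly more compute, the same brute-force process scales to much harder targets, at enormous energy cost.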
We also burn carbon to feed the brain. Compute is what is increasing in capability by orders of magnitude within our own lifetimes; brainpower is not increasing at all. If you want future capabilities and technological advancement to occur at the fastest pace possible, eventually we have to leave the slow ape brain behind in favor of sources of compute that can evaluate functions several orders of magnitude faster.
Code derived from a training set is not at all "random."