Maybe we've been fooling each other since forever too.
However, whatever we're doing seems to be different from what LLMs do, at least because of the huge difference in how we train.
It's possible that it will end up like airplanes and birds. Airplanes can take us to the other side of the world in a day by burning a lot of fuel. Birds can get there too, far more cheaply, though it takes much longer. They can also land on a tree branch; airplanes can't, and it's too risky for drones.
Precursor as in it will help with synthetic data generation, testing, etc. at a scale that gives us more powerful models. It is a necessary intermediate step on our path to AGI.
What are you basing this claim on? There is no intelligence in an LLM, only humans fooled by randomness.