There's an argument to be made that this gap is actually highlighting design issues rather than AI limitations.
It's entirely possible for a 100k LOC system to be made up of effectively a couple hundred 500-line programs that are composed together to great effect.
That's incredibly rare, but I did once work for a company that had such a system, and it was a dream to work in. I have to think AIs are making a massive impact there.
> It's entirely possible for a 100k LOC system to be made up of effectively a couple hundred 500-line programs that are composed together to great effect.
I'm confused. Are you imagining a program with 100k LoC is contained in a single file? Because you'd be insane to do such a thing. It's normally a lot of files, each with not many LoC, which de facto meets this criterion.
You may also wish to look at the UNIX philosophy: the idea that programs should be small and focused, that a program should do one thing and do it well. But there's a generalization of this philosophy once you realize a function is a program.
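That generalization is easiest to see at the shell level, where tiny single-purpose programs compose through pipes. A minimal sketch, with made-up data purely for illustration:

```shell
# Each stage is a small program that does one thing:
# printf emits lines, grep -x filters for exact matches, wc -l counts them.
printf 'a\nb\na\nc\na\n' | grep -x 'a' | wc -l   # prints 3
```

The same composition works one level down when the units being chained together are functions rather than processes.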
I do agree there are a lot of issues with design these days, but I think you've vastly oversimplified the problem.
> It's entirely possible for a 100k LOC system to be made up of effectively a couple hundred 500-line programs that are composed together to great effect.
To me, this sounds like a nightmare—I'm sure anyone who's worked at a shop with way too many microservices would agree. It's trivial to right-click a function call and jump to its definition; much harder to trace through your service mesh and find out what, exactly, is running at `load-balancer.kube.internal:8080/api`.