There is not a limited amount of software engineering that can be done. There's only the amount of software engineering that it is _economical_ to do at any given point. If AI makes software development cheaper and more efficient, people will just apply it to more use cases. It's never been the case that making programming cheaper has led to fewer programmers -- quite the opposite.
This change is roughly analogous to the shift from punch cards to compilers. It's just a more "natural language" way to do programming. A lot of the drudgery associated with coding will go away and competent programmers will shift to higher level design work.
Even in a future where AI is better than human software engineers at every single programming task (which I don't believe will be the case any time soon), there is still comparative advantage. AI will not have the _capacity_ to do every single programming task, so there will still be plenty of work for people to do.
The problem is that AI has the potential to turn a professionalized industry into a self-help one. Like, if I can pay $30 a month to have an AI make me several new apps a day, why would I need an app store? It's like fast fashion for programs. Low quality but dirt cheap.
But if building an app is as easy as voicing a requirement, and getting an updated version is as simple as asking for a change, then why should there be an ecosystem at all, rather than just bespoke on-the-fly apps for every single individual?
As a tangent, I've been thinking about this with regard to the edtech space. For now, the focus is on GenAI tools for quickly building courses. But if we have AI with the combined skills of a Subject Matter Expert, Instructional Designer, and Tutor, why would we need to build courses at all? It could just teach anyone anything on the fly, based on their specific background and interests, adapting immediately to their reactions.
We've already been talking a lot about bubbles / echo chambers over the last decade, and I imagine that this will get a lot worse/better over the coming decade, depending on your perspective of the value of shared experiences.
I think a lot of our plans with AI are skeuomorphic. (Skeuomorphs are when an older, outdated thing is preserved in the updated thing, like how the Save icon is a floppy disk.)
"Generative AI will mean that Pixar movies will be made by AI!" Likelier it means that generative AI will create an entirely new medium beyond movies.