It will raise the bar for software development significantly.
GPT needs training data. Once a problem is "figured out" and there is enough information about it to reach a threshold where it's learnable by the AI, it will stop being something done manually.
There is reason to believe that most commodity software (e-commerce, content management, media asset management, any CRUD-with-makeup, infrastructure automation, etc) will be commoditized even further to the point that no software developer would be needed to create a product that leverages these approaches, or any reasonable combination of them.
Sure, there will be a need to guide the AI to the best solution. A need to harness it properly. Some software development skills will be needed for that, and my guess is that this will become a temporary occupation in the near future, whether explicitly recognized as such or not.
Eventually, there will be problems that large models can't solve. Not even partially. Stuff that has no training data, with no way to acquire inferred knowledge about it. We'll know what that is only when the commoditization of what we already know is almost complete. My guess is that it will be something trivial but rarely considered worth exploring, writing about, or pursuing, yet valuable. Some kind of impossible miracle software that won't be so impossible in the future (complete protocol/data portability, broadly compatible standards across thousands of different systems, etc).
Or we'll all have to learn how to flip burgers in a boring dystopia. That's a reasonable scenario as well.
> There is reason to believe that most commodity software (e-commerce, content management, media asset management, any CRUD-with-makeup, infrastructure automation, etc) will be commoditized even further to the point that no software developer would be needed to create a product that leverages these approaches, or any reasonable combination of them.
But then:
> Sure, there will be a need to guide the AI to the best solution. A need to harness it properly. Some software development skills will be needed for that, and my guess is that this will become a temporary occupation in the near future, whether explicitly recognized as such or not.
So, just like every past tool that was going to “replace software developers”, it will “replace” them with a technology which (1) requires people with software development skills to deploy effectively, and (2) greatly increases the output of the people with software development skills so employed, increasing the demand for software development skills.
And that’s even before considering the problems you speculate on that LLMs won’t even be partially applicable to.
Yeah, that is pretty much the idea. It will raise the bar.
This kind of stuff happens in cycles, though. Once there was black-and-white photography, and hand-colouring those black-and-white pictures with a brush was a popular job in high demand. Color film changed that, moving those jobs into a niche reserved for artists.
Some decades later, Photoshop was popularized and touching up photos became a thing again. Something we didn't even imagine was needed anymore, and that now looks trivial, was suddenly in high demand. Some skills transferred to the new thing, but we can't say people skilled in hand-colouring are good digital image artists. Not generally.
A good hand-colourer might have focused on "never making a mistake" so as not to ruin the picture. Now that skill is irrelevant: we have infinite undos.
Software development nowadays favors pragmatism and values bit-shaving abilities that are hard to hone. Complex solutions are discouraged because they impose a huge cognitive load on maintainers and make refactoring harder.
ChatGPT might change that to favor broad generalist megalomaniacs who can come up with solutions involving exotic combinations of algorithms and techniques that would require an army of previous-generation developers to maintain. It might favor the large refactorings and rewrites that are commonly frowned upon in the current culture, so no one even thinks about them.
I don't actually know, my point is that the skillset might drastically change.
The biggest accelerator will be (once again) GUIs. GPT will replace the keyboard; people will drag and drop stuff and connect pipes to draw and build anything.
We've had Visual Basic, UML, and no-code visual programming tools for decades.
GUI builders like Delphi and Visual Basic, especially, died with the advent of responsive screens.
They don't scale. No one wants to drag around logic elements visually.
Code is much more expressive and can be refactored and diffed easily.
No one wants to drag around *logic* elements visually. I agree.
What if you could just mock up everything and let the AI fill the blanks with an automatically generated backend? I'm not sure it's possible, but I would like that.
If this thing (useful generalist AIs) takes off, there will probably be no single silver bullet that represents its full potential.
But the hard work isn't writing backend CRUD code.
The hard part is actually deciding and describing what the backend should do. How should it work? What decisions should it make? What's the business logic it has to follow?
You'd want to describe that in an unambiguous way, so you'd have to invent a very strict syntax so that ChatGPT has no way of misunderstanding your intentions. You'd end up with a programming language eventually, just adding more complexity along the way, but you'd end up with code.
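To make that concrete, here's a tiny sketch with an invented rule (the rule, the names, and the numbers are all hypothetical, purely for illustration): by the time a business rule is stated precisely enough that nothing can misread it, it reads exactly like code.

```typescript
// A hypothetical business rule, stated "unambiguously": orders of
// $100 or more ship free, unless they contain oversized items,
// in which case a flat $25 fee applies. Everything else pays $8.99.
// Pinning the rule down this tightly is already programming.

interface Order {
  subtotalUsd: number;
  hasOversizedItems: boolean;
}

function shippingFeeUsd(order: Order): number {
  if (order.hasOversizedItems) return 25;
  return order.subtotalUsd >= 100 ? 0 : 8.99;
}
```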
Now if you feel that writing boilerplate backend code is tedious, repetitive and verbose and could be automated, I totally agree.
Any tool that makes that easier is welcome.
But then a language model is the wrong tool for the job; we need more tools like PostgREST, Retool, and Zapier so we don't end up writing too much unnecessary boilerplate code.
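To illustrate the boilerplate in question, here's a minimal sketch of the kind of handler those tools make unnecessary (Express-style TypeScript; the `db` helper and the `products` table are hypothetical stand-ins): its only job is to shuttle a row from SQL to JSON.

```typescript
import express from "express";
// Hypothetical query helper standing in for whatever driver you use.
import { db } from "./db";

const app = express();

// Pure plumbing: parse the id, run one SELECT, serialize the row.
// PostgREST-style tools derive this mapping from the schema itself,
// so nobody has to write or maintain it by hand.
app.get("/products/:id", async (req, res) => {
  const rows = await db.query(
    "SELECT id, name, price FROM products WHERE id = $1",
    [req.params.id]
  );
  if (rows.length === 0) {
    return res.status(404).json({ error: "not found" });
  }
  res.json(rows[0]);
});

app.listen(3000);
```

With PostgREST, the equivalent read is roughly `GET /products?id=eq.42`, served straight from the table definition, with no handler code at all.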