> Sure it can write a fibonacci function in Javascript, but so can I and I can write software I wasn't preprogrammed to do, and solve issues I have never encountered before
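(For context, the kind of Fibonacci function the parent comment refers to really is trivial — a minimal iterative sketch, since the commenter didn't specify an implementation:)

```javascript
// Iterative Fibonacci: returns the nth Fibonacci number (0-indexed).
function fibonacci(n) {
  let [a, b] = [0, 1];
  for (let i = 0; i < n; i++) {
    [a, b] = [b, a + b]; // advance the pair one step
  }
  return a;
}

console.log(fibonacci(10)); // 55
```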
Sure, but how much programming is truly original? Unless a company is working on some novel research topic, most programming is either a regurgitation of the same solutions ("we're the X of Y"), glue that binds several solutions together, or a friendlier interface to an existing solution. In all those scenarios it will be much cheaper and faster to get an AI to build it than to hire a human team. Or at the very least, instead of hiring a large team, a smaller team of one or two humans could serve as code reviewers for the AI.
So I think this advancement is an existential threat to a large sector of our industry.
And the shift will happen much earlier than some people in this thread think. If not this generation of GPT-3, then one or two generations after that. A couple of years? It certainly won't require AGI.
Maybe I'm an abnormal programmer, but writing code is not the bulk of my job. Solving problems is. Once the solution has actually been defined, the code is usually the easy part.
And usually the solution requires taking into consideration the entire context of the system as it exists so far, as well as making good judgement calls about what will be required in the future. Then, communicating the tradeoffs to a team, and helping them make a decision from among our options.
I don't want to crap on an impressive tool, but "imitate a programmer's code, 50-100 lines at a time" is not the same thing as "do the job of a programmer."
Indeed, writing code is the minor part of being a senior engineer. It's a "doing" thing, which AI is getting to the point of being decent at, mostly by copying. ChatGPT and GitHub Copilot are still unable to implement an algorithm no one has written before.
And solving problems? That needs "understanding" and, in many cases, "lateral thinking" — two capabilities that contemporary AI systems won't have until AGI is a thing, and that is still science fiction. But solving problems is still the main reason people hire me.
I've recently been working on a relatively straightforward "glue" library connecting A to B, except B is somewhat obscure and this requires complex maintenance of state. ChatGPT doesn't have a clue.
If you just want it to regurgitate Javascript boilerplate that's been written a million times before, yeah, I'm sure it can do that. Tons of software development isn't that.
The specific way in which you glue is the original part. In many cases not very hard and there are many common patterns, but for now an operator is required for that.
It'll revolutionize search / discovery for questions you don't know the answer to, and optimize rote tasks for questions you do. You might be right that this reduces the number of programmers you need, but historically, better tooling hasn't reduced demand for programmers.