Hacker News

Replace "hallucination" with "oversight" or "ignorance" and you have the same issue when a human writes the code.

A lot of that will come to the prompter's own foresight, much like the vigilance a beginner developer needs when they know they're working on a part of the system that is particularly sensitive to get right.

That said, only a subset of software needs an authentication solution or has zero tolerance for a bug in some codepath. Neither applies to most of the apps/TUIs/GUIs I've built over the last few months.

If you have to restrict the domain to those cases for LLMs to be "disastrous", then I'll grant that for this convo.

What about everything else?



> A lot of that will come to the prompter's own foresight

And, on the current trend, how on earth are prompters supposed to develop this foresight, this expertise, this knowledge?

Sure, fine, we have them now, in the form of experienced devs, but these people will eventually be lost via attrition, and lost even faster if companies actually do make good on their threat to replace a team of 10 devs with a team of three prompters (former senior devs).

The short-sightedness of this, the ironic lack of foresight, is troubling. You're talking about shutting off the pipeline that will produce these future prompters.

The only way through, I think, will be if (very big if) the LLMs get so much better at coding (not code-gen) that you won't need a skilled prompter.

Good luck with that.


> how on earth are prompters supposed to develop this foresight, this expertise, this knowledge?

I suppose curiosity. The same way anyone develops expertise in the abstractions below after getting excited about the higher layer.



