
This only happens if you ask it to one-shot things, or if you fall for the false belief that "it is so close, we just need to correct these three things!".

Yes, I have encountered it. Narrowing the focus, adding constraints, and guiding it more closely made the LLM agent much better at producing what I need.

It boils down to me not really writing the code myself. Using LLMs has actually sharpened my architectural and software design skills: it made me think harder and deeper at an earlier stage.


