

You're right. It's also important to keep in mind that LLMs can translate human intent into formal queries incorrectly, so they shouldn't be fully trusted even when integrated with a more deterministic system.
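One common mitigation is to treat the model's output as untrusted input and validate it before the deterministic system executes it. A minimal sketch, assuming the LLM has been asked to translate a request into SQL (the query string and helper name here are hypothetical); it restricts the statement to a single read-only SELECT and asks SQLite to plan it without running it, which catches syntax errors and references to nonexistent tables or columns, though not semantic mistranslations of the user's intent:

```python
import sqlite3

def validate_llm_query(conn, sql):
    """Reject anything other than a single read-only SELECT statement."""
    stripped = sql.strip().rstrip(";")
    if ";" in stripped:
        raise ValueError("multiple statements are not allowed")
    if not stripped.lower().startswith("select"):
        raise ValueError("only SELECT statements are allowed")
    # Planning the query (without executing it) catches syntax errors
    # and references to tables/columns that don't exist.
    conn.execute(f"EXPLAIN {stripped}")
    return stripped

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'Ada')")

# Suppose llm_sql came back from a model translating "list all user names".
llm_sql = "SELECT name FROM users;"
safe_sql = validate_llm_query(conn, llm_sql)
rows = conn.execute(safe_sql).fetchall()
print(rows)  # [('Ada',)]
```

Note that this only guards against structurally invalid or destructive output; a query that is well-formed but answers the wrong question still gets through, which is exactly why the human-intent-to-query step can't be fully trusted.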



