
I guess I've never come across that situation because I just don't engage with sources that fluff. That's a good example, but presumably there should be no errors there, since it's just stripping away unnecessary stuff? Although you'd have to trust that the LLM doesn't remove or change a key step in the process, which I still don't feel comfortable doing.

I was thinking more along the lines of asking an LLM for a recipe or review, rather than asking it to restrict its result to a single web page.

Doesn't matter if they get it wrong sometimes. So do human writers.
