
You can't get more info out of an LLM than it actually holds. As Anthropic pointed out, if an LLM knows a name but has no other info about it, it starts hallucinating. The same probably happens here: the LLM knows there must be a flag but can't remember all of them. A short reminder in the prompt will likely help (or a web search, for GPT). Just my $0.02.
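To illustrate the "short reminder in the prompt" idea, here's a minimal sketch. The tool name, flag list, and helper function are all hypothetical examples, not from any real CLI; the point is just that grounding the prompt with the valid flags means the model doesn't have to recall them from memory:

```python
# Hypothetical example: prepend a short reminder of valid flags
# so the model can look them up instead of hallucinating them.
VALID_FLAGS = ["--verbose", "--dry-run", "--output"]  # made-up flags

def build_prompt(user_question: str) -> str:
    # Join the known-good flags into a one-line reminder.
    reminder = "Valid flags for mytool: " + ", ".join(VALID_FLAGS)
    return f"{reminder}\n\nQuestion: {user_question}"

print(build_prompt("How do I preview changes without applying them?"))
```

The assembled prompt then carries the flag list alongside the question, so the model's answer can be checked against flags it was actually given.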


It certainly feels like you can, just by challenging it; it then happily finds other paths to what you want. So maybe internally it needs a second voice encouraging it to think harder about alternatives up front.


The fact that you can get more info from an LLM than it holds is actually a pithy description of this whole challenge.



