Hacker News

> What I mean is that the current generation of LLMs don’t understand how concepts relate to one another.

They must be able to do this implicitly; otherwise, why would their answers be related to the questions you ask, instead of being completely off-topic?

https://phillipi.github.io/prh/

A consequence of this is that you can steal a black-box model by sampling enough answers from its API, because the samples let you reconstruct the original model's output distribution.
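The idea can be sketched with a toy experiment: if the only access you have is drawing samples, the empirical frequencies still converge to the model's true next-token probabilities. This is a minimal illustration, not an actual model-extraction attack; `TRUE_DIST` and `sample_next_token` are hypothetical stand-ins for a black-box API.

```python
import random
from collections import Counter

# Hypothetical black box: we can only draw samples from it,
# never read its probabilities directly.
TRUE_DIST = {"cat": 0.6, "dog": 0.3, "fish": 0.1}

def sample_next_token(rng):
    # Stand-in for one API call that returns a single sampled token.
    return rng.choices(list(TRUE_DIST), weights=TRUE_DIST.values())[0]

def estimate_distribution(n_samples, seed=0):
    # Empirical frequencies converge to the hidden distribution
    # at a rate of roughly 1/sqrt(n_samples).
    rng = random.Random(seed)
    counts = Counter(sample_next_token(rng) for _ in range(n_samples))
    return {tok: counts[tok] / n_samples for tok in TRUE_DIST}

est = estimate_distribution(100_000)
```

With 100k samples the estimates land within a couple of percentage points of the hidden values, which is the sense in which enough API queries "reconstruct" the distribution.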


