
Strictly speaking, it can, but its ability to do so is limited by its context size.

Which keeps growing - Gemini is at 2 million tokens now, several books' worth of text.

Note also that context is roughly the equivalent of short-term memory in humans, while long-term memory is more like RAG.
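The short-term/long-term memory analogy can be made concrete with a toy sketch: documents live in an external store, and only the most relevant ones are retrieved into the model's limited context window. This is a minimal illustration of the retrieval step, not a real RAG library; the bag-of-words "embedding" and all function names here are hypothetical stand-ins for a proper embedding model and vector store.

```python
import math
from collections import Counter


def embed(text):
    """Crude bag-of-words 'embedding': a word-count vector.
    A real system would use a learned dense embedding instead."""
    return Counter(text.lower().split())


def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def retrieve(store, query, k=1):
    """Return the k stored documents most similar to the query --
    the 'long-term memory' lookup that feeds the context window."""
    q = embed(query)
    return sorted(store, key=lambda doc: cosine(embed(doc), q), reverse=True)[:k]


# Hypothetical long-term store; only the best match enters the context.
store = [
    "the capital of france is paris",
    "python dictionaries preserve insertion order",
    "context windows limit how much text a model sees at once",
]
context = retrieve(store, "what limits how much text a model can see")
```

Whatever `retrieve` returns is what ends up in the prompt, so the context window only ever holds a small, query-relevant slice of the full store.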


