
All LLMs are, it's an innate thing. Google just sucks at the kind of long context training you need to do to mitigate that.




I would bet they won't suck at it for much longer; Gemini's progress is undeniable.

Long context was a consistent weak point for Gemini compared to other major AIs. Reportedly, it still is.

The progress is undeniable, and the benchmark numbers only ever go up, but I'm not sure they ever did anything to address this deficiency specifically, as opposed to it being carried upward by spillover from other improvements.



