
Consumer hardware and AI inference are joined at the hip right now due to perverse historical reasons.

AI inference's big bottleneck right now is memory capacity and memory bandwidth, not compute per se.
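A quick roofline-style estimate illustrates the point. The numbers below are illustrative assumptions (roughly a ~7B-parameter model in fp16 on a high-end consumer GPU), not measurements; for batch-1 decoding, every token must stream all the weights from memory once, while the arithmetic per token is comparatively tiny:

```python
# Back-of-envelope roofline check for batch-1 LLM decoding.
# All hardware and model numbers are assumptions for illustration.

params = 7e9                 # assumed 7B-parameter model
bytes_per_param = 2          # fp16
model_bytes = params * bytes_per_param

mem_bandwidth = 1.0e12       # assumed ~1 TB/s GPU memory bandwidth
peak_flops = 80e12           # assumed ~80 TFLOP/s fp16 peak compute

# Each decoded token streams every weight from memory once and
# costs roughly 2 FLOPs per parameter (one multiply, one add).
tokens_per_s_mem = mem_bandwidth / model_bytes
tokens_per_s_compute = peak_flops / (2 * params)

print(f"memory-bound ceiling:  {tokens_per_s_mem:,.0f} tokens/s")
print(f"compute-bound ceiling: {tokens_per_s_compute:,.0f} tokens/s")
print(f"compute ceiling is {tokens_per_s_compute / tokens_per_s_mem:.0f}x higher")
```

Under these assumed numbers the memory-bandwidth ceiling sits around 70 tokens/s while the compute ceiling is in the thousands, i.e. the arithmetic units spend most of their time waiting on DRAM.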

If we redesigned AI inference hardware from scratch, free of consumer gaming considerations, it probably wouldn't be a coprocessor at all.


