AI inference's big bottleneck right now is RAM capacity and memory bandwidth, not compute per se.
If we redesigned AI inference hardware from scratch, without consumer gaming considerations, it probably wouldn't be a coprocessor at all.
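A quick back-of-envelope sketch of why bandwidth, not compute, sets the ceiling (the numbers below are illustrative assumptions, not benchmarks): during autoregressive decode, every model weight has to be streamed from memory once per generated token, so the token rate is roughly bounded by memory bandwidth divided by model size in bytes.

```python
# Roofline-style upper bound on decode speed when memory-bandwidth-bound.
# All figures here are hypothetical round numbers for illustration.

def max_tokens_per_sec(params_billion: float, bytes_per_param: float,
                       bandwidth_gb_s: float) -> float:
    """Bandwidth / model-bytes: tokens/s ceiling for single-stream decode."""
    model_bytes = params_billion * 1e9 * bytes_per_param
    return bandwidth_gb_s * 1e9 / model_bytes

# A 70B-parameter model in fp16 (2 bytes/param) on ~1 TB/s of bandwidth:
print(round(max_tokens_per_sec(70, 2, 1000), 1))  # ~7.1 tokens/s
```

Note that this bound ignores compute entirely; at typical batch-of-one decode, the arithmetic per byte moved is so low that the ALUs sit mostly idle, which is the point of the claim above.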