
There's no such thing as long as you buy an actual mid- to high-tier GPU. Even an ancient GTX 1070 would be more than enough - and for sufficiently large datasets even an RTX 3090 will take hours to process whatever you're crunching.

Just buy a PC that you like for gaming (with an Nvidia GPU) and don't worry about ML yet - it's incredibly unlikely that you'll pick something that limits you in any way. Small datasets will run on anything; large datasets will take hours to process no matter what you run them on. It's not a "limit".
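
For what it's worth, checking that whatever card you end up with is usable for ML only takes a couple of lines. A minimal sketch, assuming PyTorch is installed with a CUDA build (the device names and sizes here are just illustrative):

    # Check whether an Nvidia GPU is visible to PyTorch and run a tiny workload on it.
    import torch

    if torch.cuda.is_available():
        device = torch.device("cuda")
        print(f"Using GPU: {torch.cuda.get_device_name(0)}")
    else:
        device = torch.device("cpu")
        print("No CUDA device found; falling back to CPU")

    # The same code runs on either device; only the device argument changes.
    x = torch.randn(1024, 1024, device=device)
    print((x @ x).sum().item())

If that prints your card's name, you're set for anything a beginner dataset will throw at you.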



Some off-the-shelf gaming PCs are not very Linux-friendly though, so they should watch out for that, especially the laptop varieties. Getting a lot of the ML stuff working locally on Windows is a nightmare.



