Damn, this is a buzzkill for me. I just built a new computer a month ago and was hoping to get a Hackintosh setup working but didn't do much research into it beforehand. Why will Nvidia GPUs never work? Because they won't publish open source drivers to work off of?
Because Apple computers don't use nVidia cards, and nVidia no longer makes their own Mac drivers (as they did until recently). Porting the Linux driver would be a massive amount of effort, even if nVidia was more open.
If you have a slightly older nVidia GPU (1000 series or older), it will work in High Sierra via nVidia's in-house drivers. But you will never be able to upgrade past High Sierra.
AFAICT the answer is that Nvidia and Apple are not the best of friends, and Apple decided that they would not sign any newer drivers.
This may have been at least partly because it restricts the number of people who can try making Hackintoshes out of existing machines.
I think this is probably a factor in the lack of support for Intel Wi-Fi as well - it keeps a lot of laptops from being able to 'just work' as Hackintoshes.
> Apple decided that they would not sign any newer drivers.
People think this based on a vague statement from nVidia's PR. It doesn't make any sense to me. nVidia has continued to release (minor) updates for the High Sierra drivers, and Apple has been perfectly happy to sign those. Also, this is a technically-advanced audience that would probably have no trouble installing unsigned drivers.
I really think that nVidia realized they would have had to do a major driver rewrite for Mojave (because much more rendering goes through Metal) and decided it wasn't worth the effort. The Mac drivers were clearly a very low priority for nVidia; it took the better part of a year before the company added compatibility with their new Pascal graphics cards, and even then the drivers had all sorts of bugs with applications like Little Snitch[1]. It's even possible that nVidia had some knowledge of the upcoming ARM transition, and decided to cut their losses ahead of time.
Depends on your definition of "allow". By default they're blocked, yes, but you can change that by booting into recovery mode, opening a Terminal window, and typing:
csrutil disable && csrutil enable --without kext
Now unsigned kexts will be allowed to load (while leaving the rest of System Integrity Protection intact).
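After rebooting back into the normal OS, you can sanity-check the result. csrutil status and kextstat are standard macOS commands; the grep pattern below is just an illustration for nVidia's kexts:

csrutil status
# should report SIP enabled with a custom configuration, including "Kext Signing: disabled"

kextstat | grep -i nvidia
# lists whichever nVidia kexts (signed or not) actually loaded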
For most software this requirement would be a massive problem, but I'm not sure that applies to the type of person installing 3rd party graphics cards in Macs...
The nVidia Web Drivers never officially supported eGPUs anyway. In order to do it at all you had to use a hack which—surprise!—also required disabling System Integrity Protection!
There is a hack that allowed me to use the Nvidia web drivers on Mojave on my 2012 15" rMBP. Not sure if it works on Catalina. You have to reapply it after every OS update. I don't think there is a hack to get CUDA working, which was last supported on High Sierra.
Not sure if the same hack could be used on a Hackintosh (with an old enough card). But for now I would just stick with High Sierra if you have an Nvidia card, unless you really need something in a later OS and the hack makes it possible. I downgraded that 2012 to High Sierra, and I keep other older machines on it as well (didn't have a choice with my 2019 16"). It runs very smoothly and currently has enough software support (e.g. the latest Office 365 still supports it).
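If you do try a hack like that, it's worth verifying which driver macOS actually loaded after each update. A quick check from Terminal (system_profiler and kextstat are built in; the NVDA kext names are from my memory of the web driver, so treat them as approximate):

system_profiler SPDisplaysDataType
# shows the GPU model and whether Metal support is active

kextstat | grep -i NVDA
# the web driver loads kexts like NVDAStartupWeb; Apple's stock driver is NVDAStartup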
Aww, I didn't realize that Nvidia stopped making the macOS driver. I was hoping they might stick with it because the new Mac Pro has proper PCIe slots.