
Awesome. I'm installing on Ubuntu 22.04 right now.

Ran into a few errors with the default instructions, related to CUDA version mismatches with my NVIDIA driver. Now I'm trying without conda at all: made a venv, upgraded to the latest driver Ubuntu provides, then downloaded and installed the matching CUDA toolkit from [1].
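For anyone following along, the no-conda route looks roughly like this (a sketch; the `sd-venv` name and the `cu116` wheel tag are my assumptions, not from any official instructions):

```shell
# Sketch of the venv route; names and version tags are illustrative
python3 -m venv sd-venv
. sd-venv/bin/activate
python -m pip install --upgrade pip
# The torch wheel's +cuXXX tag must match the CUDA toolkit installed from [1]
# (check `nvidia-smi` for the driver's supported CUDA version), e.g.:
# pip install torch==1.12.1+cu116 --extra-index-url https://download.pytorch.org/whl/cu116
```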

That got me farther. Then ran into the fact that the xformers binaries from my earlier attempts are now incompatible with my current driver and CUDA, so rebuilding that one. I was in for the 30-minute compile, but did the `pip install ninja` as recommended by [2] and it ran on a few of my 32 threads. Ope! Done in 5 mins. Test info from `python -m xformers.info` looks good.

Damn, still hitting CUDA out-of-memory issues. I knew I should have bought a bigger GPU back in 2017. Everyone says I have to downgrade pytorch to 1.12.1 for this not to happen. But oh dang, that was compiled against a different CUDA, oh groan. Maybe I should get conda to work after all.

`torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 30.00 MiB (GPU 0; 5.93 GiB total capacity; 5.62 GiB already allocated; 15.44 MiB free; 5.67 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF`

Guess I better go read those docs... to be continued.
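For anyone hitting the same wall: the docs boil down to setting the allocator config before torch initializes CUDA (the 64 MiB split size below is just a guess for a ~6 GB card, not a value the docs recommend):

```python
import os

# Must be set before torch initializes CUDA, or it has no effect.
# max_split_size_mb caps how large a cached block the allocator will split,
# trading some block reuse for less fragmentation. 64 is an illustrative value.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:64"

# import torch  # import *after* setting the env var
```

Setting `PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:64` in the shell before launching works just as well.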

[1] https://developer.nvidia.com/cuda-downloads?target_os=Linux&...

[2] https://github.com/facebookresearch/xformers



Also got this far on my 3080 Ti with the same error message. Oh well, let's wait for the "optimized" forks to pop up.


Thanks for reminding me why I shouldn't go to my computer right now and try getting this working with my 2070!


Which GPU are you using? Used RTX 3090s were relatively cheap in the last couple of weeks...


GeForce GTX 1060 6GB, purchased literally 5 years ago. It worked with an optimized Stable Diffusion 1.0 fork, so I was hopeful here. If I want to run these models going forward, I guess I need something slightly more serious, eh?



