
Wait until the GPUs and data centers start getting written off as obsolete in a couple of years, when we still have nothing but fancy autocomplete.


Imagine if you spent those years building something else.


Yes, like renewable energy infrastructure (which China is building, and which would be highly useful anyway if generative AI does live up to its promise).

Even if generative AI lives up to its hype, with the current US administration there's no way America is going to lead the race for long. There's just not enough energy available when those in power oppose developing many of the energy projects that make the most economic sense.


I'd be glad to take any sweet obsolete enterprise HW off them.


The electricity bill would be huge. Great heaters, though.
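
For a rough sense of scale, here's a back-of-envelope estimate; every number in it is an assumption, not a measurement:

    # Rough annual electricity cost for one retired 8x V100 box,
    # run 24/7. All figures are illustrative assumptions.
    watts = 8 * 250 + 500     # ~250 W per GPU plus ~500 W for the rest of the server
    hours_per_year = 24 * 365
    price_per_kwh = 0.15      # assumed US residential rate, USD

    annual_cost = watts / 1000 * hours_per_year * price_per_kwh
    print(f"~${annual_cost:,.0f} per year")  # about $3,300/year at these numbers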


Interesting question: how will this unfold?

Right now I see that even old GPUs like the V100 are still popular. Maybe the old GPUs will shift to countries with cheap electricity?


They could probably be repurposed for something else, though?


You can already buy cheap but powerful old servers. But newer hardware tends to be more power efficient, so depending on the time horizon you consider, it might actually be cheaper to buy newer hardware.

Assuming GPU power efficiency keeps increasing, the same will be true of GPUs.
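
A toy total-cost-of-ownership comparison (all prices and wattages made up for illustration) shows how the time horizon decides it:

    # Toy TCO comparison: cheap old server vs. pricier but more
    # efficient new one. All numbers are illustrative assumptions.
    PRICE_PER_KWH = 0.15
    HOURS_PER_YEAR = 24 * 365

    def tco(purchase_price, watts, years):
        energy_kwh = watts / 1000 * HOURS_PER_YEAR * years
        return purchase_price + energy_kwh * PRICE_PER_KWH

    for years in (1, 3, 5):
        old = tco(purchase_price=500, watts=800, years=years)   # assumed used box
        new = tco(purchase_price=3000, watts=300, years=years)  # assumed new box
        print(f"{years}y: old ~${old:,.0f}, new ~${new:,.0f}")

At these made-up numbers the old box wins for the first few years, and the more efficient new one only pulls ahead around year five.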


What else could possibly use that much compute? Especially somewhat specialized compute that isn't well suited to general-purpose workloads?


Mining crypto, if we're unlucky.


The one truly hopeful aspect of a bubble-burst scenario is that extra capacity always finds a use case, and in this case practically any other use case would be both less harmful and more real.


Before the AI craze, all the GPUs were being bought up by cryptocurrency miners, and I'm not sure that's better. Even as an AI skeptic I think AI is a better use of all this hardware than cryptocurrency.


You're right, of course; I'm taking for granted that we've been there and done that. The question is, would we do it again, and would that usage really be allowed to expand to soak up whole data centers? We might, since I don't think a cynic has made a bad prediction in years.

IMO the most likely way to soak up the extra capacity is actually yet another iteration of AI, rather than, say, productive but boring work with any other techniques for curing disease or something. Still, a crash and a next iteration might be more likely to involve fresh ideas on architecture, or a focus on smaller expert models that produce fewer fake results and actually empower users. Right now I think there's a clear bias in research and execution. OFANG does want results, but it also wants results that favor the tech giants. Are subsymbolic techniques really the best techniques, or are they just the best at preserving the moat of big data and big compute?



