Hacker News | akomtu's comments

How do you dissipate so much heat in vacuum? Datacenters will need to boil something like water and then dump it somewhere.

Just like satellites do: heat pipes and radiators.
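For a sense of scale, the Stefan-Boltzmann law gives a rough radiator sizing. This is a back-of-the-envelope sketch, not from the thread: the 1 MW load, 300 K radiator temperature, and 0.9 emissivity are assumed numbers, and absorbed sunlight is ignored (radiator facing deep space).

```python
# Radiator area needed to reject a given heat load purely by radiation,
# per the Stefan-Boltzmann law: P = emissivity * sigma * A * T^4.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 * K^4)

def radiator_area(power_w: float, temp_k: float, emissivity: float = 0.9) -> float:
    """Area in m^2 needed to radiate `power_w` watts at `temp_k` kelvin."""
    return power_w / (emissivity * SIGMA * temp_k ** 4)

# Assumed example: a 1 MW datacenter module with 300 K radiators.
area = radiator_area(1e6, 300.0)
print(f"{area:.0f} m^2")  # on the order of 2400 m^2 per megawatt
```

Running hotter helps a lot (area falls as T^4), but then the chips have to tolerate a hotter coolant loop, which is the engineering tradeoff satellites already live with.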

Not that space DCs are a good idea™ or economical otherwise.


Economics are not the main concern if nobody wants them on Earth.

Sats are optimized for low energy usage. DCs, on the other hand, are water boilers.

Do you think data centers and the average satellite generate an equivalent amount of heat?

Those executives are simply implementing the directive to inject as much AI as possible into every gear of the economy. Their bonuses depend on this. The idea is that if the world economy becomes dependent on this AI monstrosity, we won't be able to get rid of it. It will be like a situation with a nasty parasite that does a lot of harm, but cannot be removed without the host dying.

That sounds just like Microsoft, Facebook, or Oracle products (in some circles). What I'm trying to say is that it's a classic strategy, done many times before with various tools (and I hate the perpetrators every time).

AI datacenters are effectively boilers that burn gas, heat the atmosphere and boil rivers. Remember that GPUs turn all electricity into heat. As a side effect, they also produce slop.

Is this much different from non-AI datacenters?

Of course it's a lie. Cloudflare is saying, essentially: "AI is making us so profitable that we've decided to reduce our profit by 20%, to keep it reasonable."

But they’re not profitable? They make about $450k in revenue per employee, but lose about $17k per employee. Meanwhile they spend $470 million on stock-based compensation, for example, up $100 million from the year before, across roughly 5k employees, and that figure has been growing a lot every year.

by laying people off they increase their profit, at least in the short term (which is all that shareholders care about)

Not with the severance package they're offering, which is why their stock was down 15-18% after announcing this.

Look at the chart of their stock price over the past couple of months. There was a huge run that started literally just over a week ago. Even after this 20% drop, the price today is only slightly below where it was before that run.

Their stock price has been pretty volatile for a while now (6+ months), so even with a swing of this magnitude I don't think it's valid to see it as much more than a correction.


They’ve been pumping out products like crazy

They don’t need them. Simple as that


Someone has to maintain the products.

More AI?

Good luck with that.

I'm sure that more AI will solve that problem too.

I am confused by this post. No trolling: You wrote "reduce". Did you mean to say/write "increase"? If you layoff people to reduce costs, then your profitability should increase.

That’s a very MBA way of thinking.

If we extend the logic, if we have 0 employees then profitability is maximized right? Then shouldn’t every company have 0 employees?

Obviously hiring increases profitability, otherwise some of the biggest headcount companies wouldn’t have hired so many people


It's on-device AI spyware, really. It collects intelligence about the user, summarizes it and sends it to Google, all paid by the user's electricity bill. Deviously clever.

How can a deterministic machine be conscious? Can we call the multiplication table conscious? It too has inputs and deterministic outputs.

I think the obvious question is: are humans deterministic? There are a lot of inputs, but it's a reasonable belief that humans are in fact deterministic.

How is it a reasonable belief that a highly complex entity beyond our comprehension is a deterministic machine? Aren't deterministic machines simply the limit of our knowledge for now?

Except the human mind isn't at all just "software". If the human brain counts as deterministic, then everything does.

The human brain doesn't have "a lot of" inputs, but rather infinite inputs. Cosmic rays, (self-emitted) electromagnetic fields, bacterial/viral activity, nutrition, genetics, epigenetics, immunity, cellular function ... all these things affect a mind. There is homeostasis, but that's not like error correction in silicon computation. Neurons do have excitation thresholds, which are somewhat digital, but they are embedded in analog signaling and interference.

Row-hammer-like interference is a normal state of affairs for the brain. You cannot core-dump a mind. Measurements will change its state since it's inherently linked to the state of its matter. You could halt an LLM and predict its state the next cycle going by the program's logic. Or you could halt it, copy the state and get two identical instances. To clone a brain, you likely need to halt time itself.

Semantics aside, there is clearly a different deterministicness.


> The human brain doesn't have "a lot of" inputs, but rather infinite inputs.

That's not true though. It's 'a lot', not infinite. Not everything affects the output that our brain produces.

As far as we're currently aware, the brain IS deterministic. If you were able to perfectly duplicate a brain and its environment/state, the resulting output of that brain would always be the same.


> Not everything affects the output that our brain produces.

It responds to EM fields...so yeah, basically infinite.

> If you were able to perfectly duplicate a brain and it's environment/state

Big if. As I said, if the brain is deterministic, everything is. And then it's a meaningless discriminator. I already explained why I think you can't duplicate the state/environment perfectly.


They aren't. However, there is a coordinated effort to push this pseudo-philosophy on the masses. On the one hand it degrades the idea of human consciousness or soul, calling it a fiction. On the other hand it props up the AI, calling its pile of transistors almost brain-like.

> it degrades the idea of human consciousness or soul, calling it a fiction

I guess they usually mean that the fictional part of the story is the claim that it's separate from the brain.


It kind of needs to be separate, otherwise it is just consciousness.

How can something inexplicable like consciousness be considered "degraded"?

Who is coordinating this effort?

Whoever profits from it.

AI makes us believe that instead of working towards a goal, one can "win" that goal with a lucky prompt. AI replaces thinking with gambling, in other words, and it's very tempting to many.

"How a Broken Bike Sync Led Me to Reverse Engineering My Wahoo's Hidden Debug Mode" - this is brain-dead AI slop right in the title.

"Hand writing" your own thoughts is the only true way, though. If some entity does your thinking, then it's no longer you.

Yes, now that's a reasoned thought on how AI will affect us. But fortunately, AI is not 'doing our thinking for us' any more than calculators did, and that's not going to stop us from using AI.

People not using AI will be about as useful as those refusing to use e-mail or computers.

It's absurd.


Doing our thinking for us is the purpose of AI, isn't it? It's called artificial intelligence for a reason.

AI is a broad term, and ML algorithms for playing chess have fallen under it since the 1920s.

AI may replace some cognitive activity; it also required cognitive intelligence to use slide rules, which have been replaced, and we have not looked back.

It's not a bad rhetorical question - but it's moot in the face of the question of 'should we use it or not'.

It will do a lot of things for us - that part is inevitable and unavoidable.

We'll have plenty to think about.

