
My R730 averages around 100-200W with two 12-core CPUs, 128GB RAM, and eight 12-14TB drives.

Servers are more efficient than people give them credit for. It's close to a gaming PC in electricity usage. Certainly not 60W (or what people can get with NUCs and RPi clusters), but for the compute I get, it's very much worth it.



If we average that out to 150W, that's roughly 1,314 kWh/yr. At current Bay Area electric prices, that'd cost in excess of $500/yr to run. I'd say that servers are not more efficient than people give them credit for...
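
For reference, a quick sketch of the arithmetic (the electricity rates are illustrative assumptions; actual PG&E and Seattle City Light rates vary by plan and tier):

    # Back-of-the-envelope annual cost of an always-on server.
    # The $0.40/kWh and $0.12/kWh rates are assumptions for
    # illustration, not quoted tariffs.

    HOURS_PER_YEAR = 24 * 365  # 8,760

    def annual_cost(avg_watts: float, rate_per_kwh: float) -> float:
        kwh_per_year = avg_watts / 1000 * HOURS_PER_YEAR
        return kwh_per_year * rate_per_kwh

    print(annual_cost(150, 0.40))  # ~$525.60/yr (1,314 kWh) at a Bay Area-like rate
    print(annual_cost(150, 0.12))  # ~$157.68/yr at a Seattle-like rate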


Servers are more efficient when fully utilized. A commercial-grade server in a homelab is more likely to idle all the time, which makes it very energy-wasteful. Servers need to be right-sized instead of treated as "more is better".


And at Seattle prices, that's about $150, so I don't exactly see what the purpose of your anecdote is other than to mention how pricey your electricity is.

Edit: But even at $500, that's pretty much the price of a low-spec VPS or VM per year. So for a fairly low up-front cost to buy the server, you get far more performance for the same annual price you'd pay someone else to use theirs.


What's it like at idle though?

Since it's a home server, when it's not receiving backups from computers in the house or streaming media via Plex, it sits at < 15W, which is low enough not to worry about.
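
To see why a low idle figure dominates the bill, here's a duty-cycle weighted-average sketch (the 90/10 idle/active split and the 200W active draw are assumptions for illustration, not measurements):

    # Weighted-average draw for a mostly-idle home server.
    # The duty cycle and active wattage below are assumed values.

    IDLE_W, ACTIVE_W = 15, 200
    IDLE_FRAC = 0.90  # assume idle 90% of the time

    avg_w = IDLE_FRAC * IDLE_W + (1 - IDLE_FRAC) * ACTIVE_W  # 33.5W
    kwh_per_year = avg_w / 1000 * 24 * 365                   # ~293 kWh
    print(kwh_per_year * 0.40)  # ~$117/yr even at a Bay Area-like rate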


I'd guess the issue isn't so much peak consumption as idle.



