I wonder if servers could be located close to solar installations, or even attached to individual panels, so that during times of excess you could sell cheap computing resources. This could reduce demand on the electricity grid at peak times.
Many things are possible, but I don't think this particular idea is very promising from an economic standpoint.
First, you reduce the need for extra power cables but in exchange you now need internet cables going to all the solar panels.
Second, for any capital good you need to take the utilization rate over its lifetime into account. Imagine a server costing 10k with an expected lifespan of 5 years: it needs to earn about 2k per year just to cover its capital cost. Depending on how much power it uses, it may well be that 50% utilization on cheap electricity alone is less profitable than 100% utilization where electricity is cheap half the time and expensive the other half.
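A quick back-of-the-envelope sketch of that comparison (all numbers hypothetical, just to show the shape of the trade-off):

    # All numbers hypothetical: a 10k server amortized over 5 years.
    HOURS_PER_YEAR = 8760
    capex_per_year = 10_000 / 5   # ~2k/year to recover
    revenue_per_hour = 0.50       # what an hour of compute sells for
    power_kw = 1.0                # server draw
    cheap_kwh, expensive_kwh = 0.05, 0.25

    def annual_profit(utilization, avg_kwh_price):
        hours = HOURS_PER_YEAR * utilization
        return (hours * revenue_per_hour
                - hours * power_kw * avg_kwh_price
                - capex_per_year)

    # 50% utilization, cheap solar power only:
    print(annual_profit(0.5, cheap_kwh))                        # ~ -29/year
    # 100% utilization, cheap power half the time:
    print(annual_profit(1.0, (cheap_kwh + expensive_kwh) / 2))  # ~ +1066/year

With these made-up numbers the always-on server comfortably beats the solar-only one, despite paying expensive electricity half the time: the amortized capital cost dominates the energy cost.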
Third, physical servers need maintenance every once in a while. It will be much cheaper to organize that if all the servers are in a few central locations than if they are spread out. (This was originally one of the main selling points of electricity in the first place: you could finally build your factory where the people and the resources were, rather than having to go where the power was.)
Finally there are other options to consider which solve the same problem at potentially a lower cost, like building out the grid and introducing more demand response in the industrial base.
When my off-grid system is producing well, my Internet router is automatically powered from it, I run more tasks[0] on my off-grid server, charge my phone off-grid, and sometimes run my laptop off-grid. It can be done, but is decidedly non-trivial. That's what we have a grid for, with variable per-MWh pricing.
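For flavour, a minimal sketch of the kind of automation involved, assuming a hypothetical read_solar_watts() hook into the inverter (the real interface varies per system: Modbus, MQTT, or a vendor HTTP API) and a hypothetical "batch-jobs.target" systemd unit grouping the deferrable work:

    import subprocess
    import time

    SURPLUS_WATTS = 400  # hypothetical threshold above baseline household load

    def read_solar_watts():
        """Stub: read current PV output from the inverter/charge controller.
        Replace with whatever interface your system actually exposes."""
        raise NotImplementedError

    while True:
        # Start deferrable jobs only while there is surplus production.
        if read_solar_watts() > SURPLUS_WATTS:
            subprocess.run(["systemctl", "start", "batch-jobs.target"],
                           check=False)
        time.sleep(300)  # re-check every five minutes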
I'm not in the cloud computing business, rather the utility business. But at a glance it seems quite similar: you invest in capital and make your money a few cents at a time over a relatively long period. Costs are lumpy (big capital outlays), so the more usage you get out of them, the better.
With that said, don't you want steady, consistent usage when selling a server? I'd guess that, for exactly the same reason as with a power plant, nobody wants to own a server that cranks up only 2 or 3 hours a day... unless they absolutely HAVE to in order to provide their service.
I don't mind owning a server (or a power plant for that matter) that only operates a few hours per day, as long as the price I get paid for those hours is high "enough" compared to the risk on the investment.
The problem is, of course, that at the prices needed for that most consumers will no longer be interested in buying my computing services or power.
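The same arithmetic as the earlier sketch, turned around (numbers still hypothetical): the fewer hours the asset runs, the more each hour has to fetch just to recover capital.

    # Price per hour needed just to recover 2k/year of amortized capex.
    HOURS_PER_YEAR = 8760
    capex_per_year = 2_000  # the 10k-over-5-years server from above

    for utilization in (1.0, 0.5, 0.25, 0.10):
        hours = HOURS_PER_YEAR * utilization
        print(f"{utilization:4.0%} utilization -> "
              f"{capex_per_year / hours:5.2f}/hour")
    # 100% -> 0.23/hour; 50% -> 0.46; 25% -> 0.91; 10% -> 2.28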
> I wonder if servers could be located close to solar installations, or even attached to individual panels, so that during times of excess you could sell cheap computing resources. This could reduce demand on the electricity grid at peak times.
That doesn't really make a lot of sense. Servers are too expensive to leave switched off outside of "times of excess".
That being said, it's a long established practice to co-locate data centers with cheap power.
This is indeed the heart of the matter: humans are diurnal.
If we stopped trying to live by an entirely artificial, rigid timetable (with some ineffectual fiddling at the edges via daylight saving time) that ignores seasons, weather, etc., solar would be an even better fit for demand.
… only in places where heating is not the lion’s share of energy demand. Solar is a great fit near the equator or in sunny California where cooling is needed more than heating.
It is not a great fit for latitudes north of 45° (much of Europe), where energy needs due to heating increase at night and in winter, precisely when solar is not running at capacity…