Like they could cool their shit and desalinate water with the waste heat. Provide water to dry areas, like Baja or Texas. Bonus points if they could run off renewables. Seems like a win-win.
Oh, don’t worry, in 10 years the ocean floor will be full of DCs.
Salt water is corrosive.
Yeah. If you live close to the beach, iron stuff rusts quite quickly.
Great Lakes then?
Microbial corrosion
No. Nononononononono.
Radiohead?
Just because something generates heat doesn’t mean that it can boil something. In this case heat pumps would still be needed to concentrate it anyway.
Because salt is corrosive, and the real estate is expensive.
Why not build in the cold north? Snow, ice breaking your stuff, and more expensive construction and labor.
There’s a common misconception that these data centers are so big that they literally suck up all the resources around them… that’s not it.
It’s just corpos cheaping out.
Why the desert? Because evaporative cooling is cheap as heck, and low power, and works best in dry air. And the land is cheap. And grid energy is cheap.
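(Quick aside for intuition: evaporative cooling can only get you down to the wet-bulb temperature, which sits far below ambient in dry air. A sketch using Stull’s 2011 wet-bulb approximation, with made-up example conditions:)

```python
import math

def wet_bulb_c(temp_c: float, rh_pct: float) -> float:
    """Stull (2011) empirical wet-bulb approximation.
    temp_c in Celsius, rh_pct in percent (valid roughly 5-99% RH)."""
    return (temp_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(temp_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

# Same 35 C dry-bulb afternoon, desert vs humid coast:
print(f"{wet_bulb_c(35, 15):.1f} C")  # ~17.6 C -- lots of evaporative headroom
print(f"{wet_bulb_c(35, 70):.1f} C")  # ~30.2 C -- barely any
```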
Why local power plants and generators? Because it’s cheaper than grid energy; it cuts out the middleman. And it increases reliability. Not because there’s literally no grid capacity.
Hence, you are onto a thermodynamically interesting idea. The waste heat could be a “preheater” for desalination.
But of course they are not going to do that: it would cost more money.
Nor would they hook up the waste heat to local communities. Why would they pay to do that and extend construction time?
Also, as a counterpoint, reverse-osmosis desalination (which requires no heat) tends to be cheaper anyway, but it’s still a very, very expensive water source.
The boiling method is used when there are industrial processes that generate a lot of waste heat. You can make it reasonably efficient by taking the heat away on the cooling side and recirculating it back to the hot end.
But yes, datacenters don’t really generate enough heat for that to work without heat pumps concentrating it. All your other points stand.
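Back-of-envelope on what “concentrating it with heat pumps” costs, using idealized Carnot numbers (real machines manage maybe 40 to 50% of this):

```python
def carnot_cop_heating(t_hot_c: float, t_cold_c: float) -> float:
    """Ideal (Carnot) heating COP for lifting heat from t_cold to t_hot."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return t_hot_k / (t_hot_k - t_cold_k)

# Lifting a ~45 C server water loop up to ~105 C for boiling:
ideal = carnot_cop_heating(105, 45)   # ~6.3, theoretical ceiling
realistic = 0.45 * ideal              # ~2.8, assumed fraction of Carnot
print(f"Each kWh of electricity moves ~{realistic:.1f} kWh of heat uphill")
```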
Speaking of the north, the answer is yes. You totally can, and should, use the heat for something like district heating.
One thing I will say is, here in the desert, we don’t actually want them either. Water is already an issue. Power costs are already an issue when you’re cooling your house all summer and heating it all winter. Data centers provide minimal jobs for the amount of resources they use in a community, and the downsides have shown up in a number of places around the country too (ranging from noise to increased costs to resource shortages). Keep your data centers off our cacti!
It would be fine if they developed solar, used closed-loop or geothermal systems, distributed waste heat, and so on as compensation. It honestly wouldn’t be a bad plan compared to other places, seeing how the copious sun, dry winters, and still relatively cheap land would be great for operations.
But no, they only want the absolute cheapest route out there.
They don’t even need to hook up and construct a geothermal heat system for a community, either. There are giant sand heat batteries in Finland that store excess heat, which then feeds their community district-heating systems.
These data centers could be responsible for building the giant sand battery and then be done with it, leaving the distribution to the municipality or state, but they aren’t even inclined to do that.
It sounds a little complex in a desert, because (AFAIK) data centers produce relatively low-grade heat, and in the summer the inlet side would need to be cooled significantly. That sounds like less of an issue in Finland, with its relatively low average temperatures.
The medium would be cheap as heck though.
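For scale, a sketch with assumed numbers (sand’s specific heat ~0.8 kJ/kg·K, a 400 °C storage swing, and a hypothetical 10 MW data center):

```python
SAND_CP_KJ_PER_KG_K = 0.8  # rough specific heat of dry sand
DELTA_T_K = 400            # assumed hot/cold temperature swing
DC_HEAT_MW = 10            # hypothetical data center heat output

kwh_per_tonne = SAND_CP_KJ_PER_KG_K * 1000 * DELTA_T_K / 3600  # ~89 kWh/t
day_of_heat_kwh = DC_HEAT_MW * 1000 * 24                       # 240,000 kWh
tonnes_for_one_day = day_of_heat_kwh / kwh_per_tonne           # ~2,700 t
print(f"{kwh_per_tonne:.0f} kWh/tonne; ~{tonnes_for_one_day:,.0f} t buffers one day")
```

At a few dollars a tonne, even thousands of tonnes of sand is pocket change next to the rest of the build.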
Another confounding factor is the necessity of water cooling. I think data centers like evaporators because they can use dirt-cheap, standard air-cooled servers and simply A/C the room with the evaporators, whereas more complex systems would need larger air heat exchangers and a return well below ambient.
Nahh, there’s too much power density in modern servers for air cooling. They have either closed- or open-loop water cooling that simply dumps the heat into more water. Or worse, they just dump the previously potable water that was used to directly cool the servers out in the open to evaporate.
Fuck man… I’m so over capitalism. Shit’s just exhausting…
Exactly why every idiot that defends capitalism as “the best” basically by definition either doesn’t understand capitalism and economics in general, or is a hateful greedy shitstain that couldn’t do something good for humanity if they tried.
The answer, as always, is profit.
Short term gain, specifically.
They want the data center up and cheaply built to make next quarter look good, not lower their costs long term.
Money. The answer is always money. If it’s cheaper to build on land they will.
The answer would probably be a special tax that forces them to move to more environmentally friendly locations.
It’s access to electricity and the cost of land. People want to live near oceans, so coastal land is usually more expensive. If you can get dirt-cheap land in Wyoming near a power plant with spare capacity, you have most of what you need.
So fun fact: Microsoft played around with the idea of Underwater Data Centres and experimented with one off the coast of Orkney.
Here’s their in-house article about it; apparently they were pleased with the results.
You may not have heard, but the ocean level is rising, not only posing a flooding risk, but more importantly drastically increasing the severity and frequency of water-based natural disasters right at the coast. So, okay, still flooding; but wind too, and sometimes circular wind.
Your DC needs to be in Nevada mountains with the salt mines and it needs to provide heat for the homeless at the air-cooling ejection ports, as the prophet William Gibson foretold.
Waste heat recovery is a thing, and the economics usually work out in your favor if the feed material is really hot. If it’s only mildly warm, you’ll need a lot of machinery to concentrate the heat and raise the temperature to a useful level. At some point, the investment just gets absurd and the idea gets scrapped.
Using heat as heat makes the most sense, since there are fewer steps where you lose some of the heat. Theoretically, you could boil water with server heat, but the massive investment is probably the reason why that isn’t happening everywhere. Running reverse osmosis probably won’t work, because you need electricity for the pumps, and converting heat into electricity comes with significant losses.
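Rough numbers on that last point (all illustrative: the Carnot ceiling for a ~60 °C source, and ~3.5 kWh/m³ for seawater RO):

```python
T_HOT_K = 60 + 273.15    # low-grade server waste heat
T_COLD_K = 25 + 273.15   # ambient heat sink

carnot_eff = 1 - T_COLD_K / T_HOT_K      # ~10.5%, absolute thermodynamic ceiling
real_eff = 0.5 * carnot_eff              # ~5%, generous for a small heat engine
ro_kwh_per_m3 = 3.5                      # typical seawater RO electricity use
heat_needed = ro_kwh_per_m3 / real_eff   # ~67 kWh of heat per m3 of fresh water
print(f"~{heat_needed:.0f} kWh of 60 C heat per m3 -- the conversion losses dominate")
```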
I wonder if it would be worthwhile to colocate large greenhouses with datacenters. The exhaust temperatures seem compatible with hothouse growing. The heat would still end up in the atmosphere, but at least it could enable growth of fresh local produce first.
It should be a great idea, but I feel like the quantities involved are just vastly different.
I’m seeing estimates of 300 kW/hectare (30 MW/km², or 77 MW/mile²) for heating glasshouses. With individual datacentre campuses now being pitched at multiple gigawatts, the land area required just doesn’t match up.
This is not to say it isn’t worth considering, but it would be a rounding error in the datacentre’s heat output before you ran out of space to build more glasshouses.
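To put numbers on the mismatch, a back-of-envelope sketch using the 30 MW/km² figure above and a hypothetical 1 GW campus:

```python
GLASSHOUSE_HEAT_MW_PER_KM2 = 30  # heating demand estimate from above
DATACENTRE_HEAT_MW = 1000        # hypothetical 1 GW campus; nearly all power becomes heat

area_km2 = DATACENTRE_HEAT_MW / GLASSHOUSE_HEAT_MW_PER_KM2
print(f"~{area_km2:.0f} km2 of glasshouses to soak up the heat")  # ~33 km2
```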
There’s a secondary concern of water consumption. You might extend that to “ah, but what if we could use that water to grow the plants too?”, but the water coming out of one of these evaporative systems tends to be anything but clean. Maybe that’s a more solvable problem.
Preheat to 60°C using waste server heat, then boil conventionally.
Yes, that helps to lower the total energy cost of boiling the water. It’s better than nothing, but still pretty far from ideal.
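Quick check on “far from ideal”, assuming 20 °C feed water and textbook constants (c_p ≈ 4.2 kJ/kg·K, latent heat ≈ 2257 kJ/kg):

```python
CP = 4.2       # kJ/kg·K, specific heat of water
LATENT = 2257  # kJ/kg, heat of vaporization at 100 C

full_boil = CP * (100 - 20) + LATENT  # ~2593 kJ/kg from cold feed to steam
preheat_saved = CP * (60 - 20)        # ~168 kJ/kg covered by server heat
print(f"Preheating to 60 C saves ~{preheat_saved / full_boil:.0%}")  # ~6%
# The latent heat dominates, so the preheat barely dents the total.
```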
Data centers are generally built in dry, geologically stable places with few severe storms and cheap power. Coastal areas typically check none of those boxes.
You nailed it. Every time desert data centers are criticized around here there are several BS explanations. You got the correct answers in one sentence.
Or build in cold climates where they could use geothermal for power and naturally cold air to bring the temps down.
I hear land in Detroit is cheap. As is Northern Canada.
Detroit is not cold in summer. And places that are cold year-round are running into issues from global warming and permafrost (I don’t fully understand the issues, but they exist) without adding local heat.
The main issue with permafrost is instability. If you build on permafrost, when it melts you’ll lose everything that’s not anchored to bedrock. Imagine the ground 3 ft below the surface of your street suddenly collapsing, like a sinkhole. At the very least that would ruin the road, the water and sewer pipes, the electricity and telecom lines, etc. Melting permafrost also releases A LOT of methane, a much more potent greenhouse gas than CO2. (Runaway global warming, anyone?) Lastly, after the permafrost melts, the soil that’s left isn’t necessarily suitable for agriculture: it washes away easily and is prone to waterlogging.
Of course, these challenges can be overcome with time and money. It might become worth doing once the current arable land turns into desert. But the scope of it is huge. We’d have to invent a whole new type of agriculture.
Detroit is still colder than the Southern US, where many of these data centers are currently being built. Lots of empty land in northern Wyoming and Idaho as well.
Central/south Idaho is where the land is, but a lot of it is volcanic rock, and the north is forest and mountains all day, so… less ideal than you might think.
Also no real internet trunks go through the area, so you’d have that to fix too.
Computers don’t produce heat at a high enough temperature to be useful. You want the output of the computer cooling to feel like a warm room. That isn’t hot enough to boil water. It isn’t hot enough to do much of anything useful with.
They run at around 70-80 °C, and that’s WITH cooling. How is that not hot enough to warm up homes, or to take showers with?
Computers usually shut down for safety at 100 °C, which is the same temperature water boils at. Even in data centers, though, temps are usually kept below 75 °C, since compute efficiency drops as temperature rises.
It might technically be possible, but it would probably be more energy-efficient to keep the servers cool and desalinate with the ‘saved’ energy using RO filters.
On top of that, the return line is usually way cooler than the processors.
RO vs distillation isn’t nearly as clear-cut as it appears. Well-designed distillation systems recover most of the heat by warming the incoming water (and cooling the outgoing water). RO pumps use more energy than you’d expect at scale. I’m not going to say one is better, but don’t discount either without a full analysis of your situation.
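To make that concrete, a sketch with illustrative numbers (the raw latent heat if you throw it all away, an assumed ~10× reuse factor for a multi-effect plant, and typical seawater-RO electricity; thermal and electric kWh aren’t interchangeable):

```python
LATENT_KWH_TH_PER_M3 = 2257 * 1000 / 3600  # ~627 kWh(th)/m3 with zero heat recovery
GOR = 10                                   # assumed "gain output ratio" (heat reuse factor)
RO_KWH_E_PER_M3 = 3.5                      # typical seawater RO, electric

med_kwh_th = LATENT_KWH_TH_PER_M3 / GOR    # ~63 kWh(th)/m3 with recovery
print(f"Distillation: ~{med_kwh_th:.0f} kWh(th)/m3 vs RO: {RO_KWH_E_PER_M3} kWh(e)/m3")
# If the heat is genuinely free waste heat, the thermal side can still come out ahead.
```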
Microsoft actually tried underwater data centers (Project Natick), and Google explored floating ones. It looked like it was feasible, but repairing anything deep underwater was not practical.
Near water, there will be environmental issues similar to those of factories and nuclear power plants: debris in intakes and warm water affecting marine life.
Basically, the solution is to build more efficient or less compute-intensive software, but that’s not where we all seem to be heading right now.
Microsoft did that afaik
Then just said ‘no’, because changing HDDs underwater is hard.
I wonder how much undersea heating this contributes to…
Still better than the current above-ground implementations.
Texas isn’t dry except for the desert parts of west Texas. South Texas is a mix of swamp and normal coast, central is just hills, east is Arkansas lite (I live here), and north is the trial version of the mountains whose southern bits it has.