datacenter liquid cooling solution
-
Industrial cooling towers are usually evaporative in my experience. Smaller ones are large fans moving air over a stack of slats that the return water is sprayed or piped over; the water then collects in a well for recirculation. Larger ones afaik (like what you'd see at power plants) operate on the same idea. Top-ups and water chemistry are all automated.
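For a sense of the numbers: tower performance is usually talked about as "range" (hot return minus cold supply) and "approach" (cold supply minus ambient wet-bulb). A quick sketch, with figures that are purely illustrative:

```python
# Rough cooling-tower arithmetic: "range" and "approach".
# All numbers are illustrative, not from any real plant.

t_hot_return = 95.0   # degF, water returning from the plant loop
t_cold_supply = 85.0  # degF, water leaving the tower basin
t_wet_bulb = 78.0     # degF, ambient wet-bulb temperature

tower_range = t_hot_return - t_cold_supply   # how much the tower cools the water
tower_approach = t_cold_supply - t_wet_bulb  # how close it gets to the wet-bulb limit

print(f"range {tower_range:.0f} degF, approach {tower_approach:.0f} degF")
# Evaporation can never pull the water below ambient wet-bulb,
# so the approach is always positive.
```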
Those systems have operation-wide cooling loops that individual pieces of equipment tap into. Some stuff uses the loop directly (you see that with things like industrial furnaces), but for smaller or more sensitive stuff you'll see heat exchangers. Even then, the server & PLC rooms were all air cooled; the air cons for them were tied into the cooling water loops though.
From a maintenance POV though, it's way easier to air cool. I've totally seen motor drive racks with failed cooling fans that had really powerful external blowers rigged up to keep them going until the next maintenance window. Yeah, that's an industrial POV, but it's a similar idea.
critical data centers use swamp coolers because they don't have to treat the water or expose it to contamination from outside. they use straight domestic water.... super cheap.
if the conductivity gets too high, they dump the basin and fill with fresh... rinse and repeat.
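for anyone curious what the automation boils down to, here's a toy sketch of that dump-and-refill logic. The setpoint, make-up water conductivity, and rate of rise are all invented; real values depend on the local water:

```python
# Toy simulation of the dump-and-refill cycle: evaporation leaves dissolved
# solids behind, so basin conductivity climbs until the controller dumps it.
# The setpoint, make-up conductivity, and 5%/hour rise are all invented.

LIMIT_US_CM = 1500.0  # hypothetical dump setpoint, microsiemens/cm
FRESH_US_CM = 300.0   # hypothetical conductivity of fresh domestic water

conductivity = FRESH_US_CM
for hour in range(72):
    conductivity *= 1.05  # solids concentrate as water evaporates
    if conductivity > LIMIT_US_CM:
        print(f"hour {hour}: {conductivity:.0f} uS/cm -> dump basin, refill")
        conductivity = FRESH_US_CM
```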
-
That is awesome! I had no idea swamp coolers could cool that well. The one in my shop can barely drop the temp 10-15 degrees below outside (on a good day). Sorry for doubting you; I'm so used to people outside of arid climates not knowing what a swamp cooler is.
10-15 degrees is all you need to keep a "cold aisle" at 85°F, most places, on the worst day.
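that matches the standard direct-evaporative relation: supply air lands between dry-bulb and wet-bulb, scaled by the cooler's effectiveness. A quick sanity check, where the 0.85 effectiveness and the design-day temperatures are assumptions (media coolers are commonly quoted around 0.7-0.9):

```python
# Direct evaporative cooling: supply temperature approaches the ambient
# wet-bulb, scaled by the cooler's effectiveness. Numbers are illustrative.

def supply_temp_f(t_dry_bulb, t_wet_bulb, effectiveness=0.85):
    """Supply air temperature (degF) out of a direct evaporative cooler."""
    return t_dry_bulb - effectiveness * (t_dry_bulb - t_wet_bulb)

# A hot, dry design day: 100 degF dry-bulb, 70 degF wet-bulb (assumed).
print(f"supply air: {supply_temp_f(100.0, 70.0):.1f} degF")
# about 74-75 degF -- comfortably under an 85 degF cold aisle
```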
IIRC Amazon figured out that individual components could actually run hotter within an acceptable replacement window.
higher equipment replacement is more than offset by the fact they don't have to do refrigerant-based cooling, which makes daily operation ridiculously cheap... no compressors or complicated refrigeration machinery to produce cooling... no people with special skills to maintain them, etc.
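the tradeoff itself is simple arithmetic. Here's its shape with completely invented figures (these are not Amazon's numbers):

```python
# Toy break-even for running hot: extra hardware replacement vs. avoided
# refrigeration opex. Every figure below is invented to show the shape
# of the tradeoff -- none of them are Amazon's (or anyone's) real numbers.

extra_replacements_per_yr = 50_000.0       # added cost of components dying sooner
energy_saved_per_yr = 400_000.0            # no compressors running year-round
specialist_labor_saved_per_yr = 150_000.0  # no refrigeration techs on staff

net_savings = (energy_saved_per_yr
               + specialist_labor_saved_per_yr
               - extra_replacements_per_yr)
print(f"net: ${net_savings:,.0f}/yr in favor of running hot")
```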
-
10-15 degrees is all you need to keep a "cold aisle" at 85°F, most places, on the worst day.
IIRC Amazon figured out that individual components could actually run hotter within an acceptable replacement window.
higher equipment replacement is more than offset by the fact they don't have to do refrigerant-based cooling, which makes daily operation ridiculously cheap... no compressors or complicated refrigeration machinery to produce cooling... no people with special skills to maintain them, etc.
That 10-15 (in my case) relies on no extra heat. When I have game nights it gets pretty toasty inside with 5 or so extra bodies in the shop.
My last experience with a server room was in 2002 or 2003, and the rooms were kept in the mid to low 60s.
-
That 10-15 (in my case) relies on no extra heat. When I have game nights it gets pretty toasty inside with 5 or so extra bodies in the shop.
My last experience with a server room was in 2002 or 2003, and the rooms were kept in the mid to low 60s.
no doubt.
I remember when you'd put a jacket on before you went in the halls... but now everyone wears shorts.
-
There’s basically no reason ever to do water cooling on a home system unless you’re trying to do overclocking.
Air is cheaper, more reliable, and typically quieter because you don’t need pumps.
Air also does not make things wet when it leaks.
-
a chiller is not a swamp cooler.
picture a fan with a wet sponge in front of it... that is a swamp cooler.
Yea, it's the combo of the chiller and cooling tower that is analogous to a swamp cooler. The cooling tower provides the evaporative cooling. The difference is that rather than directly cooling the environment around the cooling tower, the chiller allows indirect cooling of the DC via heat exchange. An isolated chiller providing the heat exchange is why humidity inside the DC isn't impacted by the evaporative cooling. And sure, humidity is different between the hot and cold aisles, but that's just a function of temperature and relative humidity. No moisture is exchanged into the DC to cool it.
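To put rough numbers on that split: the chilled-water loop carries the IT load to the chiller, and the tower outside has to reject that load plus the compressor work. A sketch with assumed figures:

```python
# Rough energy balance for a chiller + cooling tower pair.
# The IT load and COP below are assumptions, not real facility figures.

q_it_kw = 1000.0  # heat picked up from the data hall (IT load)
cop = 5.0         # assumed chiller coefficient of performance

compressor_kw = q_it_kw / cop            # electrical work to pump the heat uphill
q_rejected_kw = q_it_kw + compressor_kw  # what the tower must evaporate away

print(f"tower rejects {q_rejected_kw:.0f} kW for a {q_it_kw:.0f} kW IT load")
# All the evaporation happens outdoors at the tower; the data hall only
# sees the cold side of the heat exchanger, so its humidity is untouched.
```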
Edit: Turns out I'm a bit misinformed. Apparently in dry environments that can deal with the added moisture, DCs are built that indeed use simple direct evaporative cooling.
-
Hi,
I'm building a homelab watercooled unix server.
I don't want to buy overpriced pre-mixes from ekwb or aquatuning.
What cooling solution do datacenters use for water cooling? What is the chemical solution? Does anyone know?
Water cooling is typically much more complex and expensive than air cooling, and is mainly attractive because of space limitations. The same applies to data centers. IBM's mainframes have a liquid-cooled version mainly targeted at users wishing to get the most out of their data center space before upgrading sites. These ship without coolant and simply ask the user to "just add water," i.e. demineralised/distilled water.
Sure Mainframe ain’t dead, but what about that toilet water? | Aussie Storage Blog - https://aussiestorageblog.wordpress.com/2021/04/07/sure-mainframe-aint-dead-but-what-about-that-toilet-water/
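And for sizing a homelab loop like the OP's: plain water's heat capacity fixes the flow you need for a given heat load and temperature rise. A minimal sketch, where the 500 W load and 10 °C rise are assumptions:

```python
# Required coolant flow for a given heat load, assuming plain water:
# Q = m_dot * c_p * dT  =>  m_dot = Q / (c_p * dT).
# The 500 W load and 10 degC rise are assumptions for illustration.

heat_load_w = 500.0  # server heat output, watts (assumed)
delta_t_c = 10.0     # coolant temperature rise across the loop (assumed)
c_p = 4186.0         # J/(kg*K), specific heat of liquid water
density = 1000.0     # kg/m^3, water

m_dot_kg_s = heat_load_w / (c_p * delta_t_c)
flow_l_min = m_dot_kg_s / density * 1000.0 * 60.0

print(f"{flow_l_min:.2f} L/min")  # ~0.72 L/min, trivial for any loop pump
```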
-
oh man! I just poked ptsf@lemmy.world for an Austria!=Australia flub in another thread... my comeuppance!
The cooler is made of lava
Haha, carry on
-
Air also does not make things wet when it leaks.
Is rain a leak?
-
Is rain a leak?
The clouds are leaking.
-
The clouds are leaking.
Ah shit.
-
There’s basically no reason ever to do water cooling on a home system unless you’re trying to do overclocking.
Air is cheaper, more reliable, and typically quieter because you don’t need pumps.
And air doesn't leak all over your electronics. Well, it does, but it doesn't short anything out.