I'm looking for a practical way to silently cool several 1U servers. The problem is that each server has 7 fans in it: 2 cooling the power supply and 5 cooling the processors. When all 6 servers are powered on, the noise they produce is skull-splitting. The main constraint is height. I've looked at several options but have yet to find a workable solution, which is why I'm turning to the PC-mod boards. Mind you, the solution needs to be applied to all 6 servers. What I've looked into so far:

- Mineral oil cooling
- Water and antifreeze cooling
- Nitrogen cooling
- Other forms of air cooling

The two main issues I've noted with water and nitrogen cooling are condensation and/or leakage, which is why I looked into mineral oil: a board can be completely submerged in mineral oil and continue to function properly without shorting out. Mineral oil is also a lighter oil, so it could be pumped much like water. The problem with mineral oil in a pumped loop like a water-cooled system is that it doesn't transfer the heat out as well as a water-based system does. (http://www.pugetsystems.com/mineral-oil-pc.php - mineral oil cooled PC)

Nitrogen produces condensation, so it's a bit more difficult to use for a PC-based system, judging by the builds I've seen from users running nitrogen.

So... here I am, looking to you guys for a practical cooling solution, knowing modders can come up with some of the best answers to complex spacing issues. I don't know if these documents will help you any, but here:

Manual: http://h10032.www1.hp.com/ctg/Manual/c00083252.pdf
Specs sheet: http://h18000.www1.hp.com/products/quickspecs/11910_div/11910_div.HTML
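For what it's worth, the heat-transfer gap between mineral oil and water in a pumped loop can be ballparked from the volumetric heat capacity (density × specific heat) of each coolant. A quick back-of-envelope sketch in Python; the property values are approximate textbook figures, not measurements of any particular oil:

```python
# Rough comparison of how much heat each coolant carries away
# per unit of flow per degree of temperature rise.
# Property values are approximate textbook figures.

WATER_DENSITY = 998.0   # kg/m^3 at ~20 C
WATER_CP = 4186.0       # J/(kg*K)

OIL_DENSITY = 850.0     # kg/m^3, typical light mineral oil
OIL_CP = 1900.0         # J/(kg*K), typical light mineral oil

def volumetric_heat_capacity(density, cp):
    """Heat carried per cubic metre of coolant per kelvin, in J/(m^3*K)."""
    return density * cp

water = volumetric_heat_capacity(WATER_DENSITY, WATER_CP)
oil = volumetric_heat_capacity(OIL_DENSITY, OIL_CP)

print(f"water: {water:.0f} J/(m^3*K)")
print(f"oil:   {oil:.0f} J/(m^3*K)")
print(f"water carries ~{water / oil:.1f}x more heat per litre of flow")
```

At the same flow rate, water moves roughly 2.5x more heat per degree of coolant temperature rise, which matches the observation above: an oil loop needs higher flow or bigger radiators to keep up.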
Phase change? Not exactly quiet, though. I've never been in a quiet server room in my life, so I think moving them to the shed is the best solution too!
The only practical way to silence 1U servers is either to use passive mini-ITX motherboards or to not use a 1U chassis at all. They're designed for temperature-controlled rooms where noise isn't an issue.
Same - our server room here is ridiculously noisy (& air-conditioned) - would be interested to hear if your mineral oil idea comes to fruition!
So are the R&D teams who have a hand in designing these servers. It's just not gonna happen - they need performance cooling, and performance is LOUD.
LennyRhys, what are you running on these servers? Can you not buy one bigger, more resourceful server, stick VMware ESXi on it, and virtualise the lot? That would save you lots of electricity, noise and heat.
Change your rack for a soundproof one, or add soundproofing to your current one. Look at this 9U soundproof cabinet.
As others said, the solutions are: 1) don't use 1U; 2) get a climate-controlled, soundproof room and put your servers there; 3) don't use 1U. And the last alternative - don't use 1U.
Server system routing: this is the server layout and how it basically works for the unit; some things have been removed from the list. The servers reside in the basement of the structure.

I'm going to guess that many of you haven't been to very many server sites. While the majority of them are 'loud', there are plenty that aren't, such as the U.S. Border Patrol, some law enforcement, and others. I know because I've been in these server rooms doing replacements before (Border Patrol is the least friendly I've ever been in). While I can't discuss in detail what resides in those rooms, I can say that they have many 1U units, and they run pretty silently.

As far as performance goes, isn't that an issue or concern you take into consideration when building a modded PC? I'd think you'd want the PC as cool as possible with as little noise as possible, so that you're not continually replacing parts in your mod. I've seen many mods using 'standard' boards where space was tight, and yet they still managed to keep a low noise threshold for the machine itself. When I wire a desktop, I try to get the best airflow in the system possible, so that it keeps cool and runs well without sounding like a 747 - and yes, that board has all its power wires, drive cords, and front LEDs hooked up.

Now, I know there are low-profile water cooling units for the CPUs that would fit within a 1U server case, such as the one below, but I was hoping to get better ideas from the modders than "you can't" - honestly, I didn't believe that was even part of your vocabulary...

Low-profile CPU water cooling: http://www.frozencpu.com/products/1...ile_-_Sockets_77511561366.html?tl=g30c325s844

Frankly, I'm not really willing to settle for a 'you can't' either. I know you 'can' - it's just a matter of finding a working solution.
http://www.thermaltakeusa.com/Category.aspx?S=1359 - Thermaltake has multiple 1U air-cooling solutions available, for anyone that might be interested. But I'd like to look into other possibilities besides air cooling. I might just go ahead and plan out a water/liquid-cooled system, grab a few of these servers off eBay (so I don't fry the main ones), and do some testing to see just what I can achieve.
Squirrel-cage fans. If you can find one long enough to replace the row of 40mm fans, you're golden. I briefly worked at a place where we fitted these into all of the 1U racks. They had a very similar design, with the fans placed in the mid-section with a duct. Being near them was much nicer afterwards: one big fan was much quieter and offered much better cooling. I believe they cost around $20 each.
Could standard water cooling not be a perfect solution here? A row of large, externally mounted radiators would allow for quiet, efficient cooling of the hardware...
Be nice if my posts actually got posted when I wrote them, rather than showing up 4-12 hours later... ugh, the woes of being a new user on a forum...

Concept idea for water cooling: the large boxes on the sides would be the pump housing. The cooling liquid would be pumped through all six units and through all six reservoirs, so that the fluid that started at unit 1 would pass through units 2-6 before it reached unit 1 again. Just one idea - it's a cheap 3D rendering, but it's done to scale; I threw it together quickly.
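One thing to watch with a single series loop like that: each server adds heat to the coolant before it reaches the next one, so the last unit in the chain always sees the warmest water. A quick sketch of the cumulative temperature rise, assuming a made-up 300 W heat load per server and a 4 L/min water flow rate (both illustrative guesses, not figures from this thread):

```python
# Cumulative coolant temperature rise through a 6-server series loop.
# Heat load per server and pump flow rate are illustrative assumptions.

HEAT_PER_SERVER_W = 300.0   # assumed heat dumped into the loop per 1U server
FLOW_L_PER_MIN = 4.0        # assumed pump flow rate
WATER_DENSITY = 998.0       # kg/m^3
WATER_CP = 4186.0           # J/(kg*K)

# Convert litres/minute to kg/second of water.
mass_flow = FLOW_L_PER_MIN / 60.0 / 1000.0 * WATER_DENSITY  # kg/s

# Temperature gained by the coolant while passing one server: Q = m_dot * cp * dT
rise_per_server = HEAT_PER_SERVER_W / (mass_flow * WATER_CP)  # K

for unit in range(1, 7):
    inlet_rise = (unit - 1) * rise_per_server
    print(f"unit {unit}: coolant arrives {inlet_rise:.1f} K above loop start")
```

With those numbers the last server's inlet runs about 5 K warmer than the first server's, which is the main argument for plumbing the units in parallel instead of in series.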
Maybe I'm stupid, but AFAIK no server racks give you space to put the external cooling. Plus, this setup kind of defeats the purpose of the rack system - the fact that if something goes wrong, you pull out the failed unit, put in a new one, and you're back up with minimal downtime.
AFAIK, unless the rack is completely full (which sounds unlikely in this case), there's usually a large empty spot near the bottom. This could be used to house rads and pumps. Some quick-release fittings mounted in the back of each unit would minimize hassle when it comes to servicing.
To be honest, you're probably better off just putting up with the noise. If this is a home installation, I'd recommend searching for some 2U chassis at minimum, 3U preferably. Apparently these are quiet: Scythe Mini Kaze Ultras. Though I don't know what the retailers are like.
The chassis/mainboard for the ProLiant DL145 are proprietary, so switching them to a 2U case really isn't an option - unless I could find a 2U case that is totally uncut on both the front and back panels (with the exception of the add-on slots), so that I could mark and cut the needed rear and front hookups accordingly. Same with switching the units to a standardized PC-format mainboard: cost beats purpose, as I'd still have to devise a cooling solution and buy new power supplies on top of that. The only usable parts from the ProLiant servers would be 1) CPUs, 2) memory, 3) hard drives, 4) video card. All other items would have to be replaced. If I were going to go to that length, why not just replace each server instead? Old internal hardware tends to fail when mixed with new internal hardware.

As for the rack itself: I'm not worried about the rack, and yes, you're correct that most racks don't allow for side space for storage. There are a few out there that do, but they're in the 5000 range. Space for the water tanks/cooling is available in the area where the servers sit, so this isn't an issue, and holding trays/shelves could be mounted to the side of the unit to hold these items.

As for defeating the purpose of being able to quickly remove a unit: it wouldn't, really. You'd basically just have two additional lines to disconnect from the back of the system. "But it's water cooled - you'd have to drain all of them." Actually, we thought about that: shutoff/release valves could be added to the system to allow for quick release when removing a unit, as well as shutting off the water. If one wanted to get a little more advanced, one could add flow control, so that when you shut off the flow to one unit, the flow to the remaining units is redirected and the pattern continues uninterrupted.
That last paragraph is pretty much what I was thinking, and I still think it's the best option for you. There are quick-disconnect fittings that automatically shut off both ends when disconnected; this eliminates the need for extra valves and extra steps when disconnecting a unit. It also minimizes the amount of air that gets into the system each time a unit is removed from the loop. Running all the servers in one big parallel loop would allow you to "hot swap" units without shutting down the pump(s), and thus the flow to the other servers on the rack. Just keep in mind that in order to get any benefit from water cooling, you're going to need a considerably large surface area to dissipate all that heat without requiring noisy, high-speed fans.
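To put a number on that surface-area point: the radiators ultimately hand the heat to air, so the airflow you need follows from the total heat load and how much you let the air warm up crossing the rads. A rough sketch, again assuming 300 W per server (6 servers, 1800 W total) and a 10 K air temperature rise - both assumed figures, not from this thread:

```python
# Airflow a radiator bank needs in order to shed the total heat load.
# Heat load and allowed air temperature rise are assumptions.

HEAT_TOTAL_W = 6 * 300.0   # assumed 300 W per server, 6 servers
AIR_DELTA_T = 10.0         # K the air may warm while crossing the radiators
AIR_DENSITY = 1.2          # kg/m^3 at ~20 C
AIR_CP = 1005.0            # J/(kg*K)

# Q = m_dot * cp * dT, solved for the required air mass flow.
mass_flow = HEAT_TOTAL_W / (AIR_CP * AIR_DELTA_T)   # kg/s of air
volume_flow = mass_flow / AIR_DENSITY               # m^3/s
cfm = volume_flow * 2118.88                         # 1 m^3/s ~= 2118.88 CFM

print(f"air mass flow:   {mass_flow:.2f} kg/s")
print(f"air volume flow: {volume_flow:.3f} m^3/s (~{cfm:.0f} CFM)")
```

That works out to roughly 300+ CFM in total - trivial for a few large, slow fans spread across big external radiators, but exactly the demand that forces tiny 40mm fans in a 1U chassis to scream.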