Okay, so I can report that the hack to get SLI working on a non-SLI board works; however, it does not work with the latest NVIDIA drivers. I am currently using v260.99 without issue, though I should note that I am running without an SLI bridge for now. My first Vantage run gave me a score nearly double what my single GTX285 produced. All settings were at default, and as soon as the SLI bridge I have ordered arrives I will run some proper benchies at various levels of OC and post the results here. There is some discussion about what performance hit you take by not using a bridge, but I guess I will find out. Now, though, I must game.
It was truly fun playing games last night with what has effectively become something like a new rig. The next step is obviously overclocking the graphics cards, so I need to find a water block for that second card quickly. As I inspected the cards using NVIDIA Inspector, I found that my new card (ASUS Matrix GTX285) runs a newer BIOS than my old card, and it also runs its GPU at a higher clock rate (662MHz vs. 648MHz). So I set my old card to the same speed as the new one and left it at that, as I had to retire for the night. If I find time to play with the rig tonight, I will update the BIOS on my old card (and possibly the new one, in case there is a newer BIOS around) and see how far I can push the clocks with one of the cards on air. The SLI bridge I ordered will hopefully be delivered Saturday, and then we will see how much of a difference it makes. With one GTX285 at default speeds I scored just north of 14k in Vantage. With the GTX285s in SLI at default speeds I scored just shy of 23k. It'll be interesting to see how much more I can get from the cards.
I now know the difference between running SLI with and without the SLI bridge. Yes, I finally received my long-awaited bridge (it was shipped alone in a 40x25x15cm box - insanely wasteful!). Without the bridge my two GTX285s scored just shy of 23k in Vantage, whereas with the bridge they scored just shy of 24k at default clocks. So, in other words, it does make a difference - not much of one, but a difference nonetheless. Now it's time to up the clocks a bit.
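For a rough sense of what these numbers mean, here is a quick back-of-the-envelope calculation of the SLI scaling, using the approximate scores quoted above (14k, 23k and 24k are rounded figures from my posts, not exact results):

```python
# Rough SLI scaling from the approximate Vantage scores quoted above.
single = 14_000         # one GTX285 at stock, "just north of 14k"
sli_no_bridge = 23_000  # SLI without a bridge, "just shy of 23k"
sli_bridge = 24_000     # SLI with the bridge, "just shy of 24k"

gain_no_bridge = sli_no_bridge / single - 1   # extra performance from the second card
gain_bridge = sli_bridge / single - 1
bridge_uplift = sli_bridge / sli_no_bridge - 1  # what the bridge alone adds

print(f"SLI gain without bridge: {gain_no_bridge:.0%}")  # ~64%
print(f"SLI gain with bridge:    {gain_bridge:.0%}")     # ~71%
print(f"Bridge uplift over bridgeless SLI: {bridge_uplift:.1%}")  # ~4.3%
```

So the bridge itself is worth only a few percent here, which matches the "not much of a difference" impression.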
I'm amazed at how little difference the bridge makes... almost seems pointless. I suppose on more powerful cards it would have a bigger impact (more power to fill the available bandwidth). Any luck on a second water block?
Probably, and at least if you go tri-SLI. As for a second water block; no luck so far. The hunt continues.
I have finally found, and purchased, a second EK waterblock for my second 285. I was never able to find a cheap second-hand one, but I lucked out in that EK are now selling these blocks at reduced prices on their own webshop. Unfortunately they were out of stock on the acrylic/copper version, so I bought the acetal/copper variety. All in all I paid the equivalent of roughly £46 including shipping, which isn't too bad. I expect the block to be delivered some time next week.
Sometimes these things are not as easy as you imagine them to be - I'm sure you have all experienced something similar. Silly me did not do sufficient research before purchasing my second GTX285, and as such I bought an Asus Matrix GTX285 which I got rather cheap. Thinking that adding a waterblock to this second card would be as easy as, well, adding the waterblock to my first card, I just set about acquiring the needed WB. However, it turned out that the Matrix uses a non-standard PCB, so no full-cover block would fit to cool all the necessary parts. After much cursing and pulling of hair I took a gamble and got an EK rev.1 block, thinking that at least the copper part would fit, and that I could then either modify the aluminium extension or simply leave it off. This is where things got interesting.

So, it turned out the rev.1 EK block did fit, albeit only the copper part, as suspected. However, it was a really tight fit, and certain things needed to be modified in order to make it work. In the picture above you can see that I have taken the original cooler off the card, and the EK block has been mounted. There is also a small heatpipe cooler sitting next to the card; though originally marketed as a mobo mosfet cooler, I have found another use for it.

This next picture shows how tightly the block fits; it's just fractions of a millimetre away from shorting out those components. Also, below it is a row of small chips which will require cooling. This is where I plan to put the heatpipe mosfet cooler to work, as it looked like it would fit nicely.

This is what I have in mind, but the mosfet cooler's coldplate will need to be resized a bit in order for it to work. In its original size and form it would sit on top of a row of resistors without really accomplishing anything useful. Also, as the above image shows, the EK block needed to be modified slightly as well.
A small protrusion interfered with a component on my Matrix (or maybe the same component sits at a different height on the Matrix than on the reference GTX285), so I had to file that down. The above image was taken after this was done, and now it fits nicely.

In this picture you can see where I will need to cut the mosfet cooler's coldplate for it to fit the row of components requiring cooling. Out comes the Dremel, and a few minutes later we have a properly resized heatpipe cooler to go onto my Matrix.

Unfortunately I did not take any pictures during the rest of the fitting process, but I also found out that the Matrix is slightly taller than the reference GTX285, so I had to grind off a portion of the barbs so that the card would fit with the barbs in place. Of course I could have mounted the barbs on the other side of the block, but that would be less elegant, and it would still require me to grind off parts of the plugs that go in the other holes on the block.

That last image up there shows the completed assembly. I glued the heatpipe cooler to the card using Arctic Silver Thermal Adhesive, and the block was tightened using the Matrix backplate and the standard screws that came with the EK block. If you look closely (and perhaps squint a little) at the picture above, you can see where I had to grind off bits of the barbs in order to fit the card to the block. Nothing major, but I was concerned that this would affect the water seal.

A couple of close-ups of the heatpipe cooler glued to the card. In the last image you can just spot the row of resistors which required me to resize the coldplate. And finally a glory shot of everything mounted in my case and connected to the loop. Unfortunately I had to remove my X-Fi in order to make this fit, but I will reinstall it using a PCI extender whenever I can get my hands on one. I have now run this system for a few days, and so far I have no leaks.
It is perfectly silent apart from my multitude of drives spinning, and I am getting excellent temps all over. Until next time. Ta-ta.
You're out of tea! All jokes aside, it looks good, and I'm glad you finally got it set up the way you wanted it. Seems like a lot of work to get the block installed, but the end results are A+.
Thanks mate, and it sure wasn't easy finding time to do it all considering the twins. Oh, and by the way; that was an espresso...
Well, the GD70 has been retired and will, as soon as Bulldozer ships, be turned into a server. I have replaced it with an ASUS Sabertooth 990FX, which I am very happy with so far, though I did get quite the scare as I installed it. I began working on installing the new motherboard Friday night (after a long day at work, little sleep and a few glasses of wine). As you may see from this thread, my WC setup is not easily disassembled, and as such takes quite an effort. Additionally, the PCIe slots are not spaced the same as on the GD70, so I had to redo much of the internals. After all this - getting the mobo installed and reinstalling all the components - I made a newbie error: I forgot to tighten one of the hose clamps, the "input" hose to the first GTX285. So as I started the pump for leak testing, water obviously gushed out of the hose, dripping off my graphics cards. Naturally I turned the pump off right away, but due to the difference in height between the water tank and the computer, water kept flowing, so I had to tighten the clamp while the water flowed in order to stop the leak. Obviously the computer was not powered at this point, but you can imagine the sheer terror I experienced. So, after wiping up about a liter (yes, no exaggeration) of water/antifreeze mixture from the floor, I tilted the case towards me and watched as water poured out from between the GFX cards and their cooling blocks. At this point I wiped it up as well as I could, and went to bed. I left it sitting there to dry out until yesterday evening. Then I took the whole thing apart - again. There are several steps involved in doing this.

Step 1 - emptying the loop
To do this, I've found the most efficient way is to use a foot-operated air pump (which I also use to top up the wheels on our twins' stroller), insert the air hose into one of the water feed hoses outside and force air into the system.
This gets most of the water out, and equalizes the pressure so the difference in elevation is negated.

Step 2 - emptying the internal loop
To do this I have to disconnect the water hose from the copper pipes, and again I blow into the hose to clear out most of the remaining water in the loop. This time I'm indoors, so I just blow the load onto a towel...

Step 3 - disassembly
This is done in the normal way... After taking everything apart, I took the blocks off the GFX cards to inspect the damage. There were still water droplets present, but most of it had either evaporated or been drained away during my initial efforts. I wiped the rest away, gave it a once-over with the missus' hairdryer and stuck the blocks back on. Then I put the whole thing back together, triple-checked all the hose clamps, and turned the pump on. No leaks. Then I plugged everything in and pressed the power button. It turned on, the drives spun up, LEDs flickered, but there was no signal to the monitor. Something was wrong. Had I fried a GTX285? A few tense minutes followed, but then I noticed the DRAM LED stayed lit, indicating something was off with the RAM. So I used the MemOK! button, and voilà - life. Long story short: it works. Everything still works. Phew. And I am looking forward to the day I may finally take possession of an 8150 - because then I'll have to do the whole thing all over again...
I'm thinking of doing a very similar thing... http://forums.bit-tech.net/showthread.php?t=216565 One question that has come up: will condensation be a problem? Have you suffered from this?
Yes, as soon as you hit sub-zero temperatures condensation will form along your coolant pipes. I solved this by insulating the pipes with some form of pipe cladding, and wrapping the blocks with neoprene. Furthermore, I turn the radiator fans off when we get down to negative numbers outside. Even so, I have recorded double-digit negative coolant temperatures while the system was running. Yours seems like an interesting system; I will be watching.
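The rule of thumb is that condensation forms on any surface colder than the dew point of the surrounding air, so you can estimate your margin from room temperature and humidity. A minimal sketch using the Magnus approximation (the example room values of 22 °C and 50% RH are illustrative assumptions, not measurements from my setup):

```python
import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point in °C via the Magnus formula."""
    a, b = 17.62, 243.12  # Magnus coefficients (valid roughly -45..60 °C over water)
    gamma = (a * temp_c) / (b + temp_c) + math.log(rel_humidity_pct / 100.0)
    return (b * gamma) / (a - gamma)

# Example: a 22 °C room at 50% relative humidity.
dp = dew_point_c(22.0, 50.0)
print(f"Dew point: {dp:.1f} °C")  # roughly 11 °C

# Any pipe or block surface colder than this will collect condensation,
# so sub-zero coolant in such a room absolutely needs insulation.
```

Drier air pushes the dew point down, which is why winter ambient helps a little, but it won't save you once the coolant goes double-digit negative.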