Discussion in 'Article Discussion' started by bit-tech, 8 Nov 2018.
£600... for a '70-class GPU. Madness, utter madness.
And when your card requires an anti-sag bracket, that should tell you immediately that there is something fundamentally wrong with the design.
Is it too difficult to ask you to include benchmarks for the 1080 Ti?
You know that gut feeling that tells you the 2030 will be £200, don't you? You try to dismiss it but it never goes completely away.
There's a partner 1080 Ti in all the charts.
My bad, I think I'm going blind... mother did warn me about that.
I find both the price and performance of these RTX cards singularly disappointing, largely because of my use-case, I know, but still.
I've wondered for some time why the old-school supports from the AT days haven't returned. Give cards an official max length, put support slots in every case at that distance, problem solved.
You can't trust AIB cards to actually meet specifications. The reference/FE cards did (until the most recent, which goes a bit over-height), but those are well built enough not to need a support in the first place, making it moot.
Or use some intelligence when designing a cooler, rather than strapping a 1 kg+ block to the card and calling it a day...
Also a good idea.
Edit: but really, that's a bit of a rock/hard place issue. You can't cool a modern graphics adapter with the kind of weight that an unsupported piece of PCB plugged into a card edge connector is meant to bear, any more than you can power one through that dinky card edge connector. There are better and worse approaches to the issue, but ultimately we need a whole new paradigm for graphics installation. I'd love to see nVidia's mezzanine connector become a standard graphics interface. It won't happen, but it'd be nice.
True, but the vapour chamber tech that Sapphire demoed to great effect on the 3870 Atomic seems to have died a death... and that allowed a fairly hot-running dual-slot card to be compressed down into a fairly meagre single-slot cooler... and still run cooler than the stock heatsink! I had one, which I got when Play.com were having a blowout sale. I actually thought the fan was broken when I first set it up, because I'd got used to "turn on the PC, immediately cover ears from turbine noise". It was hilarious that it came in a (poor quality) pseudo-flight case with "Atomic" plastered all over it. Not something you'd want to travel with now, I fear.
Now, I know it won't magically solve the fact that modern GPUs throw out a lot more heat... but I don't remember seeing any more recent cards really using vapour chambers. Have I simply not been paying attention? Or have patents/royalties stopped it? Or are they not good enough for these modern 250W+ TDP cards?
That's a good question.
My best guess is that it is one of aesthetics and marketing. Plain ol' heatpipes let you put those big honkin' fans all down the length of the card, and that implies better cooling to the purchaser, even if it isn't actually true. I gather that vapor chamber coolers work best with blower fans, which don't look that impressive and have a reputation for being rather noisy.
I wouldn't be surprised if the vapor chamber cards just didn't sell as well as something with a larger and cheaper cooler, regardless of actual performance.
I'm not saying they can't do better than they are doing. I'm sure they can. I'm just saying that there's a lower bound to what can be done, and anything beyond that is just slapping band-aids on the problem.
Vapour chambers are in relatively common use (every Nvidia reference/FE card since the NVTTM-type cooler debuted back with the OG Titan). The difference is that the 3870 Atomic had what, 150W to deal with? And with a tiny screaming fan that would not pass muster today. The GTX 1070 Katana would be a modern equivalent (similar power, single slot, teeny weeny fan).
Heatpipes are used in preference to monolithic vapour chambers on AIB cards due to cost (it's MUCH cheaper to bend a few generic heatpipes to the desired shape than to manufacture a bespoke vapour chamber and charge it with working fluid) and performance parity except at extremely high power dissipations (high power, or smaller heatsink volume). As consumers seem perfectly happy to buy triple-slot, over-height, plastic-encrusted glowing monstrosities, there is little market pressure on AIBs to actually build smaller, more efficient coolers.
I could never hear the tiny screaming fan in my 3870 unless I turned off all my case fans and let the system start to gently cook, at which point the fan kicked up to speeds it never otherwise reached. Conversely, I could hear very well the horribly loud fan in my X1900XT, and the replacement X1950XTX, which was better at idle but still very loud at boot.
I'll admit I don't remember reading anything about the NVTTM coolers coming with vapour chambers... but it's been a long time since the OG Titan came out, or since I had time to pore over reviews. Normally now I just jump to thermals, power draw, and performance in the game(s) I care about... and any compute tests, if present. Otherwise I find I really need to test myself to determine whether something is appropriate for my use-case. Thanks for the tip.
Yeah, or the Quadro P4000 or the just-announced RTX 4000. I'm not happy to buy triple-slot monsters, but again, I'm not using these cards for gaming (much). I made the mistake of buying a just-a-touch-over-double-slot card and then had to play musical PCI-E slots on my motherboard to get it in alongside various other bits 'n' bobs without anything fouling one of the fans...
Off the top of my head, the reference/FE cards that used vapour chambers are:
GTX Titan, GTX Titan X, GTX Titan Xp, GTX Titan V, GTX 680, GTX 780, GTX 780 Ti, GTX 980 Ti, GTX 1080, GTX 1080 Ti, RTX 2070, RTX 2080, RTX 2080 Ti. Others, like the GTX 980, use a near-identical heatsink assembly but substitute a plate with multiple embedded heatpipes for the vapour chamber.