Discussion in 'Article Discussion' started by Claave, 29 Oct 2010.
The GTX 480s are not as bad as they're made out to be; anyway, mine don't run very hot at all.
So when can we expect the new "SLi" Ready 2500 watt power supplies to feed these hideous things?
Sorry nVidia - AMD's going to stay on top for a few more quarters.
I'm pretty sure the mountain mods case must play more than a small part in that.
Last time I checked, aren't the 69xx GPUs supposed to run as hot and use as much power as the GTX480/580? If true, the argument is null, and if the GTX580 outperforms it, nvidia basically wins.
Still, truthfully, I have my GTX480 running at 50C right now running F@H on air, so... ya... it's actually not that hot of a GPU in the first place.
As for a purchase, I wish there were new AM3 SLi motherboards so I could have an AMD CPU and dual GTX480s. Since I'm going Intel next upgrade, I can't afford a GTX580. Maybe just another GTX480.
There are a couple of different things that muddle all of that up:
-The 299W GTX480 draws 17% more than the rumoured 255W 6970. If the GTX580 uses the same 299W, then it will need to perform a sizeable 17% better just to achieve the same performance per watt.
-Maximum TDP isn't a perfect measurement of heat. Ambient temperature, case design, cooler design, and any number of other smaller factors can wildly change the same component's temperature depending on its environment. Your own report of 50C could be in a well air-conditioned room, or a case with above-average airflow. You could have a non-reference cooler design which performs much better. Perhaps F@H isn't pushing it as hard as other tests which have shown the card to get hotter. Temperatures only mean something when such variables are accounted for.
-The entire factor of cost. The two primary factors for most video card buyers are performance and price. Power draw, thermals, and noise are all secondary: extra considerations that help buyers make a decision when the primary two factors are too close to call.
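To put numbers on the performance-per-watt point above, here's a quick back-of-envelope sketch. Note the 299W and 255W figures are the rumoured TDPs quoted in this thread, not confirmed specs:

```python
# Rough performance-per-watt break-even check using the rumoured
# TDP figures from this thread (299 W GTX 580, 255 W Radeon 6970).
# These numbers are unconfirmed rumours, not official specs.

gtx580_tdp = 299.0  # watts (rumoured, same as GTX 480)
hd6970_tdp = 255.0  # watts (rumoured)

# If both cards delivered identical frame rates, the 6970 would win
# on perf/watt by the ratio of the TDPs.
tdp_ratio = gtx580_tdp / hd6970_tdp
print(f"GTX 580 draws {tdp_ratio - 1:.1%} more power")  # ~17.3%

# So the GTX 580 needs roughly that much more performance just to
# match the 6970's performance per watt.
required_speedup = tdp_ratio - 1
print(f"Break-even speedup needed: {required_speedup:.1%}")
```

So if the performance gap comes in under ~17%, the 6970 would still lead on perf/watt, assuming those rumoured TDPs hold.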
They didn't run much hotter when they were in the 830 Stacker, TBH. They were maxing in the mid-70s, certainly not hot enough to cremate my granny.
Of course they would get hot when running the Heaven benchmark, mid-80s, but who sits and runs benchmarks all day? Hardly real-life usage, is it?
You people are forgetting that TDP =/= Actual power consumption.
Wow... the nVidia fanbois are all over this! lol
Seriously; this was on Fud TWO days ago, and we all know how reliable he can be... right?
If Bit-tech reckon this card to be 11.5" from the 'pixelised' image, then the only people with a rig big enough to house it are the Democratic People's Republic of China.
We can expect yet another epic battle between a powerful (hopefully, and finally) optimised nVidia GPU and two power-efficient ATi GPUs (not announced yet, but I bet they'll be out before the end of the year).
Bring it on! I could use some cheap upgrade
My 800D will house a GPU that size
Why would a 6.5% shader increase (480 to 512), plus a very generous 10% clock-speed bump, give 30% more performance? Answers on a postcard. If it is indeed just the fully enabled 512SP cards they've been building up stock of, then they'll most likely be insanely expensive and in incredibly short supply, much like the GTX285 last year. Not that they couldn't make large numbers of those: after they EOL'd it, the price went up to pretty much match the 5870, because that way the few that are left stay on shelves. No one wants a GTX285 for £300 when you can get a 5870 for £300, or a 5850 for £200, which is significantly faster than it.
I expect most likely the same, short supply of whatever it is, massive price, but they can keep a few on shelves to make it look like all is fine.
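The scaling estimate in the post above can be written out. This is a naive upper bound, assuming performance scales linearly with shader count and core clock (real-world scaling is usually worse, so it's generous to nVidia):

```python
# Naive upper-bound scaling estimate: assume performance scales
# linearly with shader count and core clock. Real scaling is
# usually worse than linear, so this is a generous ceiling.

gtx480_shaders = 480
gtx580_shaders = 512          # fully enabled GF100/GF110 die
clock_bump     = 1.10         # the "very generous" 10% from the post

shader_gain = gtx580_shaders / gtx480_shaders   # ~1.067, i.e. +6.7%
total_gain  = shader_gain * clock_bump - 1

print(f"Shader gain: {shader_gain - 1:.1%}")    # ~6.7%
print(f"Combined estimate: {total_gain:.1%}")   # ~17.3%, nowhere near 30%
```

Even stacking both gains multiplicatively only gets you to about 17%, which is why a claimed 30% uplift would need something more than extra shaders and a clock bump.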
This raises one large question:
Where the **** have you been checking? 'Cause it certainly isn't BT, or anywhere else where they know what they are doing. See for yourself.
nVidia have produced a massive, hot, power-hungry chip that is considerably underpowered when compared to what ATI have managed to achieve within much tighter limits. Hell, unless the 580 has a true architectural change, we will only be seeing what they originally delayed all those times from over a year ago. The fact that people can still love nVidia is madness. It is likely that this is simply the Fermi that nVidia were boasting about all that time ago. The card we were meant to get.
Of course, there is an argument for simply having the best single GPU on the market, even if it is only by a small margin. But anyone who thinks nVidia did anything other than trip over their own arrogance this round, especially after they knew what to expect following the shock of the 4 Series from ATI, is out of their mind.
AMD have big GPU's too? http://www.rage3d.com/board/showthread.php?threadid=33970194
Actually, I expected better of Bit-tech readers:
The GTX480 was a castrated 512 part because a Fermi with 512 shaders simply couldn't be made: it ran too hot, and too few good dies came off the line. Google it if you doubt me.
So what we are now expected to welcome, 8 months after GF100, is the son of GF100... GF110... GF100AsItWasMeantToBeOnlyWe****edUpEnjoyItNow!
(I would just point out that AMD took 12 months to come up with their 6xxx-series development of their successful 5xxx series, while we are now to believe that nVidia have righted all wrongs in their current series in just 6 months... am I wrong to... doubt?)
You do know you just pointed me at mid-range 68xx cards vs high-end GTX4xx cards in power consumption, right? The only point of comparison between your argument and mine is that the GTX460 uses about 10W more power to run, which doesn't apply anyway, since I was speaking of the 69xx cards, which have an estimated TDP of 255W (and which, like all cards, will really land somewhere north of that by at least 20W).
At least Sloth gave me a real response that made sense to my comment.
@Sloth, you have a point, but since the GTX580 is expected to be between 15-25% stronger than the GTX480, I think what you said is possible and it will be well worth it. Even if it lands at only 16% stronger versus your stated 17%, that is well within margin (and a bit of OCing can close that percentage difference anyway).
I'd tend to agree with you if you're looking at the Fermi from just a gaming perspective. However, Fermi is not just a gaming GPU, it was designed to be much more general purpose than AMD's 5/6 series. Fermi is a researching beast. It simply blows AMD out of the water (in hardware and software) when it comes to GPGPU applications.
Because that is what the majority will be buying a £300 - £400 GPU for, yes?
The majority of what? Gamers? Perhaps not. Factor in researchers and businesses/corporations and then the majority are going with Nvidia/Fermi for GPU based HPC applications. And this is where the money lies.
The 69xx is yet to be released/pictured/benched, so as to whether it runs "as hot and uses as much power as the GTX480/580", absolutely no one, you included, can knowledgeably comment.
And you've got a 480 to run at 50c flat out? How?
Except they won't be buying the Geforce versions of the GPU, will they, since they are for gamers. They will get the professional versions - the one with ECC and all the other necessary stuff - which cost a lot more anyway.