
Hardware Nvidia GeForce GTX 580 to launch soon?

Discussion in 'Article Discussion' started by Claave, 29 Oct 2010.

  1. Teelzebub

    Teelzebub Up yours GOD,Whats best served cold

    Joined:
    27 Nov 2009
    Posts:
    15,796
    Likes Received:
    4,484
    The GTX480s aren't as bad as they're made out to be anyway; mine don't run very hot at all.
     
    Last edited: 29 Oct 2010
  2. Jasio

    Jasio Made in Canada

    Joined:
    27 Jun 2008
    Posts:
    810
    Likes Received:
    13
    So when can we expect the new "SLi" Ready 2500 watt power supplies to feed these hideous things?

    Sorry nVidia - AMD's going to stay on top for a few more quarters.
     
  3. Telltale Boy

    Telltale Boy Designated CPC Jetwhore

    Joined:
    3 May 2010
    Posts:
    982
    Likes Received:
    43
    I'm pretty sure the mountain mods case must play more than a small part in that. ;)
     
  4. ssj12

    ssj12 Member

    Joined:
    12 Sep 2007
    Posts:
    686
    Likes Received:
    1
    Last time I checked, aren't the 69xx GPUs supposed to run as hot and use as much power as the GTX480/580? If true, the argument is null, and if the GTX580 outperforms it, Nvidia basically wins.

    Still, truthfully, I have my GTX480 running at 50C right now folding (F@H) on air, so... yeah... it's actually not that hot a GPU in the first place.

    As for a purchase, I wish there were new AM3 SLI motherboards so I could have an AMD CPU and dual GTX480s. Since I'm going Intel for my next upgrade instead, I can't afford a GTX580; maybe just another GTX480.
     
  5. Sloth

    Sloth #yolo #swag

    Joined:
    29 Nov 2006
    Posts:
    5,634
    Likes Received:
    208
    There are a couple of different things that muddle all of that up:

    -The 299W GTX480 draws 17% more than the rumoured 255W 6970. If the GTX580 uses the same 299W, it will need to perform a hefty 17% better than the 6970 just to match its performance per watt (see the quick sketch after this list).
    -Maximum TDP isn't a perfect measure of heat. Ambient temperature, case design, cooler design, and any number of other smaller factors can wildly change the same component's temperature depending on its environment. Your own report of 50C could be in a well air-conditioned room, or a case with above-average airflow. You could have a non-reference cooler which performs much better. Perhaps F@H isn't pushing it as hard as other tests which have shown the card to get hotter. Temperature comparisons only mean something when such variables are accounted for.
    -The entire factor of cost. The two primary factors for most video card buyers are performance and price; power draw, thermals, and noise are all secondary, extra considerations that help buyers make a decision when the primary two are too close to call.
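
    A minimal back-of-the-envelope sketch of the performance-per-watt point above, in Python. The 299W and 255W figures are just the quoted/rumoured numbers, not measured board power, so treat them as assumptions:

```python
# Rough performance-per-watt check using the figures quoted above.
# 299W (GTX 480/580) and 255W (rumoured 6970) are assumptions, not measurements.
gtx580_power = 299.0
hd6970_power = 255.0

# At equal performance per watt, performance must scale with power draw,
# so a 299W card needs this much more performance than a 255W card:
required_uplift = gtx580_power / hd6970_power - 1
print(f"Required performance advantage: {required_uplift:.0%}")  # ~17%
```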
     
  6. Teelzebub

    Teelzebub Up yours GOD,Whats best served cold

    Joined:
    27 Nov 2009
    Posts:
    15,796
    Likes Received:
    4,484
    They didn't run much hotter when they were in the 830 Stacker, TBH; they were maxing out in the mid 70s, certainly not hot enough to cremate my granny.

    Of course they would get hot when running the Heaven benchmark, mid 80s, but who sits and runs benchmarks all day? Hardly real-life usage, is it?
     
    Last edited: 29 Oct 2010
  7. dangerman1337

    dangerman1337 Member

    Joined:
    2 Sep 2010
    Posts:
    256
    Likes Received:
    5
    You people are forgetting that TDP =/= Actual power consumption.
     
  8. fingerbob69

    fingerbob69 Member

    Joined:
    5 Jul 2009
    Posts:
    801
    Likes Received:
    16
    Wow... the nVidia fanbois are all over this! lol

    Seriously; this was on Fud TWO days ago, and we all know how reliable he can be... right?

    If Bit-tech reckon this card is 11.5" going by the 'pixelised' image, then the only people with a rig big enough to house it are the Democratic People's Republic of China
     
  9. Kúsař

    Kúsař regular bit-tech reader

    Joined:
    23 Apr 2008
    Posts:
    317
    Likes Received:
    4
    We can expect yet another epic battle between a powerful (hopefully, and finally, optimised) nVidia GPU and two power-efficient ATI GPUs (not announced yet, but I bet they'll be out before the end of the year).
    Bring it on! I could use a cheap upgrade :D
     
  10. steve30x

    steve30x New Member

    Joined:
    31 Aug 2008
    Posts:
    93
    Likes Received:
    0
    My 800D will house a GPU that size
     
  11. drunkenmaster

    drunkenmaster New Member

    Joined:
    5 Jan 2003
    Posts:
    48
    Likes Received:
    0
    Why would a 6.5% shader increase (480 to 512), plus a very generous 10% clock-speed bump, give 30% more performance? Answers on a postcard (see the rough sums below). If it is indeed just the fully enabled 512-SP cards they've been building up stock of, then most likely they'll be INSANELY expensive and in incredibly short supply, much like the GTX285 last year. Not that they couldn't make large numbers of those: after they EOL'd it, the price went up to pretty much match the 5870, because that way the few that are left stay on the shelves; no one wants a GTX285 for £300 when you can get a 5870 for £300, or a 5850 for £200, which is significantly faster than it.

    I expect most likely the same: short supply of whatever it is, massive price, but they can keep a few on shelves to make it look like all is fine.
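
    For what it's worth, here is a rough sanity check of that scaling argument as a small Python sketch. It treats performance as proportional to shader count times clock speed, which is a simplification, and the 10% clock bump is just the "very generous" figure assumed above:

```python
# Naive scaling estimate: performance ~ shader count x clock speed.
# The 10% clock increase is an assumption taken from the post above.
shaders_gtx480 = 480
shaders_gtx580 = 512
clock_scaling = 1.10

theoretical_uplift = (shaders_gtx580 / shaders_gtx480) * clock_scaling - 1
print(f"Theoretical uplift: {theoretical_uplift:.0%}")  # ~17%, nowhere near 30%
```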
     
  12. Skiddywinks

    Skiddywinks Member

    Joined:
    10 Aug 2008
    Posts:
    930
    Likes Received:
    8
    This raises one large question:

    Where the **** have you been checking? 'Cause it certainly isn't BT, or anywhere else where they know what they are doing. See for yourself.

    nVidia have produced a massive, hot, power-hungry chip that is considerably underpowered compared to what ATI have managed to achieve within much tighter limits. Hell, unless the 580 has a true architectural change, we will only be seeing what they originally delayed all those times, from over a year ago. The fact that people can still love nVidia is madness. It is likely that this is simply the Fermi that nVidia were boasting about all that time ago; the card we were meant to get.

    Of course, there is an argument for simply having the best single GPU on the market, even if it is only by a small margin. But anyone who thinks nVidia did anything other than trip over their own arrogance this round, especially after the shock of ATI's 4 series showed them what to expect, is out of their mind.

     
  13. frontline

    frontline Punish Your Machine

    Joined:
    24 Jun 2007
    Posts:
    825
    Likes Received:
    12
  14. fingerbob69

    fingerbob69 Member

    Joined:
    5 Jul 2009
    Posts:
    801
    Likes Received:
    16
    Actually, I expected better of Bit-tech readers:

    The GTX480 was a castrated 512 part because Fermi with 512 shaders simply couldn't be made: too hot and too few. Google it if you doubt me.

    So what we are now expected to welcome, 8 months after gf100, is the son of gf100 ...gf110... gf100AsItWasMeantToBeOnlyWe****edUpEnjoyItNow!

    or gf100-AIWMTBOWFUEIN

    (I would just point out that AMD took 12 months to come up with their 6xxx series, a development of their successful 5xxx series, while we are now to believe that nVidia have righted all the wrongs of their current series in just 6 months... am I wrong to... doubt?)
     
  15. ssj12

    ssj12 Member

    Joined:
    12 Sep 2007
    Posts:
    686
    Likes Received:
    1
    You do know you just pointed me at mid-range 68xx cards versus high-end GTX4xx cards for power consumption, right? The only comparison between your argument and mine is that the GTX460 uses something like 10W more power to run, which makes no sense anyway, since I was speaking of the 69xx cards, which have an estimated TDP of 255W (and which, like all cards, will really end up somewhere north of that by at least 20W).

    At least Sloth gave me a real response that made sense to my comment.

    @Sloth, you have a point, but since the GTX580 is expected to be between 15 and 25% stronger than the GTX480, I think what you said is possible and it will be well worth it. Even if it lands at only 16% stronger versus your stated 17%, that is well within the margin (and a bit of overclocking can make up that percentage difference anyway).
     
  16. glaeken

    glaeken Freeeeeeeze! I'm a cawp!

    Joined:
    1 Jan 2005
    Posts:
    2,041
    Likes Received:
    50
    I'd tend to agree with you if you're looking at the Fermi from just a gaming perspective. However, Fermi is not just a gaming GPU, it was designed to be much more general purpose than AMD's 5/6 series. Fermi is a researching beast. It simply blows AMD out of the water (in hardware and software) when it comes to GPGPU applications.
     
    Last edited: 30 Oct 2010
  17. frontline

    frontline Punish Your Machine

    Joined:
    24 Jun 2007
    Posts:
    825
    Likes Received:
    12
    Because that is what the majority will be buying a £300 - £400 GPU for, yes?
     
  18. glaeken

    glaeken Freeeeeeeze! I'm a cawp!

    Joined:
    1 Jan 2005
    Posts:
    2,041
    Likes Received:
    50
    The majority of what? Gamers? Perhaps not. Factor in researchers and businesses/corporations and then the majority are going with Nvidia/Fermi for GPU based HPC applications. And this is where the money lies.
     
  19. fingerbob69

    fingerbob69 Member

    Joined:
    5 Jul 2009
    Posts:
    801
    Likes Received:
    16
    The 69xx is yet to be released/pictured/benched, so as to it running "as hot and using as much power as the GTX480/580"... absolutely no one, you included, can knowledgeably comment.

    And you've got a 480 to run at 50c flat out? How?
     
  20. mute1

    mute1 New Member

    Joined:
    16 Sep 2009
    Posts:
    124
    Likes Received:
    2
    Except they won't be buying the GeForce versions of the GPU, will they, since those are for gamers. They will get the professional versions - the ones with ECC and all the other necessary stuff - which cost a lot more anyway.
     