
Hardware Nvidia GeForce GTX 580 to launch soon?

Discussion in 'Article Discussion' started by Claave, 29 Oct 2010.

  1. smc8788

    smc8788 Multimodder

    Joined:
    23 Apr 2009
    Posts:
    5,974
    Likes Received:
    272
    Yeah, but they're still based on the same underlying GPU architecture. Nvidia is far more interested in the HPC market these days, since that's where the majority of the money is, so it made a GPU that primarily performs well in GPGPU applications. The consumer GeForce versions of the cards, while being what we're most interested in, were more of a secondary priority, since the high-end gaming market is much, much smaller.
     
  2. wafflesomd

    wafflesomd What's a Dremel?

    Joined:
    22 Oct 2005
    Posts:
    1,719
    Likes Received:
    23
    Sweet, another high-performance GPU to run games that previous-gen cards can already run maxed out at 60+ fps!

    We need some software to actually use all this hardware on the market...
     
  3. general22

    general22 What's a Dremel?

    Joined:
    26 Dec 2008
    Posts:
    190
    Likes Received:
    1
    Two words.

    Paper launch
     
  4. rickysio

    rickysio N900 | HJE900

    Joined:
    6 Jun 2009
    Posts:
    964
    Likes Received:
    5
    Err... Crysis?
     
  5. wafflesomd

    wafflesomd What's a Dremel?

    Joined:
    22 Oct 2005
    Posts:
    1,719
    Likes Received:
    23
    Yes, let's make cards just so we can run the pile of mediocrity that is Crysis.
     
  6. jrs77

    jrs77 Modder

    Joined:
    17 Feb 2006
    Posts:
    3,483
    Likes Received:
    103
    I'd rather see them put more development into energy savings than into cards that only a handful of people might need to play games at 2560x1920 with max details.

    I'm still waiting for a GPU as powerful as a G80 or R580 (8800 GTS or X1950) with a maximum TDP of only around 25 W.
    With energy getting more and more expensive, a PC capable of playing a game like Left 4 Dead or Call of Duty 5 at 1280x1024 with medium settings shouldn't draw more than 100 W altogether at load.

    Hopefully the Llano thingies can live up to the expected GPU performance of an HD 5650, so that we at least get near that 100 W mark while still having a reasonably performant system.
     
  7. ssj12

    ssj12 Minimodder

    Joined:
    12 Sep 2007
    Posts:
    689
    Likes Received:
    3
    I'm basing my assumptions on scaling from the 68xx numbers and AMD's leaked/confirmed info.

    And I cleaned the card up, applied new thermal grease, got an EVGA high-flow bracket, and have a PCI slot fan under it. So it fluctuates 1-3°C around 50°C at stock speeds. The rounded base average is still 50°C.
     
  8. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,577
    Likes Received:
    196
    This is something that really needs to be addressed. If we could reach that kind of performance, even X1950 XTX performance, at incredibly low power consumption, we'd have a hell of a breakthrough. Although top-end performance would suffer, technology overall would still progress much more favorably.

    If only, eh?
     
  9. Grape Flavor

    Grape Flavor What's a Dremel?

    Joined:
    23 Sep 2009
    Posts:
    81
    Likes Received:
    3
    480 is said to run at 90˚- 97˚F, and you got it to run at 50˚C = 122˚F?

    Either I'm somehow not comprehending what you said or you sure did a crap job of "upgrading" your cooling.
     
  10. SlowMotionSuicide

    SlowMotionSuicide Come Hell or High Water

    Joined:
    16 May 2009
    Posts:
    835
    Likes Received:
    20
    Obviously you aren't. 90°C equals 194°F, so he did a bloody good job upgrading.

    So good, in fact, that it beats my EK full-cover block hands down, which only manages to keep the card at a measly 70°C (or 158°F) under FurMark.
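
    (For reference, the conversion being argued about in this exchange is °F = °C × 9/5 + 32; a quick, purely illustrative Python check of the figures quoted above:)

        # Celsius-to-Fahrenheit conversion: F = C * 9/5 + 32
        def c_to_f(celsius):
            return celsius * 9 / 5 + 32

        print(c_to_f(90))  # 194.0 - the GTX 480's widely quoted load temperature
        print(c_to_f(50))  # 122.0 - ssj12's reported ~50 C
        print(c_to_f(70))  # 158.0 - the EK full-cover block under FurMark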
     
  11. zr_ox

    zr_ox Whooolapoook

    Joined:
    5 Jan 2005
    Posts:
    1,143
    Likes Received:
    0
    Well said!

    Crysis has become nothing other than a great benchmark tool. It's nearly four years old and I really don't get why people are still so eager to find a GPU that plays it well. Chances are you have already played it, numerous times. Would you really want to relive the experience just to see if you still drop below 30 fps from time to time?

    My ATI 4870 plays it with all of the candy turned on, at 1600x1200, without any trouble whatsoever; a few dips here and there, but nothing to spoil the experience. I added a second 4870 and it never skipped a beat.

    To be honest, I expect more from what we are seeing in the current generation of cards. There is so little to be gained that upgrading seems pointless. Even with 24" screens, if you have a GPU from the last two generations then chances are you're fine.

    I'm still running an Intel 9550 (@stock) and 8GB of memory with a 4870 (@stock), and it runs everything I can throw at it. Sure, it's got everything to do with my native screen resolution, but I will not be upgrading anytime soon because it's simply not worth it.

    But let's play devil's advocate and blame the game developers, because at the end of the day, if they don't build anything demanding then hardware developers will never be forced to innovate!
     
  12. Mraedis

    Mraedis Minimodder

    Joined:
    5 Sep 2009
    Posts:
    153
    Likes Received:
    0
    You just got your Fahrenheit and Celsius mixed up in the "said to run at" there. ^^

    WHY would they whine about a 'hot card' if it ran at 90 Fahrenheit...
     
  13. Paddy4929

    Paddy4929 NangO-Gamer

    Joined:
    24 Apr 2009
    Posts:
    136
    Likes Received:
    3
    I may get one of these cards, pending the review from Bit-Tech. I've been saving my money for quite some time so I can upgrade from my GTX 295. My only gripe is that the next game I will be buying is CoD: Black Ops, which will more than likely run perfectly smoothly on my GTX 295. Maybe I should save my dough.
     
  14. confusis

    confusis Kiwi-modder

    Joined:
    5 Jan 2006
    Posts:
    2,406
    Likes Received:
    63
    Nvidia is only in that position because they pay researchers (Stanford, etc.) to program for CUDA, and OpenGL gets second place for development time...
     
  15. Xtrafresh

    Xtrafresh It never hurts to help

    Joined:
    27 Dec 2007
    Posts:
    2,999
    Likes Received:
    100
    Well, not MY money, that's for sure. Last time I checked, this was a website targeted at consumers, mostly gamers. So Nvidia prioritising GPGPU development is actually another reason for me to go ATI.

    About this 580 card... I'll believe it when I see it, but so far it's just Nvidia trying their hardest to steal some marketing thunder and keep the fanbois on board before the 69xx series arrives.

    Also, if it's just Fermi with all 512 cores/shaders/thingies/whatever enabled, it should be called the GTX 490. Have we gotten so used to Nvidia rebranding their chips that nobody calls them out on it anymore?

    Anyway, since it's Nvidia: see first, believe later.
     
  16. fingerbob69

    fingerbob69 Minimodder

    Joined:
    5 Jul 2009
    Posts:
    801
    Likes Received:
    16
    "But let'splay the devils advocate and blame to game developers, because at the end of the day if they dont build anything demanding then hardware developers will never be forced to innovate!"

    Isn't the problem the over way round? The hardware is moving on, performance up 10-20% pa while games are released that still in DX9 while other software has to be 32 and 64 bit compatible and run in XP?

    Maybe someone BIG, like a M$ or Valve to say that they're leaving some of these relics behind which would drive software innovation and of course, hardware sales.
     
  17. Snips

    Snips I can do dat, giz a job

    Joined:
    14 Sep 2010
    Posts:
    1,940
    Likes Received:
    66
    So it's pure speculation then? Exactly the point I was making.
     
  18. Telltale Boy

    Telltale Boy Designated CPC Jetwhore

    Joined:
    3 May 2010
    Posts:
    989
    Likes Received:
    44
    The point I was making was that while it is speculation, it isn't completely unfounded - there is a cause for concern. And it definitely wasn't speculation based on the picture as your post seemed to suggest.

    As they made similar speculation in the article, I don't see why we aren't allowed to speculate too.
     
  19. Krayzie_B.o.n.e.

    Krayzie_B.o.n.e. What's a Dremel?

    Joined:
    2 Aug 2009
    Posts:
    427
    Likes Received:
    6
    Whether you're an Nvidia loyalist or an AMD loyalist, either way I don't see the GTX 580 running faster than two GTX 460s in SLI or two HD 6870s in CF, so what's the point? Unless you plan on buying two GTX 580s for SLI, which then again is pointless because the price drop will make a GTX 480 SLI setup more affordable.

    So far there's no news on any re-engineering of this better-late-than-never 512 CUDA core card, so it's safe to say it will require its own nuke reactor to power and create more heat than the Sun. GTX 580 SLI: "have fun buying that 2500 W PSU".

    GTX 580: the Hummer of video cards.
     
  20. Xtrafresh

    Xtrafresh It never hurts to help

    Joined:
    27 Dec 2007
    Posts:
    2,999
    Likes Received:
    100
    You are only allowed to speculate that it will be awesome and that ATI might as well not launch the 6900 series now. :thumb:
     