
Graphics Most suitable GPU config for gaming @ 2,560 x 1,600

Discussion in 'Hardware' started by dead beat, 21 Dec 2010.

  1. Teelzebub

    Teelzebub Up yours GOD,Whats best served cold

    Joined:
    27 Nov 2009
    Posts:
    15,796
    Likes Received:
    4,484
     
    Last edited: 23 Dec 2010
  2. JaredC01

    JaredC01 Hardware Nut

    Joined:
    24 Nov 2002
    Posts:
    1,259
    Likes Received:
    62
    Not sure if it's just an AMD issue then, because I've had microstutter since I installed the second 5870. With any luck my GTX 580 should be showing up today (it's scheduled for today, though it's still en route to Alaska from Cali, so I'm not getting my hopes up).

    I will say that microstutter doesn't seem to be an issue if the framerate in game is equal to the refresh rate of the monitor. Basically, I have the option to remove all AA (which is why I have the second card in the first place) or deal with the microstutter... Back to a single card for me. :)
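
    To illustrate what micro-stutter does to an "average" framerate, here's a rough sketch (the frame times are made up purely for illustration, not measured from my cards):

    ```python
    # Hypothetical frame times in ms: a dual-GPU setup that alternates between
    # a fast and a slow frame (the classic micro-stutter pattern) vs even pacing.
    stuttering = [8.0, 25.3] * 30   # 60 frames, uneven pacing
    smooth = [16.7] * 60            # 60 frames, even pacing

    def summarise(frame_times_ms):
        total_s = sum(frame_times_ms) / 1000.0
        avg_fps = len(frame_times_ms) / total_s
        worst_fps = 1000.0 / max(frame_times_ms)  # what the slowest frames feel like
        return avg_fps, worst_fps

    for label, times in (("stuttering", stuttering), ("smooth", smooth)):
        avg_fps, worst_fps = summarise(times)
        print(f"{label}: {avg_fps:.0f} fps average, {worst_fps:.0f} fps at the slow frames")
    ```

    Both report roughly 60fps on the counter, but the uneven one is effectively pacing at ~40fps half the time, which is presumably why it's far less noticeable when the framerate is pinned at the refresh rate.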
     
  3. Yslen

    Yslen Lord of the Twenty-Seventh Circle

    Joined:
    3 Mar 2010
    Posts:
    1,966
    Likes Received:
    48
    Erm... a 6870 crossfire setup walks all over a single 580.

    Results from Techpowerup, GTX 580 vs 6870x2, in fps;

    BFBC2: 75.9 vs 100.4
    Crysis: 38.5 vs 59.0
    Metro 2033: 24.7 vs 37.1
    Stalker Clear Sky: 58.9 vs 70.6
    DIRT2: 104.3 vs 150.7
    Unigine Heaven: 46.8 vs 55.6

    Plus two 6870s are cheaper than a single GTX 580.

    EDIT: Looking at the Hexus review even crossfire 6850s or 5850s beat a single 580.
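
    As a quick back-of-envelope summary of those figures (fps taken straight from the list above; the averaging is my own, not Techpowerup's methodology):

    ```python
    # (GTX 580 fps, 6870 CrossFire fps) as quoted above
    results = {
        "BFBC2": (75.9, 100.4),
        "Crysis": (38.5, 59.0),
        "Metro 2033": (24.7, 37.1),
        "Stalker Clear Sky": (58.9, 70.6),
        "DIRT2": (104.3, 150.7),
        "Unigine Heaven": (46.8, 55.6),
    }

    leads = []
    for name, (gtx580, crossfire) in results.items():
        lead = (crossfire / gtx580 - 1) * 100
        leads.append(lead)
        print(f"{name}: CrossFire ahead by {lead:.0f}%")

    print(f"Average lead across these six tests: {sum(leads) / len(leads):.0f}%")
    ```

    That works out to roughly a third faster on average, for less money.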
     
    Last edited: 24 Dec 2010
  4. JaredC01

    JaredC01 Hardware Nut

    Joined:
    24 Nov 2002
    Posts:
    1,259
    Likes Received:
    62
    Yeah, I did a once-over again after I posted that and noticed the same thing. However, after testing my 5870 crossfire setup in multiple benches, then putting the 580 in today and running the same tests, the 580 was consistently faster in most of them. Plus, the microstutter is gone.

    Here's a thread I posted in over at X-S with a Stalker comparison...

    http://www.xtremesystems.org/forums/showthread.php?t=263806

    I don't have pics of it, but Heaven produced similar differences.

    Also, for the OP, did a few tests today... Fallout: New Vegas maxed out details with max AF and 4x AA resulted in 60+ FPS consistently in the short time I was testing it. Similar results with other games as well, very playable with maxed details and decent AA.
     
  5. Yslen

    Yslen Lord of the Twenty-Seventh Circle

    Joined:
    3 Mar 2010
    Posts:
    1,966
    Likes Received:
    48
    I'd be worried if you couldn't get a solid 60fps from FO:NV with a GTX 580; it's the same engine as Oblivion, and I could get 60fps from that maxed out on my old 4850. I play Oblivion at the moment with ultra-high-res textures (modded) and 8xAA and get 60fps from my 6870.

    STALKER and the heaven benchmark are probably the two best tests to make a 580 look great against crossfire 5870s. The former needs LOTS of vram at that resolution, so the two 1GB ATI cards fall behind. The latter is very very tessellation heavy, so the far superior tessellation units in the NVidia card again give it the edge.
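
    For a rough idea of why 1GB gets tight at 2560x1600, here's a back-of-envelope estimate (the buffer list and byte sizes are simplified assumptions, not what the engine actually allocates):

    ```python
    WIDTH, HEIGHT = 2560, 1600
    BYTES_PER_PIXEL = 4      # 8-bit RGBA colour; depth/stencil also 4 bytes
    MSAA_SAMPLES = 4

    pixels = WIDTH * HEIGHT
    colour = pixels * BYTES_PER_PIXEL * MSAA_SAMPLES   # multisampled colour buffer
    depth = pixels * BYTES_PER_PIXEL * MSAA_SAMPLES    # multisampled depth/stencil
    resolve = pixels * BYTES_PER_PIXEL                 # resolved back buffer
    gbuffer = pixels * BYTES_PER_PIXEL * 4             # deferred renderer: ~4 extra render targets

    total_mb = (colour + depth + resolve + gbuffer) / (1024 ** 2)
    print(f"Render targets alone: ~{total_mb:.0f}MB, before a single texture is loaded")
    ```

    Pile high-res textures on top of that and a 1GB card starts swapping over the PCIe bus, which is where the 580's 1.5GB helps.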

    In the vast vast majority of games (pretty much anything except Metro 2033) the crossfire setup is considerably better.

    Still, both options give you great frame rates and the single card has the advantage of upgradability, no micro-stutter and physX support, so if you can afford it (I sure can't!) it's a great choice!

    If the OP can afford a 580 it's the obvious way to go, though there's certainly no reason you couldn't game with a much cheaper card. Even a 6850 will get you perfectly playable framerates in games like Battlefield BC2 at 2560x1600.
     
    JaredC01 likes this.
  6. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,994
    Likes Received:
    714
    dual-GTX580....MOAR POWAH!

    A single one should do, but if you go dual I wouldn't take the GTX 570 route, as its memory will be a limiting factor; try two GTX 480s if your budget doesn't allow two GTX 580s.
     
  7. JaredC01

    JaredC01 Hardware Nut

    Joined:
    24 Nov 2002
    Posts:
    1,259
    Likes Received:
    62
    The 580 will be fine, and anything in windowed mode will benefit from the added power in the single card (one of the main reasons I switched from twin 5870's), while everything in full screen will still be perfectly playable for quite some time to come.

    Plus, going with a single 580 now leaves room to add a second card down the road if necessary... Starting with a twin-card setup, there's nowhere to go (that makes sense, anyway; triple-card scaling and up is still a less than ideal solution).
     
  8. dead beat

    dead beat Rippin six 4 life

    Joined:
    15 Feb 2009
    Posts:
    1,543
    Likes Received:
    48
    Well I've bought the 30" monitor. I went with the Dell 3007WFP-HC. So I guess I'll just have to see how my 295 holds up for the time being. I expect it'll be replaced with a 580 at some point, but that all depends.
     
  9. xXSebaSXx

    xXSebaSXx Minimodder

    Joined:
    21 Aug 2010
    Posts:
    496
    Likes Received:
    45
    Your 295 is going to jump out of the case and go hide in a closet somewhere once you make it feed that 30" beast for a nice gaming session. Be prepared to go shopping soon!!
    :)
     
  10. dead beat

    dead beat Rippin six 4 life

    Joined:
    15 Feb 2009
    Posts:
    1,543
    Likes Received:
    48
    Yeh I was expecting this to be the case. Next month I'll swap the 295 for a 580. Having said that though, I still think most of my games will be perfectly playable for the time being.
     
  11. Pete J

    Pete J Employed scum

    Joined:
    28 Sep 2009
    Posts:
    7,255
    Likes Received:
    1,822
    Don't be so quick to dismiss the 295 - it has roughly the same power as a 480 and the 580 isn't THAT much of an improvement.

    The 580 will be a lot quieter though!
     
  12. Ph4ZeD

    Ph4ZeD What's a Dremel?

    Joined:
    22 Jul 2009
    Posts:
    3,806
    Likes Received:
    143
    I can't see the 295, with its low VRAM, being able to feed a 30" screen at high settings.
     
  13. dead beat

    dead beat Rippin six 4 life

    Joined:
    15 Feb 2009
    Posts:
    1,543
    Likes Received:
    48
    Which is why I will be upgrading to a 580, with a view to going down the SLI route. 2 x 580's should cope.
     
  14. Pete J

    Pete J Employed scum

    Joined:
    28 Sep 2009
    Posts:
    7,255
    Likes Received:
    1,822
    I could run games like Mass Effect, Bioshock and Fallout 3 quite happily on an 8800GTX with its 'measly' 768MB of memory - antialiasing was out of the question though.
    I do think the 580 is a damn good card, but IMHO you should either only have a single one or go flat out and get three - the scaling for two can be rather atrocious in my experience. This may improve with new drivers however.
     
  15. Ph4ZeD

    Ph4ZeD What's a Dremel?

    Joined:
    22 Jul 2009
    Posts:
    3,806
    Likes Received:
    143
    The scaling for dual card SLI is excellent - 95%+ at high resolutions.

    http://www.guru3d.com/article/geforce-gtx-580-sli-review/6 for reference.
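
    If that ~95% figure holds, the arithmetic is simple enough (the single-card fps below is just an illustrative placeholder, not a benchmark result):

    ```python
    def sli_fps(single_card_fps, scaling_efficiency=0.95):
        """Estimated two-card framerate: the second card contributes
        scaling_efficiency of another full card's worth of frames."""
        return single_card_fps * (1 + scaling_efficiency)

    # e.g. a game running at 45 fps on one GTX 580 at 2560x1600
    print(f"{sli_fps(45):.0f} fps with a second card")  # ~88 fps at 95% scaling
    ```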
     
  16. dead beat

    dead beat Rippin six 4 life

    Joined:
    15 Feb 2009
    Posts:
    1,543
    Likes Received:
    48
    Yeh, I think a pair of the 580s should scale pretty well. I know my AMD platform is not as great for 30" gaming as an Intel platform would be, but I've been fairly happy with how the processor has managed so far.
     
  17. JaredC01

    JaredC01 Hardware Nut

    Joined:
    24 Nov 2002
    Posts:
    1,259
    Likes Received:
    62
    Gotta agree with Pete. The single 580 in my system right now can take anything I can throw at it.

    Bad Company completely maxed out details with 4x AA never drops below 60FPS (V-Sync is on so it's capped), and even with 16xQA the game is playable (though it does hover around 40 FPS average). Batman AA with maxed everything (PhysX, AA, AF, etc) averages 50FPS. Fallout: NV completely maxed never drops below 60FPS V-Sync. Etc.

    Basically, you should be able to play every game out there right now with max details, though you MIGHT have to turn down the AA in some of the games to keep a good framerate.

    Oh, and when you get the 580, overclock it! Currently running 875MHz Core / 1750MHz Shader / 2200MHz (4400 effective) RAM at 1.075v without any hiccups. Runs fine on stock cooler as well. Awesome card. :)

    Edit: Oh, and I haven't pushed the card any further than 875 yet due to cooling. Max I'm hitting right now is about 75°C with a custom fan profile in MSI Afterburner, but I don't want to have to run the fan at full speed (85%) to keep the card cool, so I'm holding off for better cooling. The card itself should be capable of pushing 950MHz without breaking a sweat, so long as proper cooling is in order.
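
    For reference, that memory clock works out like this in bandwidth terms (384-bit is the GTX 580's stock bus width; 4008MT/s is the reference data rate):

    ```python
    BUS_WIDTH_BITS = 384        # GTX 580 memory bus
    STOCK_EFFECTIVE_MTS = 4008  # reference GDDR5 data rate
    OC_EFFECTIVE_MTS = 4400     # 2200MHz (4400 effective), as above

    def bandwidth_gb_per_s(effective_mts, bus_bits=BUS_WIDTH_BITS):
        # transfers per second * bytes moved per transfer across the whole bus
        return effective_mts * 1e6 * (bus_bits / 8) / 1e9

    print(f"stock: {bandwidth_gb_per_s(STOCK_EFFECTIVE_MTS):.0f} GB/s")  # ~192 GB/s
    print(f"OC:    {bandwidth_gb_per_s(OC_EFFECTIVE_MTS):.0f} GB/s")     # ~211 GB/s
    ```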
     
    Last edited: 27 Dec 2010
  18. dead beat

    dead beat Rippin six 4 life

    Joined:
    15 Feb 2009
    Posts:
    1,543
    Likes Received:
    48
    Yeh, I've pretty much resigned myself to the fact that I'm gonna get a 580. But my only concern is that the AMD platform may hold the system back when gaming at this resolution.
     
  19. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,994
    Likes Received:
    714
    High resolution shouldn't matter as far as the CPU is concerned, especially if it's 2560 we are talking about.

    On the quiet Novatech forums, there's a guy running a Phenom I who didn't see any performance increase from the GTX 275 to the GTX 570 he plugged in, but then he was playing at 1680x1050.
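
    A crude way to picture it: each frame takes roughly max(CPU time, GPU time), and only the GPU side grows with resolution (the millisecond figures below are made up purely for illustration):

    ```python
    def fps(cpu_ms_per_frame, gpu_ms_per_megapixel, width, height):
        """Toy bottleneck model: frame time is whichever side finishes last."""
        gpu_ms = gpu_ms_per_megapixel * (width * height / 1e6)
        return 1000.0 / max(cpu_ms_per_frame, gpu_ms)

    # Hypothetical game: 10ms of CPU work per frame, 4ms of GPU work per megapixel
    print(f"{fps(10, 4, 1680, 1050):.0f} fps")  # ~100 fps: CPU-bound, a faster GPU adds nothing
    print(f"{fps(10, 4, 2560, 1600):.0f} fps")  # ~61 fps: GPU-bound, the CPU is no longer the limit
    ```

    That's the same effect as the Phenom/GTX 570 example above: at 1680x1050 the CPU was the wall, at 2560x1600 it's the graphics card doing the limiting.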


    @JaredC01, I wouldn't be so sure about getting 950 without trying it. Mine sits comfortably at 900MHz @ 1.1v, but even at the maximum 1.138v I can't get it to run through 3DMark 11 at 950MHz :(
     
  20. dead beat

    dead beat Rippin six 4 life

    Joined:
    15 Feb 2009
    Posts:
    1,543
    Likes Received:
    48
    So you're of the opinion that my processor should be able to cope with the demands placed on it by the larger monitor?

    I guess that makes sense as it's the graphics power that will need to increase as opposed to the cpu power really.

    Or am I wrong?
     
