
Graphics Tom's Hardware 680 Review leaked

Discussion in 'Hardware' started by Mongoose132, 20 Mar 2012.

  1. Action_Parsnip

    Action_Parsnip What's a Dremel?

    Joined:
    3 Apr 2009
    Posts:
    720
    Likes Received:
    40
  2. feathers

    feathers Minimodder

    Joined:
    11 Apr 2009
    Posts:
    2,535
    Likes Received:
    59
    Actually what they said was quite different....

    http://www.hardocp.com/article/2012/03/22/nvidia_kepler_gpu_geforce_gtx_680_video_card_review/14

    "In fact, we were surprised to find that the GTX 680 not only matched the gameplay experience of the Radeon HD 7970 at 2560x1600, but in most cases the framerates were a bit faster with the GTX 680."

    "We found out quickly that the GTX 680 could hold its own and sometimes dominate the Radeon HD 7970 at 2560x1600."

    "While gaming at 2560x1600 was fantastic, we wanted to push the video card to its limits, and so next we configured an NV Surround triple-display configuration and gamed on three displays from the single GeForce GTX 680. We wanted to be able to run at the native resolution of 5760x1200 and compare the performance to the Radeon HD 7970. We figured if any resolution is going to show the advantages of AMD's memory capacity and memory bandwidth edge it would be 5760x1200. We were absolutely surprised that the GeForce GTX 680 had no trouble keeping pace with the Radeon HD 7970 at 5760x1200. We thought this is the resolution we might see the GTX 680 bottleneck, but to our surprise no bottlenecks were experienced. "

    "After that we tested at 1920x1200. The GeForce GTX 680 is a beast at 1920x1200 or 1080p resolutions. In all of our gaming at 1920x1200 we found the GeForce GTX 680 to be consistently faster than the Radeon HD 7970. The GeForce GTX 680 remained faster than the Radeon HD 7970 even when we turned on 4X or 8X MSAA in games. At 1920x1200 (or 1080p) the GeForce GTX 680 is faster than the Radeon HD 7970 and provides a better gameplay experience."

    "NVIDIA has also surprised us by providing an efficient GPU. Efficiency is out of character for NVIDIA, but surely we welcome this cool running and quiet GTX 680. NVIDIA has delivered better performance than the Radeon HD 7970 with a TDP and power envelope well below that of the Radeon HD 7970. NVIDIA has made a huge leap in efficiency, and a very large step up from what we saw with the GeForce GTX 580. "
     
  3. urobulos

    urobulos Minimodder

    Joined:
    13 Apr 2010
    Posts:
    358
    Likes Received:
    10
    I generally rate OC3D and tinytomlogan's reviews quite highly, especially when it comes to cases. He does seem to be the only one I can find who states the cards are pretty much identical. I'll need to have an in-depth look at his and bit-tech's benchmarks to see what the issue is.


    Hmm, the first thing is that he did test more games. The only two titles that overlap are Dirt 3 and BF3, but the benchmark for Dirt 3 apparently had some issues. The BF3 test on OC3D was done with higher AA settings than the bit-tech one. Might that be the equalizer? Bit-tech ran their benches on a mildly overclocked i2500 while OC3D ran theirs on a 3960X clocked at 4.6 GHz. I don't think the i2500 would make game fps CPU-limited, but I'm not sure. OC3D tested using the ForceWare 300.99 drivers while bit-tech just states "release driver". Does that mean they are the same? Apparently ttl got some pretty wonky results in Dirt 3, so maybe they are using different drivers. Hmm...
     
    Last edited: 23 Mar 2012
  4. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,998
    Likes Received:
    716
    Two interesting articles I just read, with three interesting bits of information:

    2 GHz 680 coming?!
    http://www.techpowerup.com/162935/ZOTAC-Working-On-GeForce-GTX-680-with-2-GHz-Core-Clock-Speed.html

    was originally going to be called 670 Ti
    http://www.techpowerup.com/162901/Did-NVIDIA-Originally-Intend-to-Call-GTX-680-as-GTX-670-Ti-.html

    And from that article, this screenshot interests me. Triple monitor done right: no more switching to single-screen mode for gaming and back again for normal desktop. nVidia's tri-display solution can keep the taskbar on the centre display and restrict window maximise to a single screen.

    Also, does that diagram mean nVidia can drive 3 displays without needing to use DisplayPort? (e.g. 2x DVI and a 3rd monitor on HDMI for NV Surround)
     
  5. urobulos

    urobulos Minimodder

    Joined:
    13 Apr 2010
    Posts:
    358
    Likes Received:
    10
    I'll believe in that 2GHz 680 when it is out in retail. Sounds completely bonkers.


    And from the second article, and especially those diagrams, it does look like Nvidia pushed the GPU up one tier fairly recently. Either because AMD didn't pull very far ahead with its 79xx range and Nvidia wanted a higher profit margin from a smaller GPU, or because they had yield issues with the full-fat one. Might be both.
     
  6. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,998
    Likes Received:
    716
    I'd expect the 2GHz figure is the maximum boost clock; I can't see any chip running 50% faster than its stock speed, to be honest.
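
    As a quick sanity check (a minimal sketch in Python; the launch clocks of roughly 1006 MHz base / 1058 MHz boost are my assumption, not figures from this thread), 2 GHz would be close to double stock, well past a 50% bump:

    # Rough overclock-percentage check, assuming ~1006 MHz base / ~1058 MHz boost
    # for a stock GTX 680 (assumed launch clocks, not quoted in this thread).
    base_mhz, boost_mhz, target_mhz = 1006, 1058, 2000
    print(f"vs base:  +{(target_mhz / base_mhz - 1) * 100:.0f}%")   # roughly +99%
    print(f"vs boost: +{(target_mhz / boost_mhz - 1) * 100:.0f}%")  # roughly +89%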


    Found more info on this Surround business:
    http://www.geforce.com/whats-new/articles/nvidia-surround-on-the-geforce-gtx-680/#2

    So yes, you don't need to use DisplayPort for NV Surround. All you need is 3 cheap DVI monitors (and an HDMI-to-DVI cable; they carry the same signal type).

     
  7. faugusztin

    faugusztin I *am* the guy with two left hands

    Joined:
    11 Aug 2008
    Posts:
    6,953
    Likes Received:
    270
    The issue with 2GHz is that the GPU would be completely starved by the memory bandwidth. The 6GHz you see now on GDDR5 modules is the real maximum you can get with a good enough yield, and that is just barely enough for the 7970/680 as it is. While the GPU could go higher, you'd have nothing to feed it at that frequency.
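
    To put rough numbers on that (a minimal sketch; the 256-bit/384-bit bus widths and 6 GHz/5.5 GHz effective memory clocks are assumed launch specs, not figures from this thread):

    # Peak memory bandwidth = (bus width in bytes) x (effective transfer rate).
    def peak_bandwidth_gb_s(bus_width_bits, effective_mhz):
        return bus_width_bits / 8 * effective_mhz * 1e6 / 1e9

    print(peak_bandwidth_gb_s(256, 6008))  # GTX 680 (assumed specs): ~192 GB/s
    print(peak_bandwidth_gb_s(384, 5500))  # HD 7970 (assumed specs): ~264 GB/s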

    OT: &$#@@ camera battery died right when I got home with my GTX 680 :D...
     
  8. GeorgeStorm

    GeorgeStorm Aggressive PC Builder

    Joined:
    16 Dec 2008
    Posts:
    7,024
    Likes Received:
    565
  9. Guest-44432

    Guest-44432 Guest

    Got to love K|ngp|n and his work.:)
     
  10. GoodBytes

    GoodBytes How many wifi's does it have?

    Joined:
    20 Jan 2007
    Posts:
    12,300
    Likes Received:
    710
    From what I see, he turned the GK104 (GTX 680) into what now looks like the GTX 700 series (GK1x0, not Maxwell) by adding the missing components to the board, and then overclocked the poor little GPU like a mad scientist.
     
  11. dave_beast

    dave_beast Enthusiast

    Joined:
    4 Jan 2011
    Posts:
    67
    Likes Received:
    1
    The 2GHz card was wrong; it's a dual card with two 1GHz GK104 cores.
     
  12. Paradigm Shifter

    Paradigm Shifter de nihilo nihil fit

    Joined:
    10 May 2006
    Posts:
    2,306
    Likes Received:
    86
    Yes.

    I am doing so - nVidia Surround on GTX680 with one Dell U2410 via DVI and two Dell 2405s via DVI and HDMI (HDMI->DVI cable).

    Will be borrowing a DisplayPort capable monitor to test the Surround+1 setup when I get the chance. :D
     
