HardOCP said the 680 was a tad faster; highest playable settings were almost identical between the 680 and 7970.
Actually, what they said was quite different: http://www.hardocp.com/article/2012/03/22/nvidia_kepler_gpu_geforce_gtx_680_video_card_review/14

"In fact, we were surprised to find that the GTX 680 not only matched the gameplay experience of the Radeon HD 7970 at 2560x1600, but in most cases the framerates were a bit faster with the GTX 680."

"We found out quickly that the GTX 680 could hold its own and sometimes dominate the Radeon HD 7970 at 2560x1600."

"While gaming at 2560x1600 was fantastic, we wanted to push the video card to its limits, and so next we configured an NV Surround triple-display configuration and gamed on three displays from the single GeForce GTX 680. We wanted to be able to run at the native resolution of 5760x1200 and compare the performance to the Radeon HD 7970. We figured if any resolution is going to show the advantages of AMD's memory capacity and memory bandwidth edge it would be 5760x1200. We were absolutely surprised that the GeForce GTX 680 had no trouble keeping pace with the Radeon HD 7970 at 5760x1200. We thought this is the resolution we might see the GTX 680 bottleneck, but to our surprise no bottlenecks were experienced."

"After that we tested at 1920x1200. The GeForce GTX 680 is a beast at 1920x1200 or 1080p resolutions. In all of our gaming at 1920x1200 we found the GeForce GTX 680 to be consistently faster than the Radeon HD 7970. The GeForce GTX 680 remained faster than the Radeon HD 7970 even when we turned on 4X or 8X MSAA in games. At 1920x1200 (or 1080p) the GeForce GTX 680 is faster than the Radeon HD 7970 and provides a better gameplay experience."

"NVIDIA has also surprised us by providing an efficient GPU. Efficiency is out of character for NVIDIA, but surely we welcome this cool running and quiet GTX 680. NVIDIA has delivered better performance than the Radeon HD 7970 with a TDP and power envelope well below that of the Radeon HD 7970. NVIDIA has made a huge leap in efficiency, and a very large step up from what we saw with the GeForce GTX 580."
I generally rate OC3D and tinytomlogan's reviews quite highly, especially when it comes to cases. He does seem to be the only one I can find who states the cards are pretty much identical. I'll need to have an in-depth look at his and bit-tech's benchmarks to see where the discrepancy lies. Hmm, the first thing is that he tested more games. The only two titles that overlap are Dirt 3 and BF3, but the Dirt 3 benchmark apparently had some issues. The BF3 test on OC3D was run with higher AA settings than bit-tech's. Might this be the equalizer? Bit-tech ran their benches on a mildly overclocked i5-2500 while OC3D ran theirs on a 3960X clocked at 4.6 GHz. I don't think the i5-2500 would make game fps CPU-limited, but I'm not sure. OC3D tested using the ForceWare 300.99 drivers while bit-tech just states "release driver". Does that mean they are the same? Apparently ttl got some pretty wonky results in Dirt 3, so maybe they were using different drivers. Hmm...
Two interesting articles I just read, with three interesting bits of information. A 2 GHz 680 coming?! http://www.techpowerup.com/162935/ZOTAC-Working-On-GeForce-GTX-680-with-2-GHz-Core-Clock-Speed.html It was originally going to be called the 670 Ti: http://www.techpowerup.com/162901/Did-NVIDIA-Originally-Intend-to-Call-GTX-680-as-GTX-670-Ti-.html And from that second article, this screenshot interests me: triple monitor done right, no more switching to single-screen mode for gaming and switching back for the normal desktop. nVidia's tri-display solution can keep the taskbar on the center display and restrict window maximizing to a single screen. Also, does that diagram mean nVidia can drive 3 displays without needing DisplayPort (e.g. 2x DVI and the 3rd monitor on HDMI for NV Surround)?
I'll believe in that 2 GHz 680 when it's out at retail. Sounds completely bonkers. And from the second article, especially those diagrams, it does look like Nvidia pushed the GPU up one tier fairly recently. Either because AMD didn't pull very far ahead with its 79xx range and Nvidia wanted a higher profit margin from a smaller GPU, or because they had yield issues with the full-fat one. Might be both.
I'd expect the 2 GHz to be its maximum boost clock; I can't see any chip running 50% faster than its stock speed, to be honest. Found more info on this Surround business: http://www.geforce.com/whats-new/articles/nvidia-surround-on-the-geforce-gtx-680/#2 So yes, you don't need DisplayPort to play NV Surround. All you need is three cheap DVI monitors (and an HDMI-to-DVI cable, since they carry an identical signal type).
The issue with 2 GHz is that the GPU would be completely starved by the memory bandwidth. The 6 GHz effective rate you see now on GDDR5 modules is the real maximum you can get with good enough yields, and that is just barely enough for the 7970/680 as it is. While the GPU core could go higher, you'd have nothing to feed it at that frequency. OT: &$#@@ camera battery died right when I got home with my GTX 680...
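To put rough numbers on the bandwidth-starvation point, here is a back-of-envelope sketch. The memory figures (256-bit bus at 6 GHz effective for the GTX 680, 384-bit at 5.5 GHz for the HD 7970) are the public specs; the ~1 GHz stock-boost baseline and the assumption that compute demand scales linearly with core clock are my own simplifications.

```python
# Back-of-envelope check of the "2 GHz core would be bandwidth-starved" claim.
# Spec figures are public; the scaling logic is a deliberate simplification.

def gddr5_bandwidth_gbs(effective_clock_ghz, bus_width_bits):
    """Peak memory bandwidth in GB/s: effective transfer rate times bus width in bytes."""
    return effective_clock_ghz * (bus_width_bits / 8)

gtx680_bw = gddr5_bandwidth_gbs(6.0, 256)   # 192.0 GB/s
hd7970_bw = gddr5_bandwidth_gbs(5.5, 384)   # 264.0 GB/s

# Scale the GTX 680 core from ~1 GHz stock boost to a hypothetical 2 GHz:
# compute demand roughly doubles, but bandwidth stays fixed at 192 GB/s,
# so the bytes available per unit of compute are roughly halved.
stock_core_ghz, oc_core_ghz = 1.0, 2.0
bandwidth_per_clock_stock = gtx680_bw / stock_core_ghz
bandwidth_per_clock_oc = gtx680_bw / oc_core_ghz

print(f"GTX 680: {gtx680_bw:.0f} GB/s, HD 7970: {hd7970_bw:.0f} GB/s")
print(f"Bandwidth per core clock at 2 GHz vs stock: "
      f"{bandwidth_per_clock_oc / bandwidth_per_clock_stock:.0%}")
```

So even before touching the 7970's wider bus, a 2 GHz GK104 would have only half the memory bandwidth per core cycle that the stock card gets, which is why the core alone can't carry such a clock.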
2 GHz seems rather unlikely on air: http://lab501.ro/placi-video/nvidia-geforce-gtx680-partea-ii-studiu-de-overclocking/4 http://kingpincooling.com/forum/showthread.php?t=1681 ~1400 MHz on air, and almost 1900 MHz on LN2.
From what I can see, he turned the GK104 (GTX 680) into something resembling the upcoming GTX 700 series (GK1x0, not Maxwell) by adding the missing components to the board, and then overclocked the poor little GPU like a mad scientist.
Yes. I am doing so - nVidia Surround on GTX680 with one Dell U2410 via DVI and two Dell 2405s via DVI and HDMI (HDMI->DVI cable). Will be borrowing a DisplayPort capable monitor to test the Surround+1 setup when I get the chance.