The card only has one 6-pin and one 8-pin power connector, so I doubt it's a dual-GPU card. I'm more inclined to think it's either a 670 or something entry-level (such as a 640). Rumour is that everything below the 640 will be rebrands, so they wouldn't be newsworthy.
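The reasoning above follows from the PCI Express power spec: the x16 slot supplies up to 75 W, a 6-pin plug is rated for 75 W, and an 8-pin plug for 150 W. A quick sketch of that budget arithmetic (spec ceilings only, not measured draw):

```python
# PCI Express power-budget ceilings per the spec (not measured values).
PCIE_SLOT_W = 75    # power available from the x16 slot itself
SIX_PIN_W = 75      # 6-pin PCI-E plug rating
EIGHT_PIN_W = 150   # 8-pin PCI-E plug rating

board_limit = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(board_limit)  # 300 W ceiling: plausible for one big GPU, tight for two
```

A 300 W ceiling is why a 6-pin + 8-pin layout points at a single-GPU card; dual-GPU boards of this era typically carried two 8-pin plugs for a 375 W budget.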
The GTX 620 is a rebranded 520, maybe slightly overclocked just to say it's different. It's aimed more at OEMs, or at people who want better multi-monitor support than Intel's offering. In terms of raw FPS, Ivy Bridge's GPU is roughly on par with the 520, even a bit faster. But anyone who has gamed on an Intel GPU knows that:
-> Intel has poor driver support (basically, once a new CPU comes out, older drivers stop getting updates beyond major bug fixes)
-> Intel doesn't render everything in games, while Nvidia and AMD draw the full scene at the same settings
-> Intel's effects look nowhere near as good as Nvidia's and AMD's. Basically, everything is cut and driver-optimized for high FPS scores in CURRENT games; later games may not work at all, run like a slideshow, or show graphical glitches.
So Nvidia still has a reason to sell it, with no need to redesign it (which costs money, for a market that doesn't notice performance anyway, because let's be honest, no gamer would buy or recommend this GPU). Compared to Intel's offering, this is really a GPU for people who want a full, proper GPU experience: proper multi-monitor support, full application/game support, good and continuous driver support, and CUDA/OpenCL support. The renamed model number is probably just to please OEMs.
There's a lot of scuttlebutt that the 680 was originally supposed to be the 670 Ti, and that the REAL 680 was an even more massive performer. I kinda hope so.
Well, it's fairly obvious when the NVIDIA drivers detect the GTX 680 as a GTX 670 Ti. AND, for the first time on a "#80" model, the circuit board is missing components, as you can see here: http://images.bit-tech.net/content_images/2012/03/nvidia-geforce-gtx-680-2gb-review/gtx680-7b.jpg Also, both the GK104 and GK110 were heavily rumoured. Turns out the GK104 rumours were true, with no sign of the GK110. I think it's pretty obvious what Nvidia did at the last minute. And again, why wouldn't they? It already beats AMD's best single-GPU offering, and few games (well, OK, maybe less than a handful) can actually push the GTX 680 to its max at the popular 1080p resolution. And most PC games are poorly done console ports.
I love reading all the speculation. I hope whatever is released doesn't disappoint. All I know is I won't be buying the GTX 600 series, simply because I don't need to/can't afford to.
I imagine it doesn't make much of a difference to airflow; most coolers don't move much air in that area. I wonder how it affects heat, though... it seems like the heat would be more concentrated with the power delivery more concentrated, but maybe the plug assembly and PCB are efficient enough that it makes no difference. Also, even if it does run hotter, maybe isolating that heat to the power-plug region helps cool the rest of the card.
There are no resistors there (or anything that acts like one), so no heat. Also, if you look closely at the board design with the heatsink on, the fan sits higher up, not centered like on previous cards. This also helps with the problem of the stacked DVI plugs blocking airflow. Check this video to see what I mean: See at 4min 7sec
As long as it's not GK110 (GTX 685 or whatever they will call it); otherwise, I will be sending my 2x GTX 680 4GB back... (I doubt it very much; otherwise they will p*** off a lot of people and cause worldwide riots!) Lol.
Mmmmm, a GTX 685/GTX 780 would certainly be tasty, but very unlikely for a near-future release. My hope is for GTX 670 Ti and GTX 660 Ti models. I really want to stick something much faster and cooler-running in my wife's machine, as the HD 4870 was only bought as a backup card. The HD 78xx cards are tempting, but the GTX 67x and 66x should hopefully drive a reduction in prices.
Well, if they release something that isn't a dual-GPU card and is faster than the current GTX 680, I will seriously be annoyed. I will then sell my GTX 680, go over to AMD and get a 7970, and never buy another Nvidia card again. I think it might turn out to be the GTX 670 or GTX 690.
I thought the whole point of having two plugs was to spread the electrical load over more pins, because the pins have contact resistance and heat up sharply as they near their maximum rated current. Might be that this is just a danger to the socket and plug plastics, and that the heat generated at that junction is isolated there and thus irrelevant to card temps.
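The load-spreading argument in the post above comes down to Joule heating: dissipation in a contact scales with the square of the current (P = I²R), so halving the current through each path more than halves the heat. A toy sketch with made-up round numbers (the 10 mΩ contact resistance and 20 A draw are assumptions for illustration, not real connector figures):

```python
# Toy illustration of why splitting current across two plugs reduces
# connector heating: dissipation per contact is P = I^2 * R.
R_CONTACT = 0.010     # ohms per connector path (assumed round number)
TOTAL_CURRENT = 20.0  # amps drawn over the 12 V plugs (assumed round number)

one_plug = TOTAL_CURRENT**2 * R_CONTACT              # all current in one path
two_plugs = 2 * (TOTAL_CURRENT / 2)**2 * R_CONTACT   # half the current per path
print(one_plug, two_plugs)  # 4.0 W vs 2.0 W: total connector heat halves
```

Same total current, same per-path resistance, but the split configuration dissipates half the heat, and that heat is also spread over twice the plastic and metal.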
Nope, it just follows the spec of the lowest common denominator among PSUs. Some PSUs (including Antec's) use multi-rail designs, so a single PCI-E power plug can't deliver more power; with a single-rail design, you can pull as much as you want... well, almost. It's a design limitation of multiple rails. Antec (and others doing this) claim it provides increased stability; personally, I don't see the difference compared to other high-end PSUs. So it has nothing to do with wires, plug heat, or current limits; it's more about protecting users from poor-quality PSUs or poor-quality cables used to connect hardware, and about broadening PSU support.