Discussion in 'Article Discussion' started by brumgrunt, 4 Oct 2012.
Would agree with you on this mate.
People complain about personalities and policies at [H]OCP, but I like the fact that their GPU reviews look at the gaming experience and find the highest playable settings that a card can run a game at. Perhaps card A can only enable Bx AA and C texture quality at D resolution; what can the alternatives do?
Having said that, I wouldn't like all gaming sites to start doing their reviews the same way. I like hard numbers and easily digested graphics as well. Having both perspectives gives me a better understanding of how a card performs and what I could expect with a purchase.
You clearly haven't seen the XFX GT 640 2GB.
Double dissipation and double slot cooler to handle the vast amount of heat coming off of that goliath GPU as well as cooling the 2GB of memory being pushed to its absolute limit.
It is a low-profile card. The dual-slot / dual-fan design makes sense if you don't want a noisy, fast-spinning 40mm fan.
Is that a real card or a mock up?
Holy sh**, it's real.
The first thing I thought: AWWWW SO CUTE! But then again, it's made for very low power usage. Hell, they managed to fit an entire mobile version with hardly any gimps into laptops.
This at the very least shows very good integration between the chips. They are all Kepler-derived, and to me that's impressive. Even AMD needed three lines of chips; this is basically just cutting down the same chip, repeatedly.
This card is obviously not meant for the biggest games at the highest settings, so testing it on BF3 at ultra settings is a bit pointless. No one is going to buy the card and use it like that. Include those tests as well if you like, but include at least one benchmark in which the game is playable.
I have a 20-inch screen which I am happy with and play at 1600 x 900. With this card being aimed at the lower end of the market, why do you go with such big resolutions? In my opinion you should use lower-res options on cards like these and have 1920x1080 as the highest option.
Awful. Just awful.
Pretty sure this is intentional, as if Nvidia only really wants you to buy the GTX 670 with its fat and healthy profit margin, so they're just kind of phoning in the rest of their lineup.
And perhaps rightfully so. I mean, this card appeals to NO ONE (except a few errant BTers, apparently). Genuine gamers won't even give this a look, and budget/low-end folk who want a casual experience are better off just sticking with IGP (e.g. after RMAing a GPU I was stuck with no card for a while and found out the Batman: Arkham games run pretty smooth at high settings 720p with a lone i5 2500k - sort of a shocker).
If AMD would unfutz their wretched CCC/driver software and quit limiting the OC voltage on their cards they'd get some serious market share gains...too bad that'll never happen.
When that card came out I was like "lol", the box size is overkill (2GB as well).
The 650 is not too good, but should be OK as long as settings are not high.
I agree with the others. It's pointless to review such a card using very high resolutions and ultra settings...
It would be nice for you to test the folding performance of the GPUs you review.
The folding section seems to get overlooked these days.
Re: testing methodology, we have a choice whether to do apples-to-apples, as we've done, or apples-to-oranges (best playable). The latter takes a great deal longer, especially when you re-test a truckload of games on an all-new test rig as we have, and provides less information for comparisons between the high end and low end.
As such, we have a unified test methodology for GPUs; each has to tackle the same games, so we can fairly compare them. Of course, if you're willing to dial down settings then any game can run smoothly at 1,920 x 1,080, but that's not really the point of buying a new GPU. I know if I spent £100 on a new GPU, I'd expect it to play most of my games to a half-decent level. Defending this card for being OK if you dial the settings down is like saying that high-end cards are pointless - why not just buy a mid-range card and dial the settings down? Why bother with SLI or CrossFire? Just dial the settings down. It completely contradicts the push for performance and quality that bit-tech's ethos is all about.
I know it's silly to include the three-screen numbers; we kind of did it as part of the process, but this card CAN play at three-screen, and surely the revelation that it can't hack the frame rates is a useful conclusion (albeit an obvious one).
Kepler is the name of this generation of nVidia chips, which includes:
GK104 - GTX 690, GTX 680, GTX 670, GTX 660 Ti
GK106 - GTX 660
GK107 - GTX 650, some GT 640s, some GT 630s
GF108 (Fermi-based) - some GT 640s, some GT 630s
AMD has gained much more in their drivers than nVidia this generation. CCC 12.7 helped to close the gap between the 7970 and 680 and propelled the 7970GE above the 680.
Also, most Radeons can be overvolted to at least 1.3V in software. Conversely, nVidia has not allowed overvolting on their Kepler cards, going as far as to stop companies selling certain SKUs: they forced eVGA to stop selling their EVBot and forced MSI to lock down voltage on their Power Edition cards.
nVidia Says No to Voltage Control
What do you think of nVidia locking down voltage?
I understand and agree with the need for a unified methodology and comparable results, but I'm not sure that that's the logical conclusion of the "apples to oranges" argument. The ability to answer the "we all know it can't do that, but what can it do?" question is still useful.
That same complaint could be made about hundreds of components over the years. Simple reason is that best isn't always the one that sells. For reasons such as brand loyalty, name recognition, or simple ignorance people will still end up buying components which have strictly superior alternatives. Nvidia (or any company releasing such a product) are surely aware of this and know they'll still likely make money off of this card if they can advertise and make the card available enough.
I like the size of the card, and the performance would be great for my needs - I need to replace a recently deceased 7900GS in an old PC that is only used for playing older LAN games (CoD 4 is the newest game played!) The only issue this card has is price - if it were around £60, I'd snap it up. There is a market for these cards where power is not required for replacing dead cards, just not at these prices. I'd happily pay over £200 for a new card in my main PC of course. For now, the ATi 6670 offers the best price/performance in the price range I'm looking at. I do love the design of that EVGA card though....
You do know that there are still a ton of gamers not playing at 1080p yet, right? Only 28% of Steam users report using 1080p, and barely anyone uses higher for a single monitor. I have no idea how many people use multi-monitor, but I doubt it's half as many as single-monitor.
It would look really odd for nVidia not to release a card they had previously announced, even if it is overpriced and uncompetitive. Not that this matters too much; it is better to have something at the price point than nothing. By having something there, it muddies any purchasing decision and prompts customers to price-creep up to a GTX 660.
To be frank, the manner in which nVidia has released the card (it just arrives without much attention being drawn to it, overshadowed by the heavily pushed GTX660 cards), combined with the lack of distinctive and custom SKUs available or publicly announced and the uncompetitive pricing, tells me that neither nVidia nor their partners care very much about putting an effort into the GTX650. Most will probably end up in some OEM "Gaming" machine.
Never underestimate the irrationality of slavish devotion to a brand and the successful application of marketing to overcome a testable reality.
The reality is that this card is overpriced for the performance offered, and there are also no attractive or distinct SKUs. All the cards available or publicly planned (and I've been watching) are essentially the same reference board, dual slot cooling and external power connector, meaning there is little reason to choose a GTX650 over a higher performing HD7770 for the same price or cheaper.
And if anyone trots out the tired old canard of "Better nVidia drivers" I think I'll have to scream, rip off my clothes and run down the street waving a tired 9800GT in naked annoyance and frustration.
I wasn't saying to test all the cards at a lower setting. But maybe in this case, the 7770 vs the 650 on medium settings, for example. It would only need to be in one game, really.