
EVGA GeForce GTX 650 1GB review

Discussion in 'Article Discussion' started by brumgrunt, 4 Oct 2012.

  1. Noob?

    Noob? What's a Dremel?

    Joined:
    18 Oct 2009
    Posts:
    3,349
    Likes Received:
    159
    Would agree with you on this, mate.
     
  2. Adnoctum

    Adnoctum Kill_All_Humans

    Joined:
    27 Apr 2008
    Posts:
    486
    Likes Received:
    31
    People complain about personalities and policies at [H]OCP, but I like the fact that their GPU reviews look at the gaming experience and find the highest playable settings at which a card can run a game. Perhaps card A can only enable Bx AA and C texture quality at D resolution; what can the alternatives do?

    Having said that, I wouldn't like all gaming sites to start doing their reviews the same way. I like hard numbers and easily digested graphics as well. Having both perspectives gives me a better understanding of how a card performs and what I could expect with a purchase.
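
    In Python-ish terms, the kind of search [H] runs might look something like the sketch below - run_benchmark() is an assumed stand-in for their actual play-testing, and the preset list and 30fps cutoff are purely illustrative:

        PRESETS = ["ultra", "high", "medium", "low"]  # ordered from best to worst quality
        PLAYABLE_FPS = 30.0  # illustrative playability threshold

        def run_benchmark(card, game, preset, resolution):
            """Assumed helper: benchmarks the game and returns average fps."""
            raise NotImplementedError("stand-in for real play-testing")

        def highest_playable(card, game, resolution):
            """Walk down the presets and return the best one the card can sustain."""
            for preset in PRESETS:
                if run_benchmark(card, game, preset, resolution) >= PLAYABLE_FPS:
                    return preset
            return None  # not playable even on low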
     
  3. Kodongo

    Kodongo What's a Dremel?

    Joined:
    29 Feb 2012
    Posts:
    93
    Likes Received:
    4
    You clearly haven't seen the XFX GT 640 2GB.

    [Image: XFX GT 640 2GB with its dual-fan, dual-slot cooler]

    Double Dissipation and a double-slot cooler to handle the vast amount of heat coming off that goliath GPU, as well as cooling the 2GB of memory being pushed to its absolute limit.

    :duh::duh::duh:
     
  4. GuilleAcoustic

    GuilleAcoustic Ook ? Ook !

    Joined:
    26 Nov 2010
    Posts:
    3,277
    Likes Received:
    72
    It is a low-profile card. The dual-slot / dual-fan design makes sense if you do not want a noisy, quick-spinning 40mm fan.
     
  5. David

    David μoʍ ɼouმ qᴉq λon ƨbԍuq ϝʁλᴉuმ ϝo ʁԍɑq ϝμᴉƨ

    Joined:
    7 Apr 2009
    Posts:
    17,446
    Likes Received:
    5,850
  6. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,577
    Likes Received:
    196
    The first thing I thought: AWWWW SO CUTE! But then again, it's made for very low power usage. Hell, they managed to fit an entire mobile version with hardly any gimps into laptops.

    This at the very least shows very good integration between the chips. They are all Kepler-derived, and to me that's impressive. Even AMD needed three lines of chips; this is basically just cutting the same chip... repeatedly.
     
  7. fdbh96

    fdbh96 What's a Dremel?

    Joined:
    29 May 2011
    Posts:
    1,894
    Likes Received:
    33
    This card is obviously not meant for the biggest games at the highest settings, so testing it on BF3 at ultra settings is a bit pointless. No one is going to buy the card and use it like that. Include those tests as well if you like, but include at least one benchmark in which the game is playable.
     
  8. Paulg1971

    Paulg1971 Minimodder

    Joined:
    24 Apr 2009
    Posts:
    110
    Likes Received:
    0
    I have a 20-inch screen which I am happy with, and I play at 1,600 x 900. With this card being aimed at the lower end of the market, why do you go with such big resolutions? In my opinion you should use lower-res options on cards like these and have 1,920 x 1,080 as the highest option.
     
  9. toolio20

    toolio20 What's a Dremel?

    Joined:
    29 Jan 2011
    Posts:
    59
    Likes Received:
    1
    Awful. Just awful.

    Pretty sure this is intentional: Nvidia only really wants you to buy the GTX 670 with its fat and healthy profit margin, so they're just kind of phoning in the rest of their lineup.

    And perhaps rightfully so. I mean, this card appeals to NO ONE (except a few errant BTers, apparently). Genuine gamers won't even give this a look, and budget/low-end folk who want a casual experience are better off just sticking with IGP (e.g. after RMAing a GPU I was stuck with no card for a while and found out the Batman: Arkham games run pretty smooth at high settings 720p with a lone i5 2500k - sort of a shocker).

    If AMD would unfutz their wretched CCC/driver software and quit limiting the OC voltage on their cards they'd get some serious market share gains...too bad that'll never happen.
     
  10. leexgx

    leexgx CPC hang out zone (i Fix pcs i do )

    Joined:
    28 Jun 2006
    Posts:
    1,356
    Likes Received:
    8
    When that card came out I was like “lol” - the box size is overkill (2GB as well).

    The 650 is not too good, but it should be OK as long as the settings are not high.
     
  11. Narishma

    Narishma What's a Dremel?

    Joined:
    21 Feb 2008
    Posts:
    134
    Likes Received:
    0
    I agree with the others. It's pointless to review such a card using very high resolutions and ultra settings...
     
  12. kirk46

    kirk46 Cheesecake Nom Nom

    Joined:
    9 Jun 2012
    Posts:
    1,263
    Likes Received:
    31
    It would be nice for you to test the folding performance of GPUs you review :)

    The folding section seems to get overlooked these days :(
     
  13. Baz

    Baz I work for Corsair

    Joined:
    13 Jan 2005
    Posts:
    1,810
    Likes Received:
    92
    Hi Guys

    Re: testing methodology, we have a choice of whether to do apples to apples, as we've done, or apples to oranges (best playable). The latter takes a great deal longer, especially when you re-test a truckload of games on an all-new test rig as we have, and it provides less information for comparing the high and low end.

    As such, we have a unified test methodology for GPUs; each has to tackle the same games, so we can fairly compare them. Of course, if you're willing to dial down settings then any game can run smoothly at 1,920 x 1,080, but that's not really the point of buying a new GPU. I know if I spent £100 on a new GPU, I'd expect it to play most of my games at a half-decent level. Defending this card for being OK if you dial the settings down is like saying that high-end cards are pointless - why not just buy a mid-range card and dial the settings down? Why bother with SLI or CrossFire? Just dial the settings down. It completely contradicts the push for performance and quality that bit-tech's ethos is all about.

    I know it's silly to include the three-screen numbers - we kind of did it as part of the process - but this card CAN play at three screens; surely the revelation that it can't hack the frame rates is a useful conclusion (albeit an obvious one).
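
    In the same Python-ish terms as the sketch earlier in the thread, apples to apples just means every card tackles one fixed game/settings matrix - the cards, games and settings below are placeholders, and run_benchmark() is the same assumed stand-in:

        CARDS = ["GTX 650", "HD 7770", "GTX 660"]  # illustrative test group
        TEST_MATRIX = [
            ("Battlefield 3", "ultra", "1920x1080"),  # identical settings for every card
            ("Skyrim", "ultra", "1920x1080"),
        ]

        def run_benchmark(card, game, preset, resolution):
            """Assumed stand-in for a real benchmark run; returns average fps."""
            raise NotImplementedError

        # Every card runs the exact same matrix, so the numbers line up column for column.
        results = {
            card: {(game, preset, res): run_benchmark(card, game, preset, res)
                   for game, preset, res in TEST_MATRIX}
            for card in CARDS
        }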
     
    Zurechial likes this.
  14. Kodongo

    Kodongo What's a Dremel?

    Joined:
    29 Feb 2012
    Posts:
    93
    Likes Received:
    4
    Kepler is the name of this generation of nVidia chips, which include:
    GK104 - GTX 690, GTX 680, GTX 670, GTX 660 Ti
    GK106 - GTX 660
    GK107 - GTX 650, some GT 640s, some GT 630s
    GF108 - some GT 640s, some GT 630s (a rebadged Fermi part)

    AMD has gained much more in their drivers than nVidia this generation. CCC 12.7 helped to close the gap between the 7970 and 680 and propelled the 7970GE above the 680.

    Also, most Radeons can be overvolted to at least 1.3V in software. Conversely, nVidia has not allowed overvolting on their Kepler cards, going as far as to stop companies selling certain SKUs: they have forced EVGA to stop selling their EVBot and forced MSI to lock down voltage on their Power Edition cards.
    nVidia Says No to Voltage Control

    What do you think of nVidia locking down voltage?
     
  15. blackworx

    blackworx Cable Wrangler

    Joined:
    31 Jan 2008
    Posts:
    77
    Likes Received:
    2
    I understand and agree with the need for a unified methodology and comparable results, but I'm not sure that that's the logical conclusion of the "apples to oranges" argument. The ability to answer the "we all know it can't do that, but what can it do?" question is still useful.
     
  16. Sloth

    Sloth #yolo #swag

    Joined:
    29 Nov 2006
    Posts:
    5,634
    Likes Received:
    208
    That same complaint could be made about hundreds of components over the years. The simple reason is that the best product isn't always the one that sells. For reasons such as brand loyalty, name recognition, or simple ignorance, people will still end up buying components which have strictly superior alternatives. Nvidia (or any company releasing such a product) are surely aware of this and know they'll still likely make money off this card if they can advertise it and make it available enough.
     
  17. VaLkyR-Assassin

    VaLkyR-Assassin Minimodder

    Joined:
    16 Feb 2009
    Posts:
    100
    Likes Received:
    0
    I like the size of the card, and the performance would be great for my needs - I need to replace a recently deceased 7900GS in an old PC that is only used for playing older LAN games (CoD 4 is the newest game played!). The only issue this card has is price - if it were around £60, I'd snap it up. There is a market for these cards - replacing dead cards where power is not required - just not at these prices. I'd happily pay over £200 for a new card in my main PC, of course. For now, the ATi 6670 offers the best price/performance in the range I'm looking at. I do love the design of that EVGA card though... :p
     
  18. ssj12

    ssj12 Minimodder

    Joined:
    12 Sep 2007
    Posts:
    689
    Likes Received:
    3
    You do know that there are still a ton of gamers not playing at 1080p yet, right? Only 28% of Steam users report using 1080p, and barely anyone uses higher for a single monitor. I have no idea how many people use multi-monitor, but I doubt it's half as many as single monitor.

    http://store.steampowered.com/hwsurvey/
     
  19. Adnoctum

    Adnoctum Kill_All_Humans

    Joined:
    27 Apr 2008
    Posts:
    486
    Likes Received:
    31
    It would look really odd for nVidia not to release a card they had previously announced, even if it is overpriced and uncompetitive. Not that this matters too much; it is better to have something at the price-point than nothing. By having something there, nVidia muddies any purchasing decision and prompts customers to price-creep up to a GTX 660.

    To be frank, the manner in which nVidia has released the card (it just arrives without much attention being drawn to it, overshadowed by the heavily pushed GTX 660 cards), combined with the lack of distinctive and custom SKUs available or publicly announced and the uncompetitive pricing, tells me that neither nVidia nor their partners care very much about putting effort into the GTX 650. Most will probably end up in some OEM "gaming" machine.

    Never underestimate the irrationality of slavish devotion to a brand, or the successful application of marketing to overcome a testable reality.
    The reality is that this card is overpriced for the performance offered, and there are no attractive or distinct SKUs. All the cards available or publicly planned (and I've been watching) are essentially the same reference board, with dual-slot cooling and an external power connector, meaning there is little reason to choose a GTX 650 over a higher-performing HD 7770 at the same price or cheaper.

    And if anyone trots out the tired old canard of "Better nVidia drivers" I think I'll have to scream, rip off my clothes and run down the street waving a tired 9800GT in naked annoyance and frustration.
     
  20. fdbh96

    fdbh96 What's a Dremel?

    Joined:
    29 May 2011
    Posts:
    1,894
    Likes Received:
    33
    I wasn't saying to test all the cards at lower settings. But maybe in this case compare the 7770 vs the 650 on medium settings, for example. It would only need to be in one game, really.
     