
Hardware Overclocking Nvidia’s GeForce GTX 460

Discussion in 'Article Discussion' started by Sifter3000, 21 Jul 2010.

  1. Sifter3000

    Sifter3000 I used to be somebody

    Joined:
    11 Jul 2006
    Posts:
    1,766
    Likes Received:
    26
  2. Deact

    Deact New Member

    Joined:
    23 Sep 2009
    Posts:
    54
    Likes Received:
    4
    Interesting article that shows Fermi wasn't a complete bust.

    By the way, for the GTX 460 768MB you quote two separate prices: £150 on the graph and £175 in the conclusion.
     
  3. Abdul Hadi

    Abdul Hadi Technically, I wanna be tim!

    Joined:
    6 Aug 2009
    Posts:
    45
    Likes Received:
    0
    I would have to say.................. Wwwwwwwwwooooooooooowwwww.

    Thermal and power graphs would have rounded out the demonstration, though.
     
  4. Ph4ZeD

    Ph4ZeD New Member

    Joined:
    22 Jul 2009
    Posts:
    3,806
    Likes Received:
    143
    A bit brief, but it illustrates the point well - these things can overclock like a beast. Quick question: if you have two in SLI, how would you overclock them both?
     
  5. Bad_cancer

    Bad_cancer Mauritius? 2nd speck east of africa

    Joined:
    7 Apr 2009
    Posts:
    708
    Likes Received:
    12
    If I had to guess, I would say that the software caters to that kind of thing.
    I use EVGA Precision and it allows you to OC both cards simultaneously.
     
  6. mull

    mull New Member

    Joined:
    22 Sep 2006
    Posts:
    24
    Likes Received:
    0
    Really glad to see the graphics market finally getting some competition at the £200 mark - these cards look like beasts when overclocked!
     
  7. xaser04

    xaser04 Ba Ba Ba BANANA!

    Joined:
    27 Jun 2008
    Posts:
    1,976
    Likes Received:
    101
    Thanks for the article guys!

    A small bit of constructive criticism, if I may:

    Whilst I understand this article was more about overclocking than the actual gameplay experience, it would have been useful to at least bench both overclocked cards across your usual spectrum of games.

    This way we could see whether the extra performance boost applies across the board or is limited to specific scenarios.

    Perhaps, given the HD 5850's drop in price to just under £200, you could put together an article showing what all of these cards are capable of when overclocked. This would answer the question of whether an overclocked HD 5850 is better than an overclocked GTX 460.
     
  8. DbD

    DbD Member

    Joined:
    13 Dec 2007
    Posts:
    477
    Likes Received:
    10
    I almost suspect Nvidia set the core clock lower than they needed to, to stop it making the existing GTX 465 look stupid.

    Then there's the fact that none of the current cards even uses all of the chip's 384 shaders - I think Nvidia's problem is that GF104 is so much better than GF100 that it would basically stop sales of the GTX 470 and below if they'd shown the chip's full potential and released a 384-shader card at a 750MHz clock.
     
  9. impar

    impar Well-Known Member

    Joined:
    24 Nov 2006
    Posts:
    3,099
    Likes Received:
    41
    Greetings!
    Yep.
    It's like writing an article about horses with just some of the horses doped, and in a 250m race.

    At the very least the GTX 465 and the HD 5850 (HD 5770 too?) should also be overclocked, and the resolution should be the current standard of 1920x1080.
    1680x1050 is so 2006.
     
  10. Pete J

    Pete J RIP Teelzebub

    Joined:
    28 Sep 2009
    Posts:
    5,310
    Likes Received:
    315
    If you use Afterburner, the overclock will be applied to both cards by default.
     
  11. Ph4ZeD

    Ph4ZeD New Member

    Joined:
    22 Jul 2009
    Posts:
    3,806
    Likes Received:
    143
    Cheers Pete. I'm thinking of ditching my 260s for a 460 SLI setup, and on the basis of the article I think I'd be a fool not to overclock them.
     
  12. kenco_uk

    kenco_uk I unsuccessfully then tried again

    Joined:
    28 Nov 2003
    Posts:
    9,696
    Likes Received:
    308
    Aye, I'm overclocking mine atm. Using Furmark, I thought that would mean it was stable - I also ran the Crysis benchmark tool and that ran fine. Things didn't go so well when running the benchmark built into Just Cause 2, though. I went from 880 core down to 850 core just so that JC2 didn't crash to a 'Close Program' dialogue - at least it didn't 'reset' the drivers, which usually requires a reboot, so I could quickly determine how far I could push it. It seems stuck at 850 core atm; no amount of voltage faffing makes it stable in JC2 much above that. Strange how I could get it to 880 core in Furmark with no artifacts or crashing. I did try it at 890 core and, whilst having a few windows open and downloading a file, I ran the Unigine benchmark, whereupon it BSOD'd and hosed my OS.

    One thing I found using MSI Afterburner: I can only set the fan manually to a maximum of 70% - anything above that and it resets to 40%. So I have it sat at 70%, which doesn't seem too loud and seems to keep the card cool enough, around 60°C-ish.

    Another thing I noticed: I had to rename the Just Cause 2 executable, as the Nvidia driver was picking it up and applying settings automatically through its profiles, causing it to crash within seconds.
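
    The trial-and-error routine described above (start optimistic, step the core clock down until every stress test survives) can be sketched roughly like this. This is purely illustrative - the function names and the 880/850 thresholds are stand-ins mirroring the post, not a real tuning tool; in practice each "test" is a manual run of Furmark, the Crysis tool, JC2 and so on:

    ```python
    # Hypothetical sketch: step the core clock down until every stress
    # test in the suite passes, or give up at a floor clock.
    def find_stable_clock(start_mhz, step_mhz, floor_mhz, stress_tests):
        """Return the first clock (stepping down) that passes all tests."""
        clock = start_mhz
        while clock >= floor_mhz:
            if all(test(clock) for test in stress_tests):
                return clock          # first clock that survives everything
            clock -= step_mhz
        return None                   # nothing stable above the floor

    # Stand-ins mimicking the behaviour in the post: Furmark is happy
    # up to 880MHz, but JC2 only up to 850MHz.
    furmark = lambda mhz: mhz <= 880
    jc2 = lambda mhz: mhz <= 850

    print(find_stable_clock(890, 10, 700, [furmark, jc2]))  # -> 850
    ```

    Note how a single fussy workload (JC2 here) sets the ceiling, which is why Furmark alone isn't proof of stability.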
     
  13. Hustler

    Hustler Well-Known Member

    Joined:
    8 Aug 2005
    Posts:
    1,009
    Likes Received:
    30
    Despite the recommendation to go for the 1GB version, we are still only talking about a 2fps difference on both min/max frame rates... can you really notice that whilst playing?

    I only game at 1280x1024, so for me it's the 768MB version.

    I have yet to see any game use more than 512MB of gfx card RAM with 4xAA at the resolution I play at, and I'm confident that, with the exception of GTAIV (which I don't even own), I'll never need more than a 512MB gfx card.

    Oh, and the price of these cards, especially the 1GB versions, seems to be creeping upwards as well... don't you just love rip-off retailers?
     
  14. Ljs

    Ljs Well-Known Member

    Joined:
    4 Sep 2009
    Posts:
    2,217
    Likes Received:
    113
    While I do agree to some extent, I think it's about future-proofing a little for the sake of £20 - some people only upgrade graphics cards every 3/4 years, and who knows what can happen in that time!
     
  15. Paradigm Shifter

    Paradigm Shifter de nihilo nihil fit

    Joined:
    10 May 2006
    Posts:
    2,069
    Likes Received:
    37
    I'm confused about JC2... I can get my GTX 460 SLI setup to 850MHz at stock volts, completely stable in Furmark/Kombustor, but load Just Cause 2 and it locks the system up after 15-20 seconds.

    It seems stable at 800MHz, however. I've not tried raising the Vcore.

    This is with Surround too, which hammers the GPUs a lot harder than a single monitor does.
     
  16. xaser04

    xaser04 Ba Ba Ba BANANA!

    Joined:
    27 Jun 2008
    Posts:
    1,976
    Likes Received:
    101
    Surround on GTX 460 SLI is something I am VERY interested in. Could you perhaps give a quick pointer on how the setup manages with JC2 and other modern games (if you play them)?
     
  17. isaac12345

    isaac12345 New Member

    Joined:
    20 Jul 2008
    Posts:
    427
    Likes Received:
    3
    More games, thermal and power consumption graphs would have been helpful.
     
  18. LeMaltor

    LeMaltor >^_^

    Joined:
    3 Oct 2003
    Posts:
    2,102
    Likes Received:
    25
    I'm thinking about one of these to play Mafia 2 on, it's very tempting >_<
     
  19. Teelzebub

    Teelzebub Up yours GOD,Whats best served cold

    Joined:
    27 Nov 2009
    Posts:
    15,796
    Likes Received:
    4,484
    I posted this a while back - just some info I found on another forum.

    Yesterday, NVIDIA released the GeForce Beta v258.69 drivers, which bring 3D Vision Surround and NVIDIA Surround to the market. 3D Vision Surround is a unique feature that offers stereoscopic 3D across three displays.

    To get it to work you need two GeForce GTX 260-or-above cards running in SLI (which of course means you need a motherboard that supports SLI, unless you are using a GTX 295), 2GB of system memory, and three identical monitors with the same resolution, refresh rate and sync polarity. This ensures the monitors will operate without any synchronisation issues. Currently, only Windows 7 is supported, and 3-way SLI is only supported for the GTX 400 series cards, but NVIDIA plans to offer 3-way SLI for the GTX 200 series in a future driver release (the current driver will run 2-way SLI for the GTX 200s). 3D Vision Surround cannot work in portrait mode, because the display and the 3D goggles have polarising filters that will not work in that orientation.

    For users who may not wish to spend money on three identical monitors to enjoy the 3D experience across multiple displays, NVIDIA Surround is the solution: it supports multiple displays (three, to be exact) without the 3D experience.

    While NVIDIA is behind AMD in bringing multi-display gaming to the market (AMD has had Eyefinity on the HD 5000 series cards for a few months already), it is certainly a step ahead of its competitor in combining 3D gaming with multiple displays. It will be interesting to see when AMD will bring such technology to the market.
     
  20. wuyanxu

    wuyanxu still wants Homeworld 3

    Joined:
    15 Aug 2007
    Posts:
    10,589
    Likes Received:
    231
    A £185 GTX 460 overclocked gives similar performance to an HD 5870!

    Great value - buy it, people!

    Hello, what have we here? Does this mean 20 portrait + 30 landscape + 20 portrait (true_gamer's config) might be possible?
     