
Graphics: Nvidia’s GTX 970 has a rather serious memory allocation bug

Discussion in 'Hardware' started by lancer778544, 23 Jan 2015.

  1. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    I grabbed that OCUK 970 (ITX build, needed a blower that wasn't complete pants) because the extra £120-odd for the 980 wasn't worth the slight increase in performance. It still isn't; the FCAT benchmarks run at UHD on the 970's release, versus the 980, still hold today.
     
  2. bawjaws

    bawjaws Multimodder

    Joined:
    5 Dec 2010
    Posts:
    4,282
    Likes Received:
    887
    Here's a question for you - has the performance of your cards changed since this issue came to light? Were you happy with them beforehand? What have you lost since this issue was discovered? The way some people are carrying on, you'd think their 970s had gone overnight from great cards to expensive paperweights.

    If you bought cards based on the fine detail of the specs, rather than the actual performance of the cards, then that's pretty bizarre (although quite typical of the PC industry in that it focuses so much on features rather than benefits). On the other hand, if you bought the cards because they were assessed to perform at a certain level then the cards still perform as tested and reviewed.
     
  3. N17 dizzi

    N17 dizzi Multimodder

    Joined:
    23 Mar 2011
    Posts:
    3,234
    Likes Received:
    356
    To answer your question: I have spent most of the time running one 970, and so with much lowered graphical settings @ 3840 x 2160. I have not seen any problems thus far, but equally I have not been running games such as Far Cry 4 or SoM at very high or ultra. In fact, I have not been gaming much at all over the holiday period. My second card is due back tomorrow.

    I will be the first to celebrate if this is a non-issue, but Anandtech et al suggest it is possibly going to be a problem if the driver team do not optimize properly for games that regularly use 4GB of VRAM. Had I known this, I likely would have invested in a 980 or two.
     
  4. Nexxo

    Nexxo * Prefab Sprout – The King of Rock 'n' Roll

    Joined:
    23 Oct 2001
    Posts:
    34,731
    Likes Received:
    2,210
    Well, if performance in highly demanding games is that important to you, you were silly not to buy the top end card in the first place.
     
  5. bawjaws

    bawjaws Multimodder

    Joined:
    5 Dec 2010
    Posts:
    4,282
    Likes Received:
    887
    So have you already been running SLI 970s? If so, was the performance acceptable? There's going to be zero difference whether this issue was known or not, so the cards will still be exactly as good as they were last week.

    On the other hand, if the two 970s weren't cutting the mustard previously, then the fact that they aren't good enough now isn't down to nVidia keeping some dastardly secret from you, it's because the cards (as tested, benchmarked and reviewed previously) aren't sufficiently powerful for your requirements.

    Basically, the performance of these cards hasn't changed, and is still consistent with the benchmarks and reviews previously published. If that performance isn't good enough for your purposes then you bought the wrong cards, as Nexxo says.
     
  6. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,993
    Likes Received:
    711
    But from another point of view, I can see why people are getting upset. They bought a 4GB card expecting all 4GB at maximum memory bandwidth to cover their use-case. They feel they have been misled.

    It's like paying for a 16GB mobile phone only to find out you can only get 12GB of usable storage..... oh wait!



    Personally, I'd be happier to find an option in the driver to disable the slower 0.5GB of VRAM, so games get 3.5GB of VRAM reported back and can try to optimise themselves. To be honest, I managed with a 1.5GB GTX 580 before; I'd be much happier with a 3.5GB 970 that doesn't rely on driver optimisation.

    Ah well, I'll just wait for my $10 compensation :p. Considering I paid £251 for the card and then sold the game voucher for £21, I don't feel cheated. If I had paid £260 without a game voucher, like they cost now, I'd be slightly pissed.
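
    For what it's worth, the "report 3.5GB" idea hinges on games reading whatever single VRAM figure the driver exposes. Here's a minimal sketch of that query on Windows via DXGI, just to show where a game gets the number from (nothing in it reveals the fast/slow split):

    // Minimal DXGI query for the VRAM total a game would see on Windows.
    // Build with: cl /EHsc vram.cpp dxgi.lib
    #include <dxgi.h>
    #include <cstdio>

    int main() {
        IDXGIFactory1* factory = nullptr;
        if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
            return 1;

        IDXGIAdapter1* adapter = nullptr;
        for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
            DXGI_ADAPTER_DESC1 desc;
            adapter->GetDesc1(&desc);
            // DedicatedVideoMemory is the one headline number reported per adapter;
            // on a 970 it simply says "4GB" with no hint that the last 0.5GB is slower.
            printf("%ls: %zu MB dedicated VRAM\n",
                   desc.Description, desc.DedicatedVideoMemory / (1024 * 1024));
            adapter->Release();
        }
        factory->Release();
        return 0;
    }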
     
  7. heir flick

    heir flick Minimodder

    Joined:
    2 Feb 2007
    Posts:
    1,049
    Likes Received:
    14
    I was thinking the same. From what I've read, when VRAM usage goes past 3.5GB and starts to use the slower 500MB, you get stuttering. Now, if the gimped 500MB was disabled then sure, frame rates could drop, but at least it won't stutter.

    This is pure guesswork, by the way.
     
  8. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    Disabling the final 500MB would only make things worse. Anything beyond 3.5GB would, instead of sitting in the 500MB on-card portion of VRAM, now be paging from system RAM over the PCIe bus.
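
    Rough numbers make the point (these are the commonly quoted figures for the 970's two segments and for PCIe 3.0 x16, so treat them as approximate):

    // Back-of-the-envelope bandwidth comparison, using commonly quoted figures.
    #include <cstdio>

    int main() {
        const double fast_segment = 196.0;  // GB/s: 3.5GB segment, 7 of 8 32-bit controllers
        const double slow_segment = 28.0;   // GB/s: last 0.5GB, a single 32-bit controller
        const double pcie3_x16    = 15.75;  // GB/s: PCIe 3.0 x16 theoretical peak, less in practice

        printf("fast segment vs slow segment: %.1fx\n", fast_segment / slow_segment); // ~7.0x
        printf("slow segment vs PCIe bus:     %.1fx\n", slow_segment / pcie3_x16);    // ~1.8x
        return 0;
    }

    So even the crippled segment is still roughly twice the theoretical bandwidth of the bus you'd be paging over, before you count the latency of a round trip to system RAM.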
     
  9. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,993
    Likes Received:
    711
    Ok, I worded it badly.

    I want a way in nV control panel to allow me to choose whether the driver reports 3.5GB or 4GB to games.

    This way the game will be in a position to accurately allocate resources, rather than trying to use the full 4GB while the driver does its guesswork. Of course, over the last few days we've seen that some games, like Watch_Dogs, already do this. But what if, in 2018, developers no longer see a need to test on old cards? That leaves us at the mercy of driver optimisation if we are lucky, or pure guesswork if we are not.
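
    To illustrate what I mean by "accurately allocate resources": engines typically pick a streaming/texture budget from whatever total the driver reports, something along these lines (the tiers and cut-offs here are made up purely for illustration):

    // Toy budget picker: the sort of decision an engine makes from the reported
    // VRAM total. The tier thresholds are invented purely for illustration.
    #include <cstdio>
    #include <cstddef>

    const char* pick_texture_tier(std::size_t reported_vram_mb) {
        if (reported_vram_mb >= 4096) return "ultra";   // assumes all 4GB is full speed
        if (reported_vram_mb >= 3584) return "high";    // a 3.5GB report lands here instead
        if (reported_vram_mb >= 2048) return "medium";
        return "low";
    }

    int main() {
        printf("4096 MB reported -> %s textures\n", pick_texture_tier(4096));
        printf("3584 MB reported -> %s textures\n", pick_texture_tier(3584));
        return 0;
    }

    A 3.5GB report would push the game down a tier by itself, which is exactly the self-limiting behaviour I'd want, instead of the driver juggling what lands in the slow segment.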
     
  10. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    Pure grade-A baloney.

    Gaming at 4K isn't some mystical thing. Pixel-based operations (shaders, final readout) scale with resolution, but other operations (physics, textures, etc.) do not. Whether a given GPU can game at 4K is entirely dependent on the game used and the settings used in that game. For example, an old GTX 460 768MB can happily play HL2 at 4K at well above 60fps with all settings turned up to max.

    Whether you can game at 4K is 100% down to the settings you use.
    You want to play the latest grey-guys-fight-other-grey-guys-in-a-brown-world shooter with all settings turned up to ultra at 4K? Yeah, a pair of high-end cards may be needed (and it'll still run terribly due to scaling issues from a short development cycle focussed on optimising for lower-end GPUs). At more reasonable settings (say, shaders on 'very high' rather than 'ultra'), 4K will run smoothly on high-end multi-GPU setups, but will run just fine on upper-mid single-GPU setups as well.

    There are numerous websites that have tested the 970 at 4K (with and without FCAT), and it easily trades blows with the 980, 780 and 780ti depending on overclocking and specific game settings.
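
    To put rough numbers on the "pixel-based operations scale, others don't" point above: the per-pixel buffers roughly quadruple going from 1080p to 4K, but they're small next to the texture pool, which doesn't grow with output resolution at all. This is simple arithmetic, not a measurement of any particular game:

    // Rough arithmetic: how much VRAM actually scales with output resolution.
    #include <cstdio>

    double buffer_mb(int width, int height, int bytes_per_pixel) {
        return double(width) * height * bytes_per_pixel / (1024.0 * 1024.0);
    }

    int main() {
        // One 32-bit colour target plus a 32-bit depth/stencil buffer at each resolution.
        double fhd = buffer_mb(1920, 1080, 4) + buffer_mb(1920, 1080, 4);
        double uhd = buffer_mb(3840, 2160, 4) + buffer_mb(3840, 2160, 4);
        printf("1080p render targets: ~%.0f MB\n", fhd);  // ~16 MB
        printf("4K render targets:    ~%.0f MB\n", uhd);  // ~63 MB
        // Textures, geometry and physics data stay the same size regardless of
        // output resolution, which is why settings decide whether 4K fits in VRAM.
        return 0;
    }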
     
  11. Hustler

    Hustler Minimodder

    Joined:
    8 Aug 2005
    Posts:
    1,039
    Likes Received:
    41
    My advice to anyone: if you see some idiots offloading 970s at silly prices because of all this nonsense, bite their hands off and grab them, because take it from me, they're awesome all the way up to and including 4K.
     
  12. Parge

    Parge the worst Super Moderator

    Joined:
    16 Jul 2010
    Posts:
    13,022
    Likes Received:
    618
    Complaining about this issue is absolute madness! 4K benchmarks were in every review for this card (including Bit-Tech's own). If people didn't like them then, they shouldn't have bought the cards in the first place!

    NONE OF THIS CHANGES THE PERFORMANCE OF THE CARD FROM THE ORIGINAL BENCHMARKS WHICH YOU SHOULD HAVE USED TO DECIDE TO BUY THE CARD OR NOT*

    *this is so painful I'm using one of my three per annum allowance of capital letters.
     
  13. Kronos

    Kronos Multimodder

    Joined:
    6 Nov 2009
    Posts:
    13,495
    Likes Received:
    618
    Well I cannot bite the hand due to the lack of teeth, but I will certainly give it a bloody good suck. I am looking to buy in a couple of weeks so bring them on. Still cannot decide on which one though.
     
  14. Harlequin

    Harlequin Modder

    Joined:
    4 Jun 2004
    Posts:
    7,131
    Likes Received:
    194
    I have to laugh - so many calling a £300 GTX 970 'mid-range'.
     
  15. Cei

    Cei pew pew pew

    Joined:
    22 Mar 2008
    Posts:
    4,714
    Likes Received:
    122
    Eh, blame NVIDIA and AMD for that. They shifted high end to the £350+ category a few generations back.
     
  16. Pookeyhead

    Pookeyhead It's big, and it's clever.

    Joined:
    30 Jan 2004
    Posts:
    10,961
    Likes Received:
    561
    I'm happy with its performance NOW... yes, it's brilliant, but I'm only running 1600p. Hardly anything I run gets close to 4GB. 4K is a different matter; I've no idea how it performs there - it's not happened yet.

    I didn't know that the last 512MB of VRAM was 7 times slower, though.

    I'm stuck with it, so I hope you're right. Had I known the above, I may have just waited a little longer and got a 980 to rule out any doubt. The fact is, I didn't know, and no one else did either... except NVidia.
     
  17. law99

    law99 Custom User Title

    Joined:
    24 Sep 2009
    Posts:
    2,390
    Likes Received:
    63
    I thought this was an issue when it seemed possible that you couldn't use more than 3.5GB at all, but now, after seeing the response, reading the main sites and revisiting tests, I can't see what the fuss is all about. Particularly after going over to Tom's Hardware, who tested Shadow of Mordor recently.

    Also Cei makes an excellent point. The real issue that needs to be addressed isn't even related to the 3.5GB + .5GB fiasco. It is the fact that we plainly let both AMD & Nvidia get away with implying that their dual GPU cards have twice as much memory when we know exactly what they intend to convey to those not in the know. It's a genuine play to pull the wool over some people's eyes.

    Edit: Although if I was in Pookeyhead's shoes targeting a 4k build, it would be a limitation I'd rather not live with.
     
  18. Harlequin

    Harlequin Modder

    Joined:
    4 Jun 2004
    Posts:
    7,131
    Likes Received:
    194
    I'm sure NVidia love you - as you also call the GTX 680 'mid-range' by your number crunching!
     
  19. Pookeyhead

    Pookeyhead It's big, and it's clever.

    Joined:
    30 Jan 2004
    Posts:
    10,961
    Likes Received:
    561
    That's my precise point. I'd have liked to have known, and made an objective choice based on real facts. I was denied that opportunity. That is starting to bother me.
     
  20. Nexxo

    Nexxo * Prefab Sprout – The King of Rock 'n' Roll

    Joined:
    23 Oct 2001
    Posts:
    34,731
    Likes Received:
    2,210
    If your aim was 4K gameplay, you should have bought the top-end card.
     
