
Nvidia’s GTX 970 has a rather serious memory allocation bug

Discussion in 'Hardware' started by lancer778544, 23 Jan 2015.

  1. Pookeyhead

    Pookeyhead It's big, and it's clever.

    Joined:
    30 Jan 2004
    Posts:
    10,961
    Likes Received:
    561


    Not really, Nexxo, as I planned to go SLI when I upgrade to 4K. Just as 2x 670s were a great choice back when I bought them to run 1600p, 2x 970s in SLI seemed like a great choice now. Still does, actually... I would just like to have unimpeded access to as much VRAM as possible... like, say... 4GB :)
     
    Last edited: 27 Jan 2015
  2. theshadow2001

    theshadow2001 [DELETE] means [DELETE]

    Joined:
    3 May 2012
    Posts:
    5,284
    Likes Received:
    183
    Having not read the thread (how many forum posts have started with that and ended well? :worried:)

    I think it's pretty awesome how Nvidia have actually engineered the 970. It would essentially be a 3.5GB card, but their architecture allows for half a gig of extra memory. Whilst it is slower than the other 3.5GB, it's still faster than caching to system RAM. It's a significant bit of engineering that has suffered at the hands of a marketing cluster****.

    Ultimately the card performs better than if they had engineered it using the previous architecture, which would have meant a straight 3.5GB card and no bit of cachey goodness.
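
    For anyone who wants to see the split for themselves, something along these lines should do it. This is only a rough CUDA sketch in the spirit of the allocation benchmarks doing the rounds, not the real thing: the 128MB chunk size and launch dimensions are arbitrary choices of mine, and the driver doesn't guarantee which chunks end up in the slow segment.

    [CODE]
// Rough sketch (not Nai's actual benchmark): grab VRAM in 128MB chunks,
// then time a kernel writing through each chunk. On a 970 the last few
// chunks should land in the slow 512MB segment and report lower bandwidth.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__global__ void touch(float *p, size_t n)
{
    size_t stride = (size_t)gridDim.x * blockDim.x;
    for (size_t i = blockIdx.x * blockDim.x + threadIdx.x; i < n; i += stride)
        p[i] = (float)i;   // write every element so the chunk is really exercised
}

int main()
{
    const size_t chunkBytes = 128u << 20;            // 128 MiB per chunk
    const size_t n = chunkBytes / sizeof(float);
    std::vector<float*> chunks;
    float *p = nullptr;

    // Keep allocating until the driver refuses; the display and other apps
    // already hold some VRAM, so expect fewer than 32 chunks on a 4GB card.
    while (cudaMalloc(&p, chunkBytes) == cudaSuccess)
        chunks.push_back(p);
    printf("got %zu chunks of 128 MiB\n", chunks.size());

    if (!chunks.empty()) {
        touch<<<256, 256>>>(chunks[0], n);           // warm-up launch
        cudaDeviceSynchronize();
    }
    for (size_t c = 0; c < chunks.size(); ++c) {
        cudaEvent_t t0, t1;
        cudaEventCreate(&t0);
        cudaEventCreate(&t1);
        cudaEventRecord(t0);
        touch<<<256, 256>>>(chunks[c], n);
        cudaEventRecord(t1);
        cudaEventSynchronize(t1);
        float ms = 0.0f;
        cudaEventElapsedTime(&ms, t0, t1);
        printf("chunk %2zu: %7.3f ms  (%.1f GB/s)\n",
               c, ms, (chunkBytes / 1e9) / (ms / 1e3));
        cudaEventDestroy(t0);
        cudaEventDestroy(t1);
    }

    for (float *q : chunks)
        cudaFree(q);
    return 0;
}
    [/CODE]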

    I do think it was slightly, sort of, mis-sold. Sort of. My basis for that is looking at the CPU market, where L1 through L3 caches are clearly marked as such. However, people seem quite happy for dual-GPU cards to have twice the memory printed on the side of the box than is actually usable. I mean, that is an egregious level of mis-selling in my opinion. Much more so than what is going on here.
     
  3. Pookeyhead

    Pookeyhead It's big, and it's clever.

    Joined:
    30 Jan 2004
    Posts:
    10,961
    Likes Received:
    561
    It's that slightly mis-sold aspect that just rubs me up the wrong way. Up to this point, I'm delighted with the card... in fact, it's amazing. I just wish I knew for certain how it will behave when partnered with another and running 4K with its RAM maxed out.

    I'm also slightly troubled by the fact that some games don't seem able to use more than 3.5GB in some of the videos I've seen.
     
  4. Nexxo

    Nexxo * Prefab Sprout – The King of Rock 'n' Roll

    Joined:
    23 Oct 2001
    Posts:
    34,731
    Likes Received:
    2,210
    Well, the way I see it, you bought the card not knowing for certain, and you still don't know. Same difference.
     
  5. theshadow2001

    theshadow2001 [DELETE] means [DELETE]

    Joined:
    3 May 2012
    Posts:
    5,284
    Likes Received:
    183
    I can understand that. The question really is: would you have bought it knowing it was mostly a 3.5GB card? It's more a rhetorical question. You've been influenced by the current situation, so you could never really say for certain.
     
  6. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,577
    Likes Received:
    196
    I'm not too bothered about it. Sure, it will run up to an allocation of 3.5GB of VRAM, but as of right now very few things even touch 3.5GB worth of VRAM. I'm just a little disappointed, since I wanted that extra 1GB buffer as opposed to the 512MB gain I'm getting over my GTX 780. At any rate, at 1920x1200 it's still an absolutely cracking card.
     
  7. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    If you're worried about the driver not correctly divvying up the vRAM, then you should be running away screaming before even considering SLI. Scaling between two physically separate GPUs that talk to each other over two separate buses (the SLI bridge and the PCI-E bus) requires an order of magnitude more jiggery-pokery with how tasks are scheduled between cards than does managing a single memory pool.

    Worrying about whether the drivers are assigning data to vRAM correctly when running SLI is like worrying over whether your engine timing belt setup is optimal when running a pair of motorcycles gaffa-taped together.
     
  8. Pete J

    Pete J Employed scum

    Joined:
    28 Sep 2009
    Posts:
    7,247
    Likes Received:
    1,805
    'Only', eh? ;)

    I have to say I find those who think the 970 is only meant for running 1080p a little perplexing. The 970 is equivalent to a 780, which definitely isn't meant just to run things at 1080p (at least for the moment).

    I've yet to see what I would consider a proper investigation into the matter, but from what I have seen so far, using that final 500MB doesn't influence performance. I think someone did the test on a 660 Ti and found something similar? Well, I've had three of them maxing out their VRAM in the past and it didn't make the slightest difference to my eye.

    A final note for Pookey - based on what I've experienced, I wouldn't worry as I believe you'll run out of GPU power at 4K even with SLI'd 970s before you hit the VRAM limit.
     
  9. Pookeyhead

    Pookeyhead It's big, and it's clever.

    Joined:
    30 Jan 2004
    Posts:
    10,961
    Likes Received:
    561
    You miss my point. It's not really a 4GB card, or at least not in the way I thought it was. What I can be certain of, though, is that running 4K is going to be VRAM intensive for a great many titles, and GIVEN THE CHOICE I'd have probably opted to NOT buy a card with 3.5GB of VRAM and some bolted-on slower memory that shares a controller when I could have bought a card with unimpeded access to all 4GB at full speed. The fact is though... that's exactly what I, and everyone else, thought we were buying: a 4GB card with no compromises to that 4GB of VRAM.

    This argument of yours is like selling someone 350g of chocolate when they ordered 400g, and making up the bulk with something else and then saying, "well... you enjoyed it didn't you? Still tasted as good as you thought it was going to didn't it?"




    No. In all honesty, if I could have chosen between the "3.5GB + slow 512MB" 970 and a 980, I'd have bought the 980. This is my point. Had I had all the facts, and had hardware reviewers had all the facts, I'd have been able to make a choice. As it happened, I wasn't able to make a choice, because no one was given the facts.

    How come people seem to think it's OK to withhold relatively important information about products all of a sudden? It's all very well people saying "Well, you didn't buy it because of how many ROPs it has" and other nonsense, but the fact is many do, and many look at details like this...

    [IMG: published GTX 970 specification table]

    ...that sites like bit-tech always publish. That information was wrong. Had it said 56 ROPs, 3.5GB of fast GDDR5 and 512MB of reduced-bandwidth GDDR5, then anyone considering 4K use WOULD have seriously weighed the 970 against the 980 in an entirely different way, and what's more, you know damned well that's the case. There's far too much pedantry in this thread from people who haven't actually even bought the ****ing card and should quite frankly know better, just for the sake of arguing.
     
    Last edited: 28 Jan 2015
  10. heir flick

    heir flick Minimodder

    Joined:
    2 Feb 2007
    Posts:
    1,049
    Likes Received:
    14
    I have to agree with Pookeyhead: whilst I have had no problems with 970 SLI at 1440p, if I had known about the memory then I would have gone for a 980 and added another later.

    I believed I was buying a cut-down version of a 980 with the same memory; maybe I was being stupid, or just naive.
     
  11. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    No you're not, plenty of people will take it off your hands if the price is right. ;)

    It's more than awesome if you ask me. If it wasn't for the ability to segment the GPU in the way they have, it would've been much worse than just having a 3.5GB card: AFAIK it would have meant disabling an entire bank of ROPs, so it would have been a 3GB card with 1.5MB of L2 cache and a 192-bit memory bus.
     
  12. Nexxo

    Nexxo * Prefab Sprout – The King of Rock 'n' Roll

    Joined:
    23 Oct 2001
    Posts:
    34,731
    Likes Received:
    2,210
    You were making a subjective prediction of how a card will run in future 4K scenarios based on a subjective notion of how 4GB of VRAM is going to affect it. You still are. It's all still worry over what you think might happen. But if future-proofing of your card was important, you should really have bought the GTX 980. You didn't, because (understandably) you didn't want to spend that kind of money. You made a compromise on price-performance, and here it is, possibly.

    I don't think that the information was deliberately withheld. We're not saying it's OK; we're saying that we think it is not as big a deal as people make it out to be.

    Of course it would have affected people's choices to know the correct numbers, because people (including you and me) are like that: they look at a list of numbers and translate that as "more = better" (or "less = better", as the case may be). They make highly subjective judgements of what these numbers mean IRL. Knowing the correct numbers doesn't mean that they would have made a more objective, informed or valid choice, but they feel that they would have, and conversely they now feel they haven't. But I'm saying that just because it feels that way, that doesn't mean it's true.
     
    Last edited: 28 Jan 2015
  13. Parge

    Parge the worst Super Moderator

    Joined:
    16 Jul 2010
    Posts:
    13,022
    Likes Received:
    618
    What you are saying here is that you base a purchasing decision on deep-level architectural design and not on how the card actually performs in benchmarks... which is insane.

    No one should really care how Nvidia builds their cards - none of us understand the underlying architecture in any meaningful way anyway. Even professional reviewers didn't spot this. All we have to go on is benchmarks - which tell us how fast/hot/power hungry a card is, and then we make a buying decision based on this.

    I think just about every review of the 970 featured 4K benchmarks. If you didn't like the performance then, you shouldn't have 'cheaped out'; you should have spent the extra £200 on the 980. If you did like the performance then, why do you care how Nvidia reach that level of performance?

    And also (as if you aren't already angry enough by now), I didn't know there was a rule about only commenting on hardware that you own?
     
    Last edited: 28 Jan 2015
  14. SuperHans123

    SuperHans123 Multimodder

    Joined:
    27 Dec 2013
    Posts:
    2,143
    Likes Received:
    391
    I have an MSI GTX 970 bought on launch day.
    Eats anything I throw at it, including Metro Last Light, which is a performance ****.

    You have to ask yourself why you buy these things in the first place.
    Is it to play games without slowdown and with all the eye candy, or is it to read endless stats about ROPs and blibs and blobs?
     
  15. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,994
    Likes Received:
    713
    Actually, it's not so insane. GPUs are actually quite simple. The numbers tell us a lot about a card's capabilities compared to another card on a similar architecture. GPUs scale almost linearly, so doubling the pipelines pretty much guarantees double the performance, resource limitations aside.

    Here is the problem: on the published numbers, the 970 doesn't look resource-limited at all. Therefore the 970 should perform at 80% of a 980, in line with its 80% share of the pipelines.
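
    To put numbers on that 80% (my arithmetic, using the published core counts and ignoring clock differences):

    [CODE]
GTX 980: 16 SMMs x 128 cores = 2048 CUDA cores
GTX 970: 13 SMMs x 128 cores = 1664 CUDA cores

1664 / 2048 = 0.8125  ->  ~81% of a 980's shader throughput, clocks being equal
    [/CODE]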



    On the topic of future-proofing, my main concern is driver and game optimisation. True, the card performs well at the moment and will do in the coming years. But will it perform just as well compared to the 980 (80% is the figure I'm expecting) when it is two generations behind and some developers no longer test for it? (Ubisoft?)

    "Future-proofers should have purchased the 980" doesn't really make sense. Within the same architecture (the same die, in this case) the performance difference should be consistent. So I paid for 80% of a 980, and I'd still expect 80% of a 980 ten years on, even without driver and game optimisations. Unfortunately, I fear this won't be true.



    Not that I'm unhappy with the card (at the moment); I just wanted to point out why it can be disappointing for some. It's not a non-issue.
     
  16. perplekks45

    perplekks45 LIKE AN ANIMAL!

    Joined:
    9 May 2004
    Posts:
    7,552
    Likes Received:
    1,791
    Because cards in that price bracket are intended to be used for more than two years, right?

    If you spend upwards of 250 (USD, GBP or Euros) you will most likely spend the same amount again within the next two to three years. Why would Nvidia or AMD care about that card's performance in two or three generations' time?
     
  17. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,130
    Likes Received:
    6,717
    Ten years on, the card won't work with any newly released games, nor be included in driver updates, so performance will be moot. The GeForce 400 series was launched in 2010; it went EOL and was removed from Nvidia's driver programme in 2014.
     
  18. David

    David μoʍ ɼouმ qᴉq λon ƨbԍuq ϝʁλᴉuმ ϝo ʁԍɑq ϝμᴉƨ

    Joined:
    7 Apr 2009
    Posts:
    17,447
    Likes Received:
    5,851
    Because their customers do. I'd expect the resale value of my current card to offset part of the cost of my upgrade. If the view is that the card will be effectively worthless in two years, I wouldn't invest in one. Nor would, I imagine, quite a few others.
     
  19. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,994
    Likes Received:
    713
    That illustrates exactly my point: driver support will end sooner rather than later.

    Try to play a new game after driver support ends: will the 970 still perform at 80% of a 980? Will the 970 still perform at the same level relative to the 980 as it benchmarks at today?
     
  20. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    The handling of the segmented RAM has nothing to do with how a game is programmed.

    If a game requests X amount of RAM, it's down to the OS (DirectX) and the drivers how that request is handled. Some games may request all available resources for caching purposes, but they would be told that 3.5GB is available; if that runs out, the drivers and OS release other, lower-priority RAM for its use.
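
    As a rough illustration from the application side (a minimal CUDA sketch of the general pattern; nothing to do with how any particular game or DirectX does it), the sensible approach is to ask the driver what's free and size your cache from the answer, rather than from the number on the box:

    [CODE]
// Minimal sketch: ask the driver what it will actually give us, rather
// than assuming the number printed on the box. cudaMemGetInfo is a real
// CUDA call; the 90% headroom factor is an arbitrary choice of mine.
#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    size_t freeBytes = 0, totalBytes = 0;
    if (cudaMemGetInfo(&freeBytes, &totalBytes) != cudaSuccess) {
        fprintf(stderr, "query failed\n");
        return 1;
    }
    printf("driver reports %zu MiB free of %zu MiB total\n",
           freeBytes >> 20, totalBytes >> 20);

    // Size a cache to what the driver says is free, not to the spec sheet.
    size_t cacheBytes = (size_t)(freeBytes * 0.9);
    void *cache = nullptr;
    if (cudaMalloc(&cache, cacheBytes) == cudaSuccess) {
        printf("allocated %zu MiB for caching\n", cacheBytes >> 20);
        cudaFree(cache);
    }
    return 0;
}
    [/CODE]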

    Seriously? You're worried about whether a card will perform at 80% of its more expensive brethren in ten years' time.
    I think you're going to have more to worry about playing a modern game on a ten-year-old card than a few percentage points of performance.
     
    Last edited: 28 Jan 2015
