Reviews Zotac Gaming GeForce RTX 2080 Ti Amp Review

Discussion in 'Article Discussion' started by bit-tech, 26 Sep 2018.

  1. bit-tech

    bit-tech Supreme Overlord Staff Administrator

    Joined:
    12 Mar 2001
    Posts:
    1,712
    Likes Received:
    32
     
  2. 23RO_UK

    23RO_UK Hasta Mañana

    Joined:
    4 May 2010
    Posts:
    4,126
    Likes Received:
    447
    A sub-standard product rushed to market by Zotac doesn't really bode well; their 10** and 9** series cards were of sterling quality. Not enough stock of Nvidia's overpriced PCBs to go around, I suspect...
     
  3. Mr_Mistoffelees

    Mr_Mistoffelees The Lunatic on the Grass.

    Joined:
    26 Aug 2014
    Posts:
    1,545
    Likes Received:
    207
    £1224... :duh:

    GTX 1070 Ti £417 a few weeks ago. :grin:
     
  4. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    2,834
    Likes Received:
    198
    From Derbauer's testing with the 2080 OC, you gain more performance from focussing purely on memory OC and leaving the power target and core speeds/voltages stock (and on the standard fan profile!) than from core OC with the power target maxed:
    [Image: Derbauer's 2080 OC scaling results]
    Power-mods and sub-ambient gain effectively naff-all at the cost of wasted power.

    Looks like the optimum OC strategy is to push the memory as hard as possible, then inch up the core speed from there.
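    That memory-first strategy is basically a two-phase search, which could be sketched like this. Everything here is illustrative: `is_stable` is a hypothetical stand-in for a real stress/artefact test, and the limits inside it are made-up numbers, not measurements from any card.

    ```python
    # Hypothetical stability check. On real hardware this would launch a
    # stress test at the given clock offsets; here it's a stand-in with
    # invented limits so the search logic itself is runnable.
    def is_stable(mem_offset, core_offset):
        # Assumed (fictional) envelope: memory tops out at +1000 MHz, and
        # core headroom shrinks slightly as the memory offset rises.
        return mem_offset <= 1000 and core_offset <= 100 - mem_offset // 20

    def find_oc(mem_step=50, core_step=15, mem_max=1500, core_max=300):
        """Memory-first search: max out the memory offset at stock power,
        voltage, and fan profile, then inch the core up from there."""
        mem = 0
        while mem + mem_step <= mem_max and is_stable(mem + mem_step, 0):
            mem += mem_step
        core = 0
        while core + core_step <= core_max and is_stable(mem, core + core_step):
            core += core_step
        return mem, core

    mem, core = find_oc()
    print(f"memory offset: +{mem} MHz, core offset: +{core} MHz")
    ```

    The point of the ordering is that the memory sweep happens with the core untouched, so any instability found in phase one is attributable to the memory alone.
    
    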
     
  5. Corky42

    Corky42 What did walle eat for breakfast?

    Joined:
    30 Oct 2012
    Posts:
    8,856
    Likes Received:
    258
    I've never understood why the majority of partner cards always seem to avoid OC'ing the memory, especially considering (afaik) partners source their own RAM modules and could bin them for higher clocks.
     
  6. Paradigm Shifter

    Paradigm Shifter de nihilo nihil fit

    Joined:
    10 May 2006
    Posts:
    2,049
    Likes Received:
    34
    I suspect it's partly laziness - it's easier just to up the core clock a few tens of MHz and call it done than to tweak the VRAM up and make sure it's still stable and not artefacting - and partly about making sure you can maintain a particular SKU's advertised clocks. It would be a bit embarrassing if a later batch of GDDR5(X) wasn't quite so good and you had to back off the default clocks. Not to mention potentially opening yourself up to a court case.

    I had a pair of 4GB GTX670s at one point - bought at the same time, from the same place, similar serial numbers... one could handle a fairly serious boost on the VRAM clocks, the other would get flaky after a (by comparison) minor bump in clocks. They could both cope with essentially the same core clock speed. I know, I know, anecdotal evidence, but I'm guessing that one chip in the "poor clocking" card was right on the edge of stability just above its rated clocks and any more just pushed it that little bit too far.

    I've long had mixed experiences overclocking GPUs. My 7900GTO would overclock like a trooper; same with an 8800GT. A pair of 460's hated being overclocked (but the coolers were terrible so I wasn't surprised). The aforementioned 670s were a mixed bag. I had a 970 which did OK, while the Titan X (Maxwell) in my sig hated being overclocked, but it spent its life mostly on compute loads, so I wasn't worried.
     
    Corky42 likes this.
  7. GeorgeStorm

    GeorgeStorm Aggressive PC Builder

    Joined:
    16 Dec 2008
    Posts:
    5,923
    Likes Received:
    218
    Not all cards (of the same model) come with memory from the same manufacturer - guessing that's down to changes in supply, prices, etc. And considering some memory acts very differently (some needing extra volts to clock further, others not), there's more potential variance, so it's easier to leave things at 'stock'.
     
    Paradigm Shifter likes this.
  8. Paradigm Shifter

    Paradigm Shifter de nihilo nihil fit

    Joined:
    10 May 2006
    Posts:
    2,049
    Likes Received:
    34
    A much simpler (and shorter) way of explaining my waffle above. :D
     
  9. GeorgeStorm

    GeorgeStorm Aggressive PC Builder

    Joined:
    16 Dec 2008
    Posts:
    5,923
    Likes Received:
    218
    Haha, well yours is slightly different, I got the feeling: you were pointing out it's another thing to monitor - even with the same memory chips, there's no guarantee they'll all clock the same, depending on how far they want to push them.
    My point was an extra potential issue :p
     
  10. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    2,834
    Likes Received:
    198
    Same reason they are fitted with massively oversized inefficient coolers covered in plastic greebles and glowing bits: consumers perceive bigger = better, regardless of actual testing. Moar GPU clock therefore mean moar better.
    All the consumer cards will be using Micron's GDDR6 at launch (Samsung are supplying GDDR6 for the Quadro RTX cards), though that will likely change over time.
    For manufacturer overclocking, it's the same situation as always with GPU memory (apart from HBM): chips need to be binned by the vendor after receipt. That can either be done on a dedicated rig (per-chip binning), or you just bung them all onto boards and post-test to pick the cards that happen to end up with the better chips, selling those as the higher SKUs (much cheaper than per-chip binning, but it 'wastes' some better chips on lower-SKU cards).
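    The trade-off between the two binning approaches can be shown with a toy simulation. All the numbers below are invented (chip quality as a normal distribution of max stable memory offsets, a board limited by its worst chip); the point is only the qualitative gap, not any real yields.

    ```python
    import random

    def simulate_binning(n_boards=1000, chips_per_board=8, seed=42):
        """Toy comparison of per-chip binning vs board-level post-testing.

        Chip 'quality' = max stable memory offset in MHz (invented
        distribution). A finished board can only clock as high as its
        worst chip allows.
        """
        rng = random.Random(seed)
        chips = [rng.gauss(500, 100) for _ in range(n_boards * chips_per_board)]

        # Per-chip binning: sort chips on a rig first, then build boards
        # from matched chips, so good chips are never dragged down.
        ranked = sorted(chips, reverse=True)
        binned = sorted(min(ranked[i * chips_per_board:(i + 1) * chips_per_board])
                        for i in range(n_boards))[::-1]

        # Post-testing: chips land on boards in arrival order (random),
        # then finished boards are tested and ranked.
        posttested = sorted(min(chips[i * chips_per_board:(i + 1) * chips_per_board])
                            for i in range(n_boards))[::-1]

        top = n_boards // 10  # top 10% of boards become the 'OC' SKU
        return (sum(binned[:top]) / top, sum(posttested[:top]) / top)

    binned, posttested = simulate_binning()
    print(f"avg stable offset of top 10% boards: "
          f"per-chip binning={binned:.0f} MHz, post-testing={posttested:.0f} MHz")
    ```

    Per-chip binning comes out ahead for the top SKU because random placement lets one mediocre chip cap an otherwise excellent board - which is exactly the 'wasted' good silicon mentioned above.
    
    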
     
  11. GeorgeStorm

    GeorgeStorm Aggressive PC Builder

    Joined:
    16 Dec 2008
    Posts:
    5,923
    Likes Received:
    218
    Fair enough, I was talking from a more general point with GPUs; I haven't paid any attention to the RTX launch.

    Still doesn't rule out there being variation over the coming months/years though no?
     
  12. Anfield

    Anfield Well-Known Member

    Joined:
    15 Jan 2010
    Posts:
    4,251
    Likes Received:
    252
    Certainly doesn't, but it is up to the AIB (or their OEM depending on which AIB we are talking about) to order the memory in the first place, so it wouldn't exactly be a surprise to them.
     