
Graphics Pascal and Polaris Discussion Thread

Discussion in 'Hardware' started by Parge, 25 Aug 2015.

  1. sharpethunder

    sharpethunder Minimodder

    Joined:
    25 Mar 2010
    Posts:
    156
    Likes Received:
    1
  2. Isitari

    Isitari Minimodder

    Joined:
    6 May 2009
    Posts:
    411
    Likes Received:
    90
  3. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    14,982
    Likes Received:
    3,746
    Well, being a Fury X Crossfire owner (quite probably a single Fury X owner in the coming days/weeks), I can safely say that, as expected, neither team has anything to offer me at all. And they won't until they release the replacements for what I'm using, but I get an eerie feeling that the £440 I paid for the more expensive of the two will seem like chump change.
     
  4. Guest-16

    Guest-16 Guest

    This is my concern too. I'll be gutted if there's nothing 980 Ti level for less money/less power. I'm stuck on a 960.
     
  5. rollo

    rollo Modder

    Joined:
    16 May 2008
    Posts:
    7,887
    Likes Received:
    131
    March next year is my guess for a 980 Ti replacement; longer if they do not feel the need to release such a model. £440 will be cheap for the 980 replacement - I'd expect closer to 980 Ti prices.
     
  6. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    14,982
    Likes Received:
    3,746
    You misunderstood me, dude. I paid £440 for the first of my Fury X cards about four days after it released, and £390 for the second one a couple of months ago.

    What I meant is that the replacement for my card(s) will be more expensive than ever, due to having 16GB of HBM2.

    The Radeon 490 and 1070/1080 will not be replacements for my cards. They will probably perform around the same, only with weaker memory buses. I mean, yeah, they'll probably have more than the 4GB I have now, but I doubt it will be HBM, meaning there is no reason for me to replace what I have until the big hitters arrive - and I have a feeling we will be talking £1,500.
     
  7. Guest-16

    Guest-16 Guest

    Yeah, they won't be HBM, and yeah, they will be expensive - 16nm, HBM2 and interposers are very expensive $$$. I'm not even sure you'll see GP100 on a gaming card, tbh.
     
  8. IanW

    IanW Grumpy Old Git

    Joined:
    2 Aug 2003
    Posts:
    9,216
    Likes Received:
    2,725
    So... I guess my twin 780 Tis will have to suffice for another year, then. :sigh:
     
  9. Harlequin

    Harlequin Modder

    Joined:
    4 Jun 2004
    Posts:
    7,131
    Likes Received:
    194
    We will - but it'll be 14nm and SK Hynix HBM2 ;)
     
  10. Guest-16

    Guest-16 Guest

    We haven't seen precisely what Vega will be yet, apart from 4,096 shaders. AMD have clearly marked it on their roadmap, but GloFo have yet to make a SoWoP (interposer) product to my knowledge - let alone a BIG one. (The Fury series was TSMC.) I can't see Vega being anywhere near 600mm² like GP100 is.

    The question is also: if they're GDDR5 rather than GDDR5X, should we expect a faster 5X refresh within ~6 months, along with GP100/Vega in Q1'17?

    rollo - price depends on performance relative to the previous gen and on the Polaris 10 competition. If there's no high-end competition, expect a price hike.
     
  11. Guest-16

    Guest-16 Guest

    bitsnchips is claiming the 1080 gets GDDR5X while the 1070 gets faster 8GHz (vs the current 7GHz) GDDR5.

    http://www.bitsandchips.it/9-hardwa...due-pcb-reference-base-gddr5-e-premium-gddr5x
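
    As a quick back-of-the-envelope sketch of what those data rates mean (a minimal Python snippet; the 256-bit bus assumed for GP104 below is a guess, not a confirmed spec):

        # Peak memory bandwidth = (bus width in bytes) * effective data rate.
        # The 256-bit GP104 bus is an assumption, not a confirmed spec.
        def bandwidth_gbs(bus_bits, effective_gbps):
            return (bus_bits / 8) * effective_gbps

        print(bandwidth_gbs(256, 7))   # current 7GHz GDDR5   -> 224.0 GB/s
        print(bandwidth_gbs(256, 8))   # rumoured 8GHz GDDR5  -> 256.0 GB/s
        print(bandwidth_gbs(256, 10))  # GDDR5X at 10Gbps     -> 320.0 GB/s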

    Here's my guess based on the GP100 core, excluding all the connectivity and the L2 cache size, which will obviously change:
    [Image: speculative spec table for the Pascal lineup, extrapolated from the GP100 core]

    Nvidia will likely keep its core elements unified to simplify physical design and drivers, so 640 shaders/10 SMs per cluster will be the norm, but the physical size of each cluster will be smaller as the compute elements are removed (a la GK110b vs GK104 SMX), giving Nvidia a smaller-die advantage. I had originally put Nvidia down for a two-cluster GP106; however, that leaves a BIG gap between GP104 and GP106, and that is a prime target for price-performance.
    So I expect Nvidia will go aggressive: it'll be three clusters with up to 1,920 shaders - nearly a 100% increase over the current-gen x60/x50 series - driving its mainstream performance hard and bringing VR-capable territory down under $200. It could be we get a cut-down 1,600-1,800-shader chip first, then the full 1065/Ti later. I also expect this because GP106 will be the prime fit for notebooks (~1070M), so the silicon will be 192-bit-bus native, but on desktop it will be cut to 128-bit as an upsell drive to 1070 sales. Finally, the big 1,000-shader gap up to GP100 is backed by HBM2 and double the memory capacity, which will offer a large to very large performance jump. It'll be north of $1,000, possibly $1,500 this time. Vega might keep their feet on the ground a bit, but Nvidia will want the price as high as it can get away with here, because yields will be low.
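
    To put rough numbers on that "nearly 100%" claim (a minimal Python sketch; the Pascal figures are this thread's guesses, with the GTX 960's published shader count for comparison):

        # Cluster arithmetic behind the speculated GP106 shader count.
        # Pascal figures are guesses from this thread, not confirmed specs.
        shaders_per_cluster = 640                  # speculated Pascal cluster layout
        gp106_shaders = 3 * shaders_per_cluster    # aggressive three-cluster guess

        gtx960_shaders = 1024                      # GM206, current x60 part
        increase = (gp106_shaders / gtx960_shaders - 1) * 100
        print(f"{gp106_shaders} shaders, +{increase:.0f}% over the GTX 960")
        # -> 1920 shaders, +88% over the GTX 960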
     
    Last edited by a moderator: 12 Apr 2016
  12. rollo

    rollo Modder

    Joined:
    16 May 2008
    Posts:
    7,887
    Likes Received:
    131
    Those figures, if accurate, would suggest around 980 Ti performance, depending on core speed and memory.
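
    A rough way to see how much the core speed matters (a minimal Python sketch; the 2,560-shader GP104 and its clocks are hypothetical, while the 980 Ti figures are Nvidia's published specs):

        # Peak FP32 throughput = shaders * 2 ops/clock (FMA) * clock.
        # The GP104 shader count and clocks are hypothetical;
        # the 980 Ti figures are Nvidia's published specs.
        def tflops(shaders, clock_mhz):
            return shaders * 2 * clock_mhz / 1e6

        gtx980ti = tflops(2816, 1075)         # ~6.1 TFLOPS at boost clock
        for clock_mhz in (1200, 1400, 1600):  # hypothetical GP104 clocks
            print(f"2560 shaders @ {clock_mhz} MHz: "
                  f"{tflops(2560, clock_mhz):.1f} TFLOPS "
                  f"(980 Ti: {gtx980ti:.1f})")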
     
  13. Harlequin

    Harlequin Modder

    Joined:
    4 Jun 2004
    Posts:
    7,131
    Likes Received:
    194
    I think someone needs to tell Micron, then, that their own announcement two weeks ago - that they have sampled GDDR5X, with mass production STARTING 'in the summer' (i.e. July onwards) - must be wrong.

    http://www.anandtech.com/show/10193/micron-begins-to-sample-gddr5x-memory


    So either we won't see a GDDR5X GTX 1080 before September, or it'll be GDDR5: add memory to the design, test and qualify the design, start volume production. With Computex six weeks away, it's again either a paper launch or the products are GDDR5.
     
  14. Guest-16

    Guest-16 Guest

    :thumb: Core speeds should be faster than ever, but you will get even less voltage overhead for manual OC.

    Harlequin - mass production in July syncs with what the leaks are saying: announce in June, on sale in July. It also totally depends on the volume of samples. Having just the 1080 - an 'expensive' card - on 5X could be doable with limited availability to start with (just like AMD with HBM), given that "sampling" HBM2 is being fitted to GP100s as well.
     
  15. Harlequin

    Harlequin Modder

    Joined:
    4 Jun 2004
    Posts:
    7,131
    Likes Received:
    194
  16. Guest-16

    Guest-16 Guest

    Not for a year at ASUS, which means I get to speculate openly yayay. If I was still working there I'd just nip upstairs to ask and look at engineering samples :p

    Adapting production lines for GDDR5X is significantly easier than the change of technology required for HBM, though.
     
    Last edited by a moderator: 12 Apr 2016
  17. Harlequin

    Harlequin Modder

    Joined:
    4 Jun 2004
    Posts:
    7,131
    Likes Received:
    194
    Whilst it is easier to adapt per se, other forums have shown it is still a real job, as it requires a different layout etc. - the traces are different. The card needs to be built from the ground up for 5X.
     
  18. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    The chip needs to be different, but unlike HBM it does not require a process change to mount dies onto an interposer along with additional chips. Depending on the soldering chemistry between the die and the interposer, that can even involve some of the chemistry used in fabbing the die itself to achieve a good bond.
     
  19. Guest-16

    Guest-16 Guest

    Report here on the supposed new chip IDs.
    http://www.hwbattle.com/bbs/board.php?bo_table=news&wr_id=18732

    GP 104-400 Reference -> AIB
    GP 104-200 Reference -> AIB
    GP 104-150 AIB

    Since the 150 goes straight to AIBs, I would say this is possibly a cut-down 1060 Ti/1065 model. I'm not sure at all - too much depends on Nvidia's yields, costs, GP106 timeline etc. The silicon cost is fixed, so it will be interesting to see how much Nvidia cuts it back.
     
  20. N17 dizzi

    N17 dizzi Multimodder

    Joined:
    23 Mar 2011
    Posts:
    3,234
    Likes Received:
    356
    It is speculation, obviously, but if the 1080 = 980 Ti performance, what will be the incentive to "upgrade"?

    Lower cost? Slightly better performance?
     
