
Graphics Pascal and Polaris Discussion Thread

Discussion in 'Hardware' started by Parge, 25 Aug 2015.

  1. phinix

    phinix RIP Waynio...

    Joined:
    28 Apr 2006
    Posts:
    6,000
    Likes Received:
    98
    So it looks like the lower cards will come with GDDR5, then the 1080 (the 980 equivalent, with 980 Ti performance) will come with GDDR5X, then next year we should see a 1080 Ti with HBM2.
    Is that correct according to all the rumours?
     
  2. Guest-16

    Guest-16 Guest

    Looks like GP100 will never be used as a graphics chip, as it doesn't have any ROPs. Nvidia's whitepaper doesn't mention any, and that's an official technical document that should detail all the parts rather than selective marketing. ROPs are a big transistor and space drain, so if Nvidia is focusing on a compute-only core it makes sense to leave them out, since they'd never be used for compute tasks. We can only guess they're cooking up a GP101/200 with cut-back FP64 and added ROPs to match Vega? Who knows.

    http://images.nvidia.com/content/pdf/tesla/whitepaper/pascal-architecture-whitepaper.pdf

    Which is why we're not seeing the big jump in SM count we expected - it's all going into registers and caches (power saving/efficiency). Have we hit peak shader?
     
  3. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    I've not looked at the PDF, but wouldn't they just have disabled the ROPs and some other things?
    It just seems odd to have two separate designs rather than just enabling or disabling what you do or don't need.
     
  4. Guest-16

    Guest-16 Guest

    Could well be, unless they wanted to absolutely maximize the available area with compute and not waste it with knowingly disabled parts. Depends how they approach the whitepaper.
     
  5. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    It may be that Maxwell's shaders were underutilised, so doubling the available warp schedulers and cache should allow more of the shaders to be in active use at any one time.

    IIRC, ROPs are usually joined to the memory controller (and sit between it and the crossbar along with the L2 cache) rather than being part of the SMs. It's possible they've been omitted for P100, or they may just be disabled for binning, or simply present but inactive. As far as I'm aware, neither Nvidia nor AMD has ever designed a GPU without any ROPs, so it would be surprising to create a GPU lacking them entirely. GPU compute still treats compute operations through the eyes of graphical operations, so if the ROPs were removed, some other function block would need to be designed to do many of the same jobs.
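    As a rough, hypothetical illustration of the register/scheduler point (nothing from the whitepaper - the kernel and launch numbers below are made up), you can ask the CUDA runtime how many blocks of a given kernel one SM can keep resident. That occupancy figure is what the extra registers, caches and schedulers are really about: more resident warps per SM gives the schedulers more work to hide latency with.

        // Minimal CUDA occupancy sketch - names and sizes are illustrative only.
        #include <cstdio>
        #include <cuda_runtime.h>

        __global__ void saxpy(int n, float a, const float *x, float *y)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n)
                y[i] = a * x[i] + y[i];   // one FMA per thread; deliberately register-light
        }

        int main()
        {
            int blockSize = 256;   // threads per block (assumed)
            int blocksPerSm = 0;

            // How many blocks of this kernel fit on one SM, given its register
            // and shared-memory footprint?
            cudaOccupancyMaxActiveBlocksPerMultiprocessor(&blocksPerSm, saxpy, blockSize, 0);

            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, 0);
            printf("Resident blocks/SM: %d, warps/SM: %d (hardware max %d)\n",
                   blocksPerSm,
                   blocksPerSm * blockSize / prop.warpSize,
                   prop.maxThreadsPerMultiProcessor / prop.warpSize);
            return 0;
        }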
     
  6. Guest-16

    Guest-16 Guest

  7. Harlequin

    Harlequin Modder

    Joined:
    4 Jun 2004
    Posts:
    7,131
    Likes Received:
    194
    That's reminded me - Nvidia started pushing FP16 as a performance `fix` for the GeForce FX....
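    To put a number on what that kind of `fix` buys: on parts with native FP16 (GP100 is supposed to have it), two 16-bit values pack into one 32-bit register, so FP16 maths can run at up to twice the FP32 rate, at reduced precision. A minimal, hypothetical CUDA sketch of the trade-off (kernel names are mine, not from any Nvidia doc):

        #include <cuda_fp16.h>

        __global__ void scale_fp32(int n, float a, float *y)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n)
                y[i] = a * y[i];             // one FP32 multiply per element
        }

        // half2 packs two 16-bit values per 32-bit register, so hardware with
        // native FP16 can issue two multiplies per instruction - double the
        // rate, at the cost of precision.
        __global__ void scale_fp16(int n, __half2 a, __half2 *y)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n / 2)                   // each thread handles a packed pair
                y[i] = __hmul2(a, y[i]);
        }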
     
  8. rollo

    rollo Modder

    Joined:
    16 May 2008
    Posts:
    7,887
    Likes Received:
    131
    Anyone think the launch may be sooner than we expect? Finding an Nvidia 980 in stock is hard work. Scan had about 30 models two weeks back; they now have 2 in stock. Overclockers have 0.
     
  9. Guest-16

    Guest-16 Guest

    EDIT: I was probably wrong!

    I hope so! Maybe GDDR5 versions first, with 5X later: there was a leak of the MSI card out there already. Either that or everyone cut inventory management too close.
     
    Last edited by a moderator: 27 Apr 2016
  10. Deders

    Deders Modder

    Joined:
    14 Nov 2010
    Posts:
    4,053
    Likes Received:
    106
    I thought the whole DX9 shader debacle was down to AMD actually listening to Microsoft with their DX9 spec and making cards that handled 24-bit shaders well. Bit groggy now, but IIRC it was Nvidia's insistence on using 32-bit that somehow made their pipelines suboptimal for processing the 24-bit shaders, and why HL2 had to run in DX8 mode on FX cards.
     
  11. Guest-16

    Guest-16 Guest

    Yes... you've scratched some grey cells now - that does sound more familiar. Damn, my memory is so poor.
     
  12. Yaka

    Yaka Multimodder

    Joined:
    26 Jun 2005
    Posts:
    2,285
    Likes Received:
    388
    Yep, that was the only time I had AMD cards. I think it was the 9800 Pro and the 9500 Pro. Think the 9500 Pro was very tweakable and could be flashed with the 9800 BIOS. Oh, and weren't they just ATI at that time?
     
  13. Deders

    Deders Modder

    Joined:
    14 Nov 2010
    Posts:
    4,053
    Likes Received:
    106
    It's amazing, even at times when my brain fails in helping me do even simple tasks, I can still explain things like that.

    Yes they were.
     
  14. Guest-16

    Guest-16 Guest

    I easily mix up memories. The FX series was before I started reviewing full time. DX9.0c with HDR was ~'06, but I wasn't doing graphics cards, so it's hard to remember. Talk to me about motherboards and CPUs and I'll remember that.

    Yea, so the FX series couldn't do 24-bit, and the 6000 series that followed had 32-bit but couldn't do 24-bit either. The ATI 9700 and X800 series both had 24-bit, so they went from good to outdated quickly as the shader model went from 9.0b to 9.0c. You couldn't tell the difference between integer (ALU) based HDR processing and FPU based, though, but the FPU performance hit was high. In those days you could wind up the Nvidia PR so much he'd spin for days. :lol:

    I had a 9700 LE which I replaced with an FX 5950 Ultra (thank you, Gainward). It wasn't bad actually. Played UT2k3 very nicely. Quickly swapped it for a 6800 though.
     
  15. Deders

    Deders Modder

    Joined:
    14 Nov 2010
    Posts:
    4,053
    Likes Received:
    106
    There must have been so much to remember in those ten years.

    Very interesting read, I didn't know that much detail about HDR beforehand. I do seem to remember though that the FX series didn't run DX9 mode very well in HL2 before HDR was implemented.

    There was a point that I could play it in DX9 on my 5700
    (I remember reading the advice to wait a couple of weeks before buying a card, but at the time I figured I would have probably spent the money by then so I went with a 256MB model that I later found out ran the memory at half the speed. Thanks Gainward.... Lesson learned)
    But it couldn't properly render anything above the water when Gordon was submerged. It played UT2003 very well, though. From what I remember, the only noticeable difference I saw between the FX and my old GeForce 3 was the moving blue effect in the health pickups.

    A quick google search brought me to this article...
     
    Last edited: 28 Apr 2016
  16. Kronos

    Kronos Multimodder

    Joined:
    6 Nov 2009
    Posts:
    13,495
    Likes Received:
    618
    Rather than go through this long thread: is there a definitive release date for the GTX 1080? I'm looking to purchase a 980 Ti and I've been advised that prices for it are likely to fall.
    But I don't do waiting very well, particularly when the waiting time is unknown.
     
  17. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    Short answer is no.

    Long answer is that rumours and speculation are pointing towards something being shown off at Computex (May 31st - June 4th), although when there's going to be stock available is unknown, as it could be a paper launch. Having said that, rumours also point to Nvidia having already stopped production on some Maxwell products like the 980.
     
  18. Kronos

    Kronos Multimodder

    Joined:
    6 Nov 2009
    Posts:
    13,495
    Likes Received:
    618
    Thanks, in that case I won't incorporate the next gen into my research and will just plan on getting an EVGA 980 Ti sooner rather than later.
     
  19. Guest-16

    Guest-16 Guest

    Something will happen June-July.
     
  20. LennyRhys

    LennyRhys Fan Fan

    Joined:
    16 May 2011
    Posts:
    6,398
    Likes Received:
    887
    Yep late 2006 was when ATI had the X1900 and released the X1950 series cards. I remember that time well because that's when I really started getting into hardware/PC building and I always had to get one better than my mates. When one of my mates bought the X1950Pro (AGP, no less) I managed to get an X1950XT.

    Despite ATI having just released their flagship DX9 cards, the 8800GTX was released one month later in Nov 2006, and with it the first hardware support for DX10.

    Fond memories! :lol:
     
