Graphics What should I do? (980ti, Fury X)

Discussion in 'Hardware' started by GeorgeK, 17 Jun 2015.

Thread Status:
Not open for further replies.
  1. GeorgeK

    GeorgeK Swinging the banhammer Super Moderator

    Joined:
    18 Sep 2010
    Posts:
    8,705
    Likes Received:
    515
    I think I'll wait to see if this comes true :thumb:
     
  2. David

    David μoʍ ɼouმ qᴉq λon ƨbԍuq ϝʁλᴉuმ ϝo ʁԍɑq ϝμᴉƨ

    Joined:
    7 Apr 2009
    Posts:
    17,495
    Likes Received:
    5,907
    I honestly don't think the Fury X has done enough to spark a price war - If I was in the market for a new card, I'd probably go for the 980Ti. Of course, this could just be Nvidia turning the screw to kill off potential interest in the Fury X.

    I don't see them dropping the 970 or 980 by much - the top end 300 series is nothing new and still runs as hot and hungry as ever, and the middle order 300 series has been rebranded twice over, by the look of things.

    The fact that Nvidia can consider a blanket price drop just underlines how much price gouging they've gotten away with, IMO. Though I doubt AMD would be any different if they were in a position of strength.
     
  3. GeorgeK

    GeorgeK Swinging the banhammer Super Moderator

    Joined:
    18 Sep 2010
    Posts:
    8,705
    Likes Received:
    515
    I think it's close enough to cause a small price drop but not a lot. Either that or nvidia will price match the Fury X and take all of its sales haha
     
  4. David

    David μoʍ ɼouმ qᴉq λon ƨbԍuq ϝʁλᴉuმ ϝo ʁԍɑq ϝμᴉƨ

    Joined:
    7 Apr 2009
    Posts:
    17,495
    Likes Received:
    5,907
    Yeah, a £30-£50 drop on the 980Ti would probably turn a few heads. That article suggests they are focussing on the European market with the reduction, due to excessive gouging over here, no doubt!
     
  5. GeorgeK

    GeorgeK Swinging the banhammer Super Moderator

    Joined:
    18 Sep 2010
    Posts:
    8,705
    Likes Received:
    515
    nvidia dropped the msrp of the 980ti everywhere else almost straight away but not in europe... maybe they're just going to bring it in line with 'murica etc
     
  6. loftie

    loftie Multimodder

    Joined:
    14 Feb 2009
    Posts:
    3,173
    Likes Received:
    262
    I saw the reference Ti on Overclockers for the same price as the Fury. Tbh, the Fury is a good card, but only at higher resolutions - shame it's not consistent at lower ones. Currently there's no point going for it over the Ti, though.

     
  7. damien c

    damien c Mad FPS Gamer

    Joined:
    31 Aug 2010
    Posts:
    3,004
    Likes Received:
    255
    Seems nVidia have dropped the price of the Reference GTX980Ti at Overclockers but it doesn't seem to have dropped anywhere else that I have checked just yet.

    http://www.overclock3d.net/articles...i_reduced_to_509_99_to_combat_the_r9_fury_x/1

    Just makes it even better for those who want raw performance, never mind the graphical advantages that GameWorks brings - I know it's a sore topic with AMD users, but for those on nVidia it's quite nice.
     
  8. Cei

    Cei pew pew pew

    Joined:
    22 Mar 2008
    Posts:
    4,714
    Likes Received:
    122
    I'm finding the Fury X interesting for what it does do well. HBM clearly offers benefits at 4K when dealing with huge textures, but the Fury X suffers at lower resolutions due to AMD's old architecture and core designs. I'm left wondering what NVIDIA's card with HBM2 will look like, when you couple the excellent 4K performance with simply better core design.

    In other words, I think the HBM2 card is going to be a screamer.
     
  9. Kovoet

    Kovoet What's a Dremel?

    Joined:
    26 Aug 2009
    Posts:
    7,128
    Likes Received:
    348
    Unfortunately this card has got my interest going again. Might just wait to see prices and sell both 970's and go with this card.
    Then hang on to what I've got for at least a year.
     
  10. David

    David μoʍ ɼouმ qᴉq λon ƨbԍuq ϝʁλᴉuმ ϝo ʁԍɑq ϝμᴉƨ

    Joined:
    7 Apr 2009
    Posts:
    17,495
    Likes Received:
    5,907
    You're a bloody nightmare, man!
     
  11. Krazeh

    Krazeh Minimodder

    Joined:
    12 Aug 2003
    Posts:
    2,124
    Likes Received:
    56


    While it is true that HBM has benefits, it would appear from reviews that it's not necessarily enough to offset the Fury X only having 4GB of RAM. Hexus, for example, found that the 980 Ti produced smoother gameplay at 4K in several games, despite slightly lower average FPS.
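    The Hexus result is exactly the average-FPS-versus-frame-times distinction. As a rough sketch (the frame times below are made-up illustrative numbers, not review data), two cards with similar averages can feel very different once you look at the worst frames:

```python
def avg_fps(frame_times_ms):
    """Average FPS implied by a list of per-frame render times (ms)."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def p99_frame_time(frame_times_ms):
    """Simple 99th-percentile frame time (ms) - the 'worst' frames."""
    ordered = sorted(frame_times_ms)
    return ordered[int(round(0.99 * (len(ordered) - 1)))]

# Hypothetical card A: steady 25 ms frames (smooth, but lower average).
# Hypothetical card B: faster on average, with periodic 60 ms spikes -
# those spikes are what shows up as stutter despite the higher average.
card_a = [25.0] * 100
card_b = [20.0] * 95 + [60.0] * 5

print(avg_fps(card_a), p99_frame_time(card_a))  # 40.0 25.0
print(avg_fps(card_b), p99_frame_time(card_b))  # ~45.5 60.0
```

    This is roughly the idea behind the frame-time metrics that sites like TechReport popularised: the average hides the spikes, the percentile exposes them.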



    I agree, though, with your view that NVIDIA's next-gen cards (Pascal, I think?) will potentially be very impressive - especially if their stacked DRAM approach to HBM works as well as they claim it will.
     
  12. Cei

    Cei pew pew pew

    Joined:
    22 Mar 2008
    Posts:
    4,714
    Likes Received:
    122
    Last I heard, HBM2 will be 8GB limit, which sits about right for me, and should be what Pascal uses. Essentially AMD have been the guinea pigs, showing that HBM has its benefits, but needs improvement in some areas and needs to be coupled to fast cores, not just lots of lower performance ones.
     
  13. Krazeh

    Krazeh Minimodder

    Joined:
    12 Aug 2003
    Posts:
    2,124
    Likes Received:
    56
    As I understand it, NVIDIA will be pursuing a different approach to HBM by using stacked DRAM directly on the GPU rather than an interposer. Will be interesting to see what benefits that brings and how much RAM they can stack. Although I'm sure I've read somewhere that they hinted a Pascal-based Titan card could have up to 32GB.
     
  14. law99

    law99 Custom User Title

    Joined:
    24 Sep 2009
    Posts:
    2,390
    Likes Received:
    63
    I think Techreport probably covered this better - frame time variance is the culprit. Probably fixable with driver updates.
     
    Last edited: 25 Jun 2015
  15. rollo

    rollo Modder

    Joined:
    16 May 2008
    Posts:
    7,887
    Likes Received:
    131
    They are meant to have a driver update due around mid-July. But if that's the case, why did they not delay? They could have had some better reviews.
     
  16. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    Do you have a source on this? The HBM2 demonstrator with Pascal uses an interposer, and stacking HBM2 modules directly on the GPU die would have a number of issues:
    - Thermal dissipation problems for the GPU (and added heating for the HBM modules)
    - Mechanical clamping strain on the solder balls connecting the HBM modules and GPU (there's enough trouble from in-package stacking resulting in failures for laptop GPUs)
    - Lack of area on top of the GPU die for HBM modules
    - Utilising TSV technology in a GPU (has not yet been demonstrated)
    - Utilising metallised connections on BOTH sides of the die (has not been demonstrated anywhere, ever, outside of HBM stack elements)
    - Wasted GPU die area for TSVs, resulting in either a larger and more expensive die or less area for the actual GPU
     
  17. law99

    law99 Custom User Title

    Joined:
    24 Sep 2009
    Posts:
    2,390
    Likes Received:
    63
    The worst thing you can often do is nothing. Delaying is just nothing until the new drivers are out. During that time, they could have made sales from their loyal consumers. Most reviews are happy with this card also. It's just a shame AMD released those benchmark figures, and the card doesn't seem overclockable...
     
  18. GeorgeK

    GeorgeK Swinging the banhammer Super Moderator

    Joined:
    18 Sep 2010
    Posts:
    8,705
    Likes Received:
    515
    I've come to the decision that I'm going to stick with my SLI 780s for now - having compared my benchmark results on Heaven and 3DMark with some of the results coming out now my setup only gets beaten by a 980ti at 4K which I don't intend to move to any time soon. Power consumption obviously would be better but meh - the only thing I'm tempted to do in order to lower my temps is fit a couple of HG10s to my cards...
     
  19. rainbowbridge

    rainbowbridge Minimodder

    Joined:
    26 Apr 2009
    Posts:
    3,171
    Likes Received:
    69
    I hope to get my 980Ti soon; Scan shows an ETA of the 31st. They've had my pre-order for a number of weeks now, so hopefully their stock management is good - they had nearly 100 pre-orders for the Hybrids.

    Re: why go Nvidia... this was my reasoning for going Nvidia for the next 12 months (as VR comes in).

    Reading this makes me feel absolutely wonderful about going NVidia.


    I never thought I'd say it (seriously, never), but I expect to go SLI in the future - that's pretty much certain with VR needing 90 frames per second.



    Virtual Reality (VR) is poised to change the way we play and experience games by placing us in the games, directly in the action, with a full 3D world all around us. As your head moves your view changes, and as you use special VR controllers in compatible games your avatar reaches out and naturally uses their hands to interact with objects. It sounds like a gimmick, but once you’ve experienced the unprecedented immersion and realism that VR delivers you’ll ‘get’ it, and you’ll instantly want your very own VR headset.

    For VR to deliver that immersive experience a considerable amount of GPU processing power is required. As an example, the upcoming Oculus Rift will run at 2160x1200, at 90 frames per second, which is a little over three times as demanding as a 1920x1080 monitor running at 30 frames per second. If a consistent 90 frames per second isn’t achieved the game will stutter and input lag will occur, ruining the experience, straining eyes, and potentially causing headaches.

    To this end, NVIDIA has been working to increase performance and improve the VR experience.
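    A quick sanity check on the numbers in that quote - the raw pixel-rate ratio and the per-frame time budget at 90 FPS work out as follows (pixel counts are only a crude proxy for rendering cost, so treat this as back-of-envelope arithmetic):

```python
# Pixels per second for the Rift's 2160x1200 @ 90 Hz versus a
# 1920x1080 monitor at 30 FPS, plus the per-frame budget at 90 FPS.
rift_rate = 2160 * 1200 * 90        # 233,280,000 pixels/second
monitor_rate = 1920 * 1080 * 30     # 62,208,000 pixels/second

ratio = rift_rate / monitor_rate    # raw pixel-throughput ratio
frame_budget_ms = 1000.0 / 90       # time available to render each frame

print(round(ratio, 2), round(frame_budget_ms, 1))  # 3.75 11.1
```

    The raw ratio comes out a little higher than the quote's "a little over three times", presumably because pixel count alone isn't the whole rendering cost - but either way, the game has only about 11 ms per frame to do everything.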



    Further reading... this is a really interesting page by Nvidia:

    http://www.geforce.co.uk/hardware/technology/vr/technology





    GeForce GTX GPUs

    2nd Generation NVIDIA Maxwell-based GeForce GTX GPUs are built to deliver the raw frame rates and high resolution required for demanding VR experiences. With full support for the next generation DirectX 12 graphics API, and a Maxwell multi-projection architecture that enables new rendering techniques for VR, GeForce GTX GPUs are designed for amazing VR experiences.

    GameWorks VR

    To help improve the VR experience, we’ve created “GameWorks VR”, a software development kit (SDK) for VR headsets and game developers that improves performance, reduces latency, and improves compatibility. GameWorks VR includes the following technologies:

    Multi-Res Shading: An innovative new rendering technique for VR whereby each part of an image is rendered at a resolution that better matches the pixel density of the final ‘warped’ image (VR images are an oval instead of a rectangle, as seen on a desktop monitor). By using Maxwell’s multi-projection architecture to render multiple resolutions in a single pass, substantial performance improvements are seen in VR games.
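    The Multi-Res Shading description can be made concrete with some back-of-envelope numbers. The region split and scale factor below are made-up illustrative values, not NVIDIA's actual settings:

```python
# Per-eye resolution on the Rift (2160x1200 split across two eyes).
eye_w, eye_h = 1080, 1200

full = eye_w * eye_h                 # pixels if everything is full-res

# Assume the central 60% x 60% of the view keeps full resolution and
# the surrounding border is shaded at half resolution on each axis
# (half resolution per axis = a quarter of the pixels).
centre = int(eye_w * 0.6) * int(eye_h * 0.6)
border = (full - centre) * 0.25

shaded = centre + border
print(round(shaded / full, 2))       # 0.52 - fraction of pixels shaded
```

    Shading roughly half the pixels while keeping the centre of the view sharp is the kind of saving that makes the 90 FPS target much more reachable.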
     
  20. damien c

    damien c Mad FPS Gamer

    Joined:
    31 Aug 2010
    Posts:
    3,004
    Likes Received:
    255
    So I wonder how long it will be after the 1st GameWorks VR title launches before AMD start crying and saying they cannot optimise the game because of GameWorks.

    Still not sure about VR myself but will be interesting to see what it's like when it finally lands with decent games, and the hardware can produce smooth gameplay.
     
Thread Status:
Not open for further replies.

Share This Page