
News: Nvidia ends support for DirectX 10 GPUs

Discussion in 'Article Discussion' started by Gareth Halfacree, 14 Mar 2014.

  1. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,066
    Likes Received:
    6,610
  2. AlienwareAndy

    AlienwareAndy What's a Dremel?

    Joined:
    7 Dec 2009
    Posts:
    3,420
    Likes Received:
    70
    Can't say I blame them. They release cards like chit through a goose, so supporting all of them with one driver must be a right old Peter.

    AMD ended support for the DX10 cards they made a while back too, removing any Crossfire support in later drivers.

    I still think GPU producers should have separate teams of devs writing drivers for one specific core architecture (e.g. Kepler, Fermi).
     
  3. Baz

    Baz I work for Corsair

    Joined:
    13 Jan 2005
    Posts:
    1,810
    Likes Received:
    92
    The GTX 280 was one of the first products I reviewed at bit-tech, and I still have bit's old GTX 275 collecting dust on a shelf. Goodbye, old soldiers.
     
    N17 dizzi likes this.
  4. badders

    badders Neuken in de Keuken

    Joined:
    4 Dec 2007
    Posts:
    2,642
    Likes Received:
    74
    I'm still running a GTS 250!
     
  5. AlienwareAndy

    AlienwareAndy What's a Dremel?

    Joined:
    7 Dec 2009
    Posts:
    3,420
    Likes Received:
    70
    They were utter crap.
     
  6. Gundam God

    Gundam God What's a Dremel?

    Joined:
    9 Sep 2006
    Posts:
    26
    Likes Received:
    0
    Still running an 8800 GT GS. Still don't have any real reason to upgrade, though I might update the drivers soon; I can't actually remember the last time I did that.
     
  7. bawjaws

    bawjaws Multimodder

    Joined:
    5 Dec 2010
    Posts:
    4,266
    Likes Received:
    865
    My 8800GT died of old age about a year ago. That was an awesome card; I'd still happily be running it now had it not expired :(
     
  8. schmidtbag

    schmidtbag What's a Dremel?

    Joined:
    30 Jul 2010
    Posts:
    1,082
    Likes Received:
    10
    If any of you are using Linux, Nvidia have simply moved the DX10 cards to their "legacy" drivers there. They still update those drivers, just not as frequently. So if you'd like to keep using those GPUs in the years to come, this would be a good opportunity to give Linux a shot. You're obviously most likely going to get a better experience with newer GPUs, but Linux works very nicely with Nvidia; in fact, they still support the GeForce 6 and 7 series. (There's a rough sketch of how to check which legacy branch covers a given card at the end of this post.)

    I personally still own a 7900GTO, but it's currently in an anti-static bag sitting on a shelf. I wish it were CUDA-compatible, because it'd make a great GPGPU: the core is still pretty good even by today's standards, but 256MB of VRAM really isn't enough to play games, though it's sufficient for non-gaming purposes.
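    For the curious, here's a rough sketch of how you could check which legacy branch covers a given card under Linux. It's my own illustration, not an Nvidia tool: the lspci name patterns and the branch mapping (304.xx for GeForce 6/7, 340.xx for the DX10-era GeForce 8/9 and GTX 200/300 parts) are assumptions you should verify against Nvidia's own support list.

    [code]
    // Guess the Nvidia legacy driver branch for the installed GPU.
    // POSIX-only (relies on popen + lspci). The name patterns and the
    // branch mapping below are assumptions; verify against Nvidia's list.
    #include <cstdio>
    #include <regex>
    #include <string>

    int main() {
        FILE* pipe = popen("lspci", "r");  // list PCI devices
        if (!pipe) { std::perror("lspci"); return 1; }

        char buf[512];
        std::string gpu;
        while (std::fgets(buf, sizeof buf, pipe)) {
            std::string line(buf);
            // Keep the first NVIDIA VGA controller we find (older lspci
            // versions may print "nVidia" instead, so adjust if needed).
            if (line.find("VGA") != std::string::npos &&
                line.find("NVIDIA") != std::string::npos) { gpu = line; break; }
        }
        pclose(pipe);

        if (gpu.empty()) { std::puts("No NVIDIA GPU found via lspci."); return 0; }
        if (gpu.back() == '\n') gpu.pop_back();

        if (std::regex_search(gpu, std::regex("GeForce [67][0-9]{3}")))
            std::printf("%s -> legacy branch 304.xx (GeForce 6/7)\n", gpu.c_str());
        else if (std::regex_search(gpu, std::regex(
                     "GeForce (8|9)[0-9]{3}|G(TX|TS|T) [23][0-9]{2}")))
            std::printf("%s -> legacy branch 340.xx (DX10-era cards)\n", gpu.c_str());
        else
            std::printf("%s -> likely still on the mainline driver\n", gpu.c_str());
        return 0;
    }
    [/code]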
     
  9. Umbra

    Umbra What's a Dremel?

    Joined:
    18 Nov 2013
    Posts:
    636
    Likes Received:
    17
    Still use my BFG 8800 in a backup/emergency PC. I bought it new for £140 and fitted a Zalman heatsink/fan to it, and played over 1,000 hours of Oblivion on it. Brilliant card, completely useless for today's AAA games of course :lol: but, ah, the memories.

    [image: photo of the BFG 8800]
     
  10. maverik-sg1

    maverik-sg1 Minimodder

    Joined:
    18 Aug 2010
    Posts:
    371
    Likes Received:
    1
    The only surprise to me is that they didn't discontinue them sooner; presumably console ports were propping up the driver support?

    I always perceived DX10 as a bit of a misfire: not a massive jump over DX9, so developers didn't invest in it. DX11's addition of tessellation allowed console ports to be easier on the eyes. We'll finally say goodbye to DX9 games over the next 12-18 months in favour of DX11 console ports, though... which should keep us going for the next 4 years lol.
     
  11. schmidtbag

    schmidtbag What's a Dremel?

    Joined:
    30 Jul 2010
    Posts:
    1,082
    Likes Received:
    10
    I too am surprised they didn't do this sooner. The reason DX10 was a misfire is that MS didn't release DX10 for Windows XP, and ATI took a relatively long time to add support for it. Also, the consoles didn't support the technology DX10 offered, which didn't help much either.

    DX11 worked out because people were actually willing to switch to Windows 7, and the hardware was a big enough performance jump over the DX10-generation products. But even DX11 is still relatively uncommon.
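    As an aside, the "DX10 card" versus "DX11 card" split is just the highest Direct3D feature level the GPU reports (10_0/10_1 versus 11_0). Here's a minimal Windows/C++ sketch of how an app queries it; it assumes the D3D11 runtime is installed and is purely illustrative.

    [code]
    // Query the highest Direct3D feature level the default GPU supports.
    // Passing null device/context pointers asks the runtime to just report
    // the feature level without fully creating a device.
    #include <windows.h>
    #include <d3d11.h>
    #include <cstdio>
    #pragma comment(lib, "d3d11")  // MSVC convenience; otherwise link d3d11.lib

    int main() {
        const D3D_FEATURE_LEVEL requested[] = {
            D3D_FEATURE_LEVEL_11_0,  // a "DX11 card"
            D3D_FEATURE_LEVEL_10_1,  // a "DX10.1 card"
            D3D_FEATURE_LEVEL_10_0,  // a "DX10 card" (GeForce 8/9/200/300)
        };
        D3D_FEATURE_LEVEL got = {};
        HRESULT hr = D3D11CreateDevice(
            nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
            requested, sizeof requested / sizeof requested[0],
            D3D11_SDK_VERSION, nullptr, &got, nullptr);
        if (SUCCEEDED(hr))
            std::printf("Highest supported feature level: 0x%x\n", (unsigned)got);
        else
            std::printf("D3D11CreateDevice failed: 0x%lx\n", (unsigned long)hr);
        return 0;
    }
    [/code]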
     
  12. SMIFFYDUDE

    SMIFFYDUDE Supermodders on my D

    Joined:
    22 Apr 2009
    Posts:
    2,898
    Likes Received:
    104
    I hate Nvidia drivers. I have to go all the way back to 314.22 for a driver that doesn't try to murder my GTX 560 Tis, but doing that means I can't use SLI in some games.
     
  13. AlienwareAndy

    AlienwareAndy What's a Dremel?

    Joined:
    7 Dec 2009
    Posts:
    3,420
    Likes Received:
    70
    They had effectively done it sooner, just not officially. When BF3 launched I was running a triple-screen quad-SLI PC (two single-PCB GTX 295s). BF3 did a lot of this...

    [images: two BF3 screenshots showing the broken shadow rendering]

    I mean the shadows, of course. So I waited for a fix. And waited, and waited. Nvidia just said that they pretty much couldn't be assed with older cards.

    It wouldn't have been nice if I'd paid over five hundred notes per card, given this wasn't even two years into their lifespan. Thankfully each card cost me peanuts, but I was still rather cross about it.
     
  14. LordPyrinc

    LordPyrinc Legomaniac

    Joined:
    7 Mar 2008
    Posts:
    599
    Likes Received:
    6
    I've got a perfectly functional GTX 275 lying about. It's got the oddball 768MB of RAM. The last time I hooked it up was when I was first playing around with SLI with my 550s, using the 275 as a dedicated PhysX card. I took it out when I dropped the 660s in.
     
  15. schmidtbag

    schmidtbag What's a Dremel?

    Joined:
    30 Jul 2010
    Posts:
    1,082
    Likes Received:
    10
    If you have another PCIe slot you can still use it for PhysX; you don't need matching GPUs to do this.
     
  16. Cthippo

    Cthippo Can't mod my way out of a paper bag

    Joined:
    7 Aug 2005
    Posts:
    6,785
    Likes Received:
    103
    Still running a GT 320 and a 7800 GT here.

    Of course, the only game I play is World of Tanks, and while I could use an upgrade, it's not a priority.
     
  17. SimoomiZ

    SimoomiZ What's a Dremel?

    Joined:
    2 Feb 2008
    Posts:
    65
    Likes Received:
    2
    The joys of the proprietary DX + driver model.

    Having a locked-down proprietary model allows a company to artificially degrade performance in the latest drivers... or end support altogether. Given how similar the fully unified-shader GPU microarchitectures are, there's no real reason for G80 cards to become totally obsolete.
     
  18. Guinevere

    Guinevere Mega Mom

    Joined:
    8 May 2010
    Posts:
    2,484
    Likes Received:
    176
    Maybe, but many a benchmark says you'll get the best performance by letting PhysX (when you're actually playing a game that has some) run on the main card(s).

    /JustSayin
     
  19. Pookie

    Pookie Illegitimi non carborundum

    Joined:
    4 May 2010
    Posts:
    3,557
    Likes Received:
    146
    I thought they would have done this ages ago! AMD stopped updating its DX10 cards at least 18 months back.
     
  20. AlienwareAndy

    AlienwareAndy What's a Dremel?

    Joined:
    7 Dec 2009
    Posts:
    3,420
    Likes Received:
    70
    Guys, the PhysX PPU seems to react to clock speed. So putting a GTX 275 in as a PhysX card when you're running an 1100MHz Kepler is a waste of time; it'll simply slow you down.

    It seems Nvidia have piped the PPU onto the die itself, making it better with a better core (if that makes sense). I tried this theory out a couple of years back, and you were literally just adding a spare furnace to your rig for no gains whatsoever.

    If you run SLI, then setting PhysX to the second GPU did seem to help matters, rather than letting the master GPU do all the work.
     
