
News AMD will be first to DirectX 11

Discussion in 'Article Discussion' started by CardJoe, 29 May 2009.

  1. CardJoe

    CardJoe Freelance Journalist

    Joined:
    3 Apr 2007
    Posts:
    11,346
    Likes Received:
    316
  2. l1nk45

    l1nk45 What's a Dremel?

    Joined:
    12 May 2009
    Posts:
    9
    Likes Received:
    0
    Woot, this is gonna be fun, seeing them battle again. Looking forward to getting the best graphics card of the generation =]
     
  3. EvilRusk

    EvilRusk What's a Dremel?

    Joined:
    23 Jan 2006
    Posts:
    110
    Likes Received:
    2
    But surely DirectX 11.1 will be released one week later and the Radeons will lose out as they don't support it! :p
     
  4. mjm25

    mjm25 What's a Dremel?

    Joined:
    19 Jan 2009
    Posts:
    507
    Likes Received:
    28
    It was nVidia that had all the issues with 10.1, deferred rendering and anti-aliasing...
     
  5. Goty

    Goty Minimodder

    Joined:
    13 Dec 2005
    Posts:
    411
    Likes Received:
    4
    See, the difference in that situation would be that NVIDIA would start paying developers to include the features instead of paying them to NOT include them, just so they look better in comparison again.
     
  6. sear

    sear Guest

    Really? Then how come there are games that don't support anti-aliasing at all on the AMD side, e.g. Dawn of War II? I don't want to throw mud, but a lot of it depends on what developers choose to support, and most haven't even bothered with DirectX 10.1. The only game I can think of is Assassin's Creed, and in that case the support was actually removed in a later patch because of the glitches it introduced.

    We have made major strides in getting equal image quality and performance between different cards due to the adoption of APIs like DirectX and OpenGL, and in fact in most games you will be hard-pressed to tell any differences between cards these days outside of raw framerate numbers, the differences between which are often so small as to be unnoticeable. It really comes down to support, drivers and additional software. The advantages of going one way or the other are pretty slim these days outside of what driver control panel you want and whether or not you care about PhysX on your GPU (according to a recent Anandtech poll, most don't).
     
  7. Goty

    Goty Minimodder

    Joined:
    13 Dec 2005
    Posts:
    411
    Likes Received:
    4
    Glitches it introduced on NVIDIA hardware, you mean.
     
  8. knutjb

    knutjb What's a Dremel?

    Joined:
    9 Mar 2009
    Posts:
    62
    Likes Received:
    0
    It appears to have been a good decision by AMD to experiment with the 40nm process so they can work the bugs out while readying the next-gen chip. The lead has bounced back and forth between AMD/ATI and Nvidia over the years, and just because Nvidia has the very high end today doesn't mean it will in October; that remains to be seen. They might still have the top card in the new bunch, but AMD has been moving into the mid-to-low range where most people buy. Having the high-end flagship certainly seems to have helped Nvidia, so will AMD follow suit?

    I buy from whoever has the best bang for the buck at the time, not the fanboy thing. Though I do want to see AMD put out very competitive cards, because that keeps prices competitive and that's good for all of us.

    Can someone show me why Folding@home is such a big deal for a graphics card? Does it improve my game playing in any way? So far I can't see its importance to me or the reason for focusing on it; it looks like a gee-whiz marketing gimmick.
     
  9. smithyandco

    smithyandco Can't afford a Dremel

    Joined:
    29 May 2009
    Posts:
    6
    Likes Received:
    0
    I'm still on DX 9.0c!
    It's stopped my plans for an upgrade for now... then again, as soon as I've bought a DX 11 card MS will bring out DX 12! :(
     
  10. Evildead666

    Evildead666 What's a Dremel?

    Joined:
    27 May 2004
    Posts:
    340
    Likes Received:
    4
    A lot of the titles that accept AA on Nv but not on ATI are TWIMTBP games, e.g. Dawn of War II...
    It's in NV's interest to say "look, AA doesn't work on ATI", mainly because Nv hardware uses deferred rendering, which makes it impossible to compare direct screenshots between the manufacturers...

    The deferred rendering came in around the time Nv was getting the sh*t kicked out of it for image quality issues...
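    To put the deferred rendering / AA point in concrete terms, here is a toy sketch. It is not from the article or from anyone in this thread, and every name and number in it is invented for illustration; it just shows why a classic deferred engine, which has only one G-buffer sample per pixel when the lighting pass runs, can't get MSAA-style edge quality the way a forward renderer can, where each covered sub-sample is shaded and the results averaged.

    # Toy, self-contained illustration (nothing here comes from the thread;
    # the numbers, names and the 4-sample setup are invented for the example).
    # It shows why lighting one averaged G-buffer value at a geometry edge gives a
    # different (aliased) result than shading each MSAA sub-sample and averaging.

    def lambert(normal, light_dir):
        # Simple Lambertian term: max(N dot L, 0).
        return max(sum(n * l for n, l in zip(normal, light_dir)), 0.0)

    light = (0.0, 0.0, 1.0)  # light shining straight down the view axis

    # An edge pixel whose four sub-samples straddle two surfaces:
    # two samples face the light, two face away from it.
    subsample_normals = [
        (0.0, 0.0, 1.0),
        (0.0, 0.0, 1.0),
        (0.0, 0.0, -1.0),
        (0.0, 0.0, -1.0),
    ]

    # Forward-style MSAA: shade every sub-sample, then average the results.
    msaa = sum(lambert(n, light) for n in subsample_normals) / 4   # 0.5 -> blended edge

    # Classic deferred shading: the per-pixel attributes are collapsed to a
    # single value *before* the lighting pass ever sees them.
    avg_normal = tuple(sum(c) / 4 for c in zip(*subsample_normals))
    deferred = lambert(avg_normal, light)                          # 0.0 -> hard, aliased edge

    print(msaa, deferred)

    If I remember right, one of the things DX10.1 added was letting shaders read the individual samples out of multisampled buffers, which is roughly why it mattered so much for deferred engines and why the AA situation tends to differ per game rather than per vendor.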
     
  11. thehippoz

    thehippoz What's a Dremel?

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    And doesn't AMD hold the new crown for single GPU?
     
  12. HourBeforeDawn

    HourBeforeDawn a.k.a KazeModz

    Joined:
    26 Oct 2006
    Posts:
    2,637
    Likes Received:
    6
    ATI is always first to adopt a new standard, just like AMD is typically first to try something new on the CPU side; nVidia and Intel just sit back and see how it does before copying it.

    Either way awesome.
     
  13. Star*Dagger

    Star*Dagger What's a Dremel?

    Joined:
    30 Nov 2007
    Posts:
    882
    Likes Received:
    11
    Yeah, it might just be best to wait until HX7 is out; that's the DirectX for holodecks!

    Never wait in computing; you just shorten the amount of time your system is on top.

    I went from the 8800GTX to the Radeon HD 4870x2 and I will get one of these dual monsters in October!

    S*D
     
  14. Skiddywinks

    Skiddywinks Minimodder

    Joined:
    10 Aug 2008
    Posts:
    932
    Likes Received:
    8
    It's usually best to just set a rough upgrade schedule. I base it around always going for the final version of a technology when I can (usually a smaller manufacturing process, lower voltages and temperatures, and higher clocks).
     
  15. outlawaol

    outlawaol Geeked since 1982

    Joined:
    18 Jul 2007
    Posts:
    1,935
    Likes Received:
    65
    This is why I haven't upgraded from my 8800GTX yet. DX11 cheesecake! Crappy high prices! Me have no money!

    :)
     
  16. cyrilthefish

    cyrilthefish What's a Dremel?

    Joined:
    15 Apr 2004
    Posts:
    1,363
    Likes Received:
    99
    Should be interesting when it comes out.

    http://www.theinquirer.net/inquirer/news/1137331/a-look-nvidia-gt300-architecture

    (I know it's the Inquirer, so it's potentially iffy info :lol:)
    Essence of the story:

    - Nvidia whined at Microsoft until they got some pretty big features removed from the DX10 spec so their cards would support it.
    - ATI cards are designed to support the original, higher spec.

    Which leads to now:
    - ATI cards are roughly DX11-compliant in hardware (DX11 is mostly what DX10 was before Nvidia threw a tantrum).
    - Nvidia cards will need to dedicate a lot of general-purpose shaders to DX11 features as they don't have the hardware support, or completely redesign the chip in an incredibly short timeframe.

    All in all, I'm quite worried about a monopoly forming in graphics if Nvidia falls that far behind, not to mention that if Intel's Larrabee turns out decent, it could give Intel a monopoly of such magnitude that it'll be terrifying to watch o_O
     
  17. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    It's written by Charlie, who lives in his own little world where ATI can do no wrong and Nvidia gets everything wrong. My favourite article of his was when he claimed R600 would be better than G80 and would 'kill it' (or words to that effect) in every benchmark. We all know how that turned out.

    He takes a lot for granted in that article, and I'd take it with a huge bucket of salt; he basically says that there's only a doomsday scenario and that Nvidia is dead in the water, and I've yet to see any evidence of that. That said, I am a little worried about their DX11 part and how it's going to turn out, because it's quite a risky chip.
     
  18. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,577
    Likes Received:
    196
    The only worry I have for Nvidia is the lack of a multi-purpose shader for the tessellator...

    At any rate, Larrabee, even if it's successful, won't really catch on until it's affordable. Right now it looks really expensive (production-wise), and if it were to continue like that, I'm guessing it would end up like the Caustic GPU.
     
  19. docodine

    docodine killed a guy once

    Joined:
    10 Feb 2007
    Posts:
    5,084
    Likes Received:
    160
    Haha, Google auto-suggests "Charlie Demerjian hates nVidia" when you search for his name.

    I'm just hopeful that I can safely skip a couple of generations of graphics hardware; the GTX 4xx and HD 68xx will be the cards for me.
     