
Hardware Farewell to DirectX?

Discussion in 'Article Discussion' started by arcticstoat, 16 Mar 2011.

  1. eddtox

    eddtox Homo Interneticus

    Joined:
    7 Jan 2006
    Posts:
    1,296
    Likes Received:
    15
    I'm with the naysayers on this.

    Programming direct-to-metal is only feasible when there are limited hardware configurations to consider (i.e. consoles, iPhones, iPods, etc.) and/or a large enough market for each to make it profitable. When you have to account for hundreds of different possible configurations, the development time, and therefore the cost, becomes prohibitive. The same applies to patches, and if you happen to upgrade to a newer card, you're up the creek without a paddle.

    BAD idea from someone who ought to know better.
     
  2. AstralWanderer

    AstralWanderer New Member

    Joined:
    17 Apr 2009
    Posts:
    749
    Likes Received:
    34
    Doesn't that describe HLSL (High Level Shader Language) and Nvidia's CG?
     
  3. vampalan

    vampalan New Member

    Joined:
    18 Jun 2003
    Posts:
    225
    Likes Received:
    1
    It'd be a good idea if all the hardware were the same, or drawn from a limited pool (a la Apple Macs), but it's not like that in reality.

    The quickest way to alienate casual players is to demand gaming-grade hardware. I don't think many casual gamers would be willing to upgrade to gaming specs, so the console's "just works" experience wins every time. I've seen a few gamers go from PC to console for exactly that reason.
    There's also the cost of the hardware to consider: someone whose fast PC, bought not too long ago, can no longer play the latest games needs to buy a bunch of upgrades at great cost, compared with the PS3 or whatever that they probably already own - you know which one wins. And a lot of people now have laptops rather than desktops as their main computer.

    So the quickest way to kill the PC gaming market is to have a bunch of requirements that most people don't meet.
    It's cool to have a game that looks ten times better than it does on a console, but I don't see developers actually selling many copies of that game on the PC. In the end you'll only be able to buy AAA games on consoles - you can already see this trend today, with staggered releases or no-shows.
     
  4. Ayrto

    Ayrto New Member

    Joined:
    20 Jun 2007
    Posts:
    255
    Likes Received:
    3
    Well, top names are already seriously considering this. Tim Sweeney of Epic Games (Unreal Engine) said:


    "There are significant advantages in doing it yourself, avoiding all the graphics API calling and overhead. With a direct approach, we can use techniques that require wider frame buffer, things that DirectX just doesn't support."

    and :

    "... realistically, I think that DirectX 10 is the last DirectX graphics API that is truly relevant to developers. In the future, developers will tend to write their own renderers that will use both the CPU and the GPU - using graphics processor programming language rather than DirectX. I think we're going to get there pretty quickly"

    From Part 4: "The result will be a reduction of our dependence on bloated middleware that slows things down, shielding the real functionality of the devices."

    from here: http://www.tgdaily.com/business-and...directx-10-is-the-last-relevant-graphics-api
     
    Last edited: 16 Mar 2011
  5. Er-El

    Er-El Member

    Joined:
    31 May 2008
    Posts:
    487
    Likes Received:
    10
    With Xbox 360, developers are also forced to use the API for future compatibility.
     
  6. Grimloon

    Grimloon New Member

    Joined:
    4 Sep 2008
    Posts:
    885
    Likes Received:
    30
    I'm no developer, but didn't DirectX up to and including 9.0c permit a certain amount of low-level coding to the hardware via the HAL (hardware abstraction layer), in effect giving a slimmed-down API to those who wanted one? Calls could be sent to the hardware via the HAL rather than having to go through DirectX - this is how EAX worked, for one thing. I also seem to recall additional graphical options in some game engines dependent on what card you had.

    If I'm remembering this right, then surely it isn't a case of getting rid of DirectX, but of providing an alternative to it and coding to the hardware via a slightly more direct route. If so, then it's DirectX 10 upwards, as well as Windows Vista and 7, that are the problem, as they did away with the HAL - a more fundamental issue at OS level, rather than just the API used for graphics and sound.

    Of course, I could have this completely bassackwards, as I've had a few whacks to the head since everything changed.
     
  7. Toploaded

    Toploaded New Member

    Joined:
    28 Mar 2010
    Posts:
    371
    Likes Received:
    6
    Kinda makes you wonder if Apple will jump on this to promote more Mac gaming in the future. With their hardware being more uniform than that of the PC market, it's actually feasible it could be done (especially if they start standardising it more tightly from now on).

    Maybe pressure like this from AMD, and comments like the one above from Epic, will push MS into improving the efficiency of DX.
     
  8. Skiddywinks

    Skiddywinks Member

    Joined:
    10 Aug 2008
    Posts:
    930
    Likes Received:
    8
    What a load of ********. Seriously.

    I definitely see the plus side of getting rid of all APIs and coding direct to metal, but does this AMD guy even live in this world? Does he even see the current state of gaming? Developers can't even be arsed using the API to achieve greater things in PC games, because the consoles are just so much easier. Now he expects them to code for every possible graphics card out there? Is he nuts?

    I'm all for anything that gets me more value for money out of my hardware, but I really don't see this happening any time soon, at least not for major game releases where a dev is expected to have the game working on every possible configuration.
     
  9. vampalan

    vampalan New Member

    Joined:
    18 Jun 2003
    Posts:
    225
    Likes Received:
    1
    ...so these games would be written for specific chipsets or something like that, and upgrading breaks the games? Erm... fail.
     
  10. Ergath

    Ergath Giant Zombie Pigeon Photographer

    Joined:
    6 May 2009
    Posts:
    145
    Likes Received:
    2
    Great article - most enlightening. Raises a number of interesting issues, none of which I've time to type about now :(
     
  11. Th3Maverick

    Th3Maverick New Member

    Joined:
    23 Aug 2006
    Posts:
    165
    Likes Received:
    0
    Fixed.

    Only way I can see something like this working is some standardised instruction set for GPUs. I totally see that happening.
     
  12. maverik-sg1

    maverik-sg1 Member

    Joined:
    18 Aug 2010
    Posts:
    368
    Likes Received:
    1
    It's not so much that the API is bad; it's more that the base set-up for most games has to target the lowest common denominator: the console it's being ported to/from.

    So if the priority is to get the console version(s) launched first, with the PC to follow, the best the PC can hope for is for the devs to take the time to add (and optimise) the extra code required to suit the capabilities of current PC GPUs. At best we see slightly better eye-candy, rather than the PC's capabilities being used to provide a much more immersive environment. It's all about money and return on investment: console sales are far greater than those of the PC, and doing more for a minority market (compared to the consoles) costs more, which hits the profitability of the product.

    Watch out during 2014 (when the PS4 and next-gen Xbox come out), because that's when I predict we'll see the biggest jump in PC eye candy (porting between console and PC at identical hardware levels).

    Sadly the consoles will be offering similar levels of performance too - then watch over the next 4-8 years as we see a repeat of where we are now, with PC games dumbed down to accommodate the aging console technology.

    Direct-to-metal for the PC won't happen, for all the reasons that have been stated above. I'm guessing direct-to-metal wasn't how console development started off either; it's how it is now because the consoles no longer have the power to carry an API layer and keep games looking/running better than they did four years ago.
     
  13. Zurechial

    Zurechial Elitist

    Joined:
    21 Mar 2007
    Posts:
    2,045
    Likes Received:
    99
    This makes me think of the days when PC games actually offered different renderers for different APIs and cards depending on what was out at the time.

    Older iterations of the Unreal engine allowed/forced the user to choose a renderer, from Software, Direct3D, Glide, S3, etc. and the differences in performance and graphical features were quite significant between those choices.
    Some renderers supported reflections and transparent particles, some only supported flat texturing and stipple-alpha transparency, for instance.
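    That "pick your renderer at startup" pattern can be sketched roughly like this - a minimal Python illustration, where the class names and feature strings are invented for the example and are not Unreal's actual API:

    ```python
    # A rough sketch of the old "choose your renderer at startup" pattern.
    # Class and method names are invented for illustration; this is not
    # Unreal's actual API.
    class Renderer:
        def describe(self):
            raise NotImplementedError

    class SoftwareRenderer(Renderer):
        # The fallback path: worked everywhere, fewest features.
        def describe(self):
            return "software: flat texturing, stipple-alpha transparency"

    class GlideRenderer(Renderer):
        # A vendor-specific path with the extra eye-candy.
        def describe(self):
            return "glide: reflections, transparent particles"

    _RENDERERS = {"software": SoftwareRenderer, "glide": GlideRenderer}

    def make_renderer(name):
        # In the old launchers this string came from a menu or .ini file;
        # each backend exposed a different feature set, as described above.
        return _RENDERERS[name]()

    print(make_renderer("glide").describe())
    ```

    The point of the pattern is that the game code talks to one interface while each backend decides which effects it can actually deliver - which is exactly why the same game could look so different from card to card.
    
    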

    Going further back still I'm reminded of games being released in specific hardware versions; such as PowerVR and 3DFX releases of MechWarrior 2.
    It was a strange situation and a fragmented market, but PC gamers tended to be nerdier and more informed in those days. Most modern PC gamers aren't as far removed from their gormless console counterparts as they'd like to think - most wouldn't know their sound card's IRQ from its DMA, as others pointed out above.

    I would love to see a return to those days if it brought a benefit in performance and innovation, but then I'm a nostalgic elitist when it comes to PC gaming, and I don't think the modern market would handle it very well in practice, when people are so quick to complain about the slightest instability in PC gaming compared to the supposedly greener pastures of console gaming.
     
  14. law99

    law99 Custom User Title

    Joined:
    24 Sep 2009
    Posts:
    2,389
    Likes Received:
    63
    I think the point is that since unified architectures have been around, it's becoming feasible, not just possible.
     
  15. delsinboy

    delsinboy intermediate selfbuilder

    Joined:
    25 Feb 2007
    Posts:
    89
    Likes Received:
    0
    Thanks.

    I've been wondering for a while now why PC games only look 10-50% better than their console counterparts, when they should look like they're in a completely different league.

    This issue is killing PC gaming.
     
  16. walle

    walle Well-Known Member

    Joined:
    5 Jul 2006
    Posts:
    1,750
    Likes Received:
    57
    From what I can gather, game publishers and developers wish to keep the brain-dead console crowd and phase computers out of the equation.

    I'd say it's greed more than anything else; they want to make a quick buck and a neat net profit for the minimum amount of work. That's it.

    The PC platform is standing in their way, so they seek to remove it - but for that to work they need an excuse, and this is apparently their first run of many.


    I'm sure.
     
  17. Omnituens

    Omnituens New Member

    Joined:
    5 Apr 2006
    Posts:
    954
    Likes Received:
    11
    We need the people at AMD and Nvidia to get together and just produce one unified line of graphics cards.

    ALL HAIL RADEON 7990 GTX Ti!
     
  18. Ayrto

    Ayrto New Member

    Joined:
    20 Jun 2007
    Posts:
    255
    Likes Received:
    3
    MS seems to have everything in a headlock. Makes you wonder what the PCGA has been doing about this, really.

    On consoles, you can draw maybe 10,000 or 20,000 chunks of geometry in a frame, and you can do that at 30-60fps. On a PC, you can't typically draw more than 2-3,000 without getting into trouble with performance, and that's quite surprising - the PC can actually show you only a tenth of the performance if you need a separate batch for each draw call.

    That's terrible

    There are the multi-threaded display lists, which come up in DirectX 11 – that helps, but unsurprisingly it only gives you a factor of two at the very best, from what we've seen.

    This just makes the case for getting rid of it. So basically what they're saying is that buying a top-end video card is like buying a Ferrari and never taking it anywhere near a motorway or track.
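    The arithmetic behind those quoted figures can be sketched with a toy model. All numbers below are illustrative assumptions, not measurements of any real API - the point is only that a fixed CPU cost per draw call caps how many draws fit in a frame:

    ```python
    # Toy model of per-draw-call CPU overhead. All numbers here are
    # illustrative assumptions, not measurements of any real API.
    FRAME_BUDGET_US = 33_333  # one frame at ~30fps, in microseconds

    def max_draws_per_frame(per_call_overhead_us):
        # Upper bound on draw calls if the CPU spent the entire frame
        # doing nothing but issuing them.
        return FRAME_BUDGET_US // per_call_overhead_us

    # Hypothetical per-call costs: a thick PC API path vs a near-metal path.
    print(max_draws_per_frame(15))  # 2222 - the "2-3,000" PC ballpark
    print(max_draws_per_frame(1))   # 33333 - the console order of magnitude
    ```

    Under these made-up costs, shaving per-call overhead by an order of magnitude raises the draw-call ceiling by the same order - which is the gap between the PC and console figures quoted above, and why DX11's multi-threaded display lists, spreading that CPU cost over more cores, only buy a small constant factor rather than closing it.
    
    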
     
    Last edited: 16 Mar 2011
  19. sWW

    sWW Member

    Joined:
    3 Apr 2010
    Posts:
    173
    Likes Received:
    3
    What is the reason for consoles being able to draw an order of magnitude more "chunks of geometry" per frame? I thought the Xbox used DirectX, so surely it has the same limitations as a PC, just with an aging graphics chip?
     
  20. Ayrto

    Ayrto New Member

    Joined:
    20 Jun 2007
    Posts:
    255
    Likes Received:
    3
    When a console is first launched, you'll want an API so that you can develop good-looking and stable games quickly, but it makes sense to go direct-to-metal towards the end of the console's life, when you're looking to squeeze out as much performance as possible.


    This pretty much explains the how.
     