
Hardware Farewell to DirectX?

Discussion in 'Article Discussion' started by arcticstoat, 16 Mar 2011.

  1. Krayzie_B.o.n.e.

    Krayzie_B.o.n.e. New Member

    Joined:
    2 Aug 2009
    Posts:
    427
    Likes Received:
    6
    I find this article to be CRAP! Just like with the PS3, the PC has two types of developers.

    1. Developers who do whatever they can to push the hardware and bring innovation and quality to their games. Crytek, for example, pushed for a higher level of quality and graphics with Crysis. Metro 2033 stepped it up and pushed the PC harder than Crysis, and now DICE has blown everyone away with BF3. All of them use DirectX, and all of them show how the consoles are 3 generations behind: even in 2011, no console game comes close to Crysis. On the console side we have PS3 exclusives that blow every other console title out of the water and can only be compared to PC titles because of their quality.

    2. Developers who just make a game the easiest way possible, then whine, complain, and make excuses about everything else holding them back. The PS3 gets horrible-looking multi-plats while the PC gets direct console ports that barely look better than the console version. Why? Because the developer chose the cheap, low road instead of QUALITY and INNOVATION.

    I'm sick and tired of third-string developers ruining the industry by being LAZY and complaining instead of putting in the effort to make a quality product. I always fall back on the car comparison: some developers make your cheap everyday car, while other developers make Ferraris and Bentleys. Never blame the tools; blame the person using them.
     
  2. leveller

    leveller Yeti Sports 2 - 2011 Champion!

    Joined:
    1 Dec 2009
    Posts:
    1,107
    Likes Received:
    24
    Riiiiight. So if it's earmarked, then it isn't available to be spent on what Huddy suggests. Glad we agree on that. So, like I said, Apple is in a perfect position to invest heavily in pushing forward the open standards ...
     
  3. Snips

    Snips I can do dat, giz a job

    Joined:
    14 Sep 2010
    Posts:
    1,940
    Likes Received:
    66
    Which they won't, since they don't need to.

    So again, what was the point of your post?
     
  4. Deders

    Deders New Member

    Joined:
    14 Nov 2010
    Posts:
    4,048
    Likes Received:
    106
    Even if they did, do you think they would make it compatible with anything else?
     
  5. ssj12

    ssj12 Member

    Joined:
    12 Sep 2007
    Posts:
    686
    Likes Received:
    1
    I don't know; if OpenGL 4.2 gains some of the requested features, it could easily catch up to DX. And why play second fiddle to DX when they should be focusing on making their own unique API?
     
  6. leveller

    leveller Yeti Sports 2 - 2011 Champion!

    Joined:
    1 Dec 2009
    Posts:
    1,107
    Likes Received:
    24
    They will never own OpenGL. I don't think that would be possible.

    Thinking about it more, it makes perfect sense for Apple to throw cash at OpenGL or at Huddy's idea. Imagine if they could adopt the tagline "Best played on a Mac".
     
  7. iwod

    iwod New Member

    Joined:
    22 Jul 2007
    Posts:
    86
    Likes Received:
    0
    I think we're missing a few things. Even if we went back to direct-to-metal, we wouldn't need to select the graphics hardware by hand. Good God, it's 2011; auto-selection is a given.
    The situation used to be very different: there were close to a dozen graphics companies, each doing things differently, with largely different feature sets. Now? We have Intel's weak integrated graphics, AMD/ATI and Nvidia, which between them account for 99.9% of all PCs plus the consoles. As John Carmack has said, optimising for 2-3 GPU vendors isn't difficult at all, and he could probably cope with 2 more. Even if you add mobile graphics, the fastest-growing segment, there are PowerVR, Qualcomm and ARM's Mali. That's 6 companies covering virtually every graphics chip on the planet.
    Nvidia has stated that it now has twice as many software engineers working on drivers as hardware engineers designing the chips, and Forceware reportedly has more lines of code than Windows 2000. Driver development is costly, and at some point it will make economic sense to open up a direct-to-metal path rather than have Nvidia do all that work by itself.
    Drivers and DirectX are both middleware: DirectX is supposed to make developers' lives easier and get their work done faster, while the driver translates those DirectX calls into the respective GPU's native commands.
    However, games have grown so complex that most developers no longer write their own renderer. They use yet another layer of middleware, such as Unreal Engine, Gamebryo, or engines developed in-house for use across many titles. A DirectX-like API doesn't matter much at that point, because these engines are already written to work with devices like the PS3, which doesn't use DirectX at all. We are literally moving the middleman from drivers and APIs to middleware game engines. If ATI and Nvidia provided low-level library support to Unreal's developers at Epic, you would probably see some very large performance improvements in games.
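    To make the layering concrete, here is a toy sketch of the stack described above: a draw call going through a generic API plus driver translation, versus an engine calling a hypothetical vendor low-level library directly. Every class and method name here is invented for illustration; no real API is being modelled.

```python
# Toy model of the graphics stack: game -> engine -> API -> driver -> GPU,
# versus game -> engine -> vendor low-level library -> GPU.
# All names are hypothetical.

class Driver:
    """Translates generic API commands into GPU-specific commands."""
    def __init__(self, vendor):
        self.vendor = vendor

    def translate(self, api_command):
        return f"{self.vendor}-microcode({api_command})"

class DirectXLikeAPI:
    """High-level API path: one generic call, the driver translates it."""
    def __init__(self, driver):
        self.driver = driver

    def draw(self, mesh):
        return self.driver.translate(f"draw({mesh})")

class VendorLowLevelLib:
    """Hypothetical direct-to-metal path: the engine emits vendor-specific
    commands itself, skipping the driver's translation layer."""
    def __init__(self, vendor):
        self.vendor = vendor

    def draw(self, mesh):
        return f"{self.vendor}-microcode(draw({mesh}))"

class Engine:
    """The middleware game engine: hides which backend is in use."""
    def __init__(self, backend):
        self.backend = backend

    def render(self, mesh):
        return self.backend.draw(mesh)

portable = Engine(DirectXLikeAPI(Driver("nv")))
direct = Engine(VendorLowLevelLib("nv"))
# Both paths produce the same GPU commands; the difference is who writes
# the translation code: the driver team, or the engine team per vendor.
```

    The point of the sketch is that once the engine is the middleman, the translation work can live either in the driver or in the engine's vendor-specific backend; the game code above the engine doesn't change.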
     
  8. Dr_s99

    Dr_s99 New Member

    Joined:
    11 May 2009
    Posts:
    14
    Likes Received:
    0
    Hmmm, from the article it doesn't sound like he's really talking about killing DX... since the Xbox 360 uses DirectX :p.

    A major difference between a console and a PC is the number of processes running in the background, and how the CPU is shared between the game and other programs (assuming, of course, the graphics hardware is the same on both systems).

    Consoles tend to run fewer processes than a PC, and usually they don't need to share time between them (or not as much as PCs do)!

    To be honest, even if MS used OpenGL, that issue wouldn't go away :p
     
  9. Bitecore

    Bitecore New Member

    Joined:
    19 Mar 2011
    Posts:
    1
    Likes Received:
    0
    Major developments have strict financial plans that keep them from moving fully away from API-based development. With direct-to-metal development the costs would be tremendously high, due to the large amount of code, optimisation and, above all, compatibility work.

    The casual market matters as well. A lot of casual PC gamers won't lift a finger to overcome issues that may occur with their recently bought PC titles. Instead they'll just throw them away and tell their friends not to buy them at all, saying "it's glitchy and s*cks". They want a perfect final product with no hassle: just buy and play. And I must say, it's not like those good old days, when there were plenty of games consoles couldn't compete with, and plenty of enthusiastic PC gamers...
    If I were a developer I'd bet on consoles... yeah, money is the key to future development :( .

    However, there are some PC-only games that prove wonders are possible even with this naughty old pal, the DirectX API. Take Half-Life 2: a wonderful game that was very well optimised for that generation of hardware, looked wonderful, and had advanced Havok physics. It was later ported to the original Xbox with a lot of issues, and Gabe has since said that was a mistake. Nevertheless it was ported again to the current generation of consoles as part of The Orange Box, three years after the original Half-Life 2 arrived.

    Crysis wasn't ported at all. It uses CryEngine 2, which is more capable than CryEngine 3 (used for Crysis 2) precisely because it wasn't designed for consoles; there's a well-known comparison somewhere on YouTube if you want proof. The first Far Cry is a typical pure PC game as well: good-looking, and it pushed game graphics forward a lot.
     
  10. leslie

    leslie Just me!

    Joined:
    19 May 2009
    Posts:
    412
    Likes Received:
    11

    And if they decide NOT to code for your card?
    Or how about one company paying them not to, or to heavily optimise for theirs only? Gee, I can't imagine that EVER happening... Sorry, but companies do this even now, and even though it's minimal, it still burns me up.

    As if that wasn't enough, I can just imagine how many fewer games we would get if each game needed that much more code.


    Sorry, but I remember how gaming used to be, and even when auto-detect worked, you still needed the compatibility written into the games. Even today, WITH good auto-detection, how many games crash on various hardware configurations? How many companies have claimed their latest bug was hardware-specific? Now add developers having to write what amounts to current drivers for each card, and I can only imagine the hell that would create these days.
     
  11. Ayrto

    Ayrto New Member

    Joined:
    20 Jun 2007
    Posts:
    255
    Likes Received:
    3
    Here's what repi from DICE posted on B3D:


     
    Last edited: 19 Mar 2011
  12. ET3D

    ET3D New Member

    Joined:
    19 Mar 2011
    Posts:
    5
    Likes Received:
    0
    One thing I'd like to tell Richard Huddy: don't give ideas to competitors who can outdo you with one arm tied behind their back. AMD/ATI has never been able to get people to use any proprietary GPU technology it created; NVIDIA has done a great job of exactly that. Someone at NVIDIA could be reading this, and in a couple of years NVIDIA could have a 3D SDK that outperforms Microsoft's, with better tools as well as integration with other NVIDIA proprietary technologies (PhysX, CUDA). People would buy low-end NVIDIA GPUs even alongside AMD Fusion CPUs, because games would perform better on them even if AMD's integrated GPUs were technically better.
     
  13. leveller

    leveller Yeti Sports 2 - 2011 Champion!

    Joined:
    1 Dec 2009
    Posts:
    1,107
    Likes Received:
    24
    Ayrto, thank you for updating the thread with that.

    The odds of this coming to anything ... slim?

    But if it did come to something ...
     
  14. frontline

    frontline Punish Your Machine

    Joined:
    24 Jun 2007
    Posts:
    825
    Likes Received:
    12
    Nice find, seems like Huddy's comments struck a chord with some game devs at least.
     
  15. Snips

    Snips I can do dat, giz a job

    Joined:
    14 Sep 2010
    Posts:
    1,940
    Likes Received:
    66
    Oh Please stop...my sides can't take it......HAHAHAHAHAHAHAHA!
     
  16. JOKe

    JOKe New Member

    Joined:
    20 Mar 2011
    Posts:
    1
    Likes Received:
    0
    The thing is that these days EVERYONE uses a high-level API like DirectX. For example, I'm not sure what they used, but look at Dragon Age 2: the game looks nice and runs AWESOMELY on a PC, while on the Xbox the performance and graphics are CRAP. On PC it's another thing entirely. So even with a high-level API you can still make a far better-looking, better-performing game for the PC. Of course, if you skip the API, the performance could be 10 times better, but the cost of developing such a game would be... 100 times bigger.
     
  17. leveller

    leveller Yeti Sports 2 - 2011 Champion!

    Joined:
    1 Dec 2009
    Posts:
    1,107
    Likes Received:
    24
    Glad to see everything is not wasted on you :thumb:
     
  18. valmadasss

    valmadasss New Member

    Joined:
    21 Sep 2009
    Posts:
    6
    Likes Received:
    0
    Consoles spend their power on making games' environments look as good as they can; PCs spend their extra power on other things...

    Take, for example, a GTX 560 Ti, which is at least 6 times stronger than the PS3's RSX, running COD: Black Ops:

    PS3 vs GTX 560 Ti
    Resolution: 600p vs 1080p
    Graphics quality: medium vs high
    AA: 2x vs 8x
    AF: 2x (if any) vs 16x
    FPS: ~40 vs ~100

    Just think how much quality the PS3 version of Black Ops would have to drop to reach around 100fps at 1080p with 8xAA and 16xAF... and how much more performance the PC would have at 600p with 2xAA, 2xAF and medium graphics.

    That's why PC games don't look 10 times better....
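    Some rough arithmetic backs this up. Assuming the console build renders at roughly 1024x600 (an assumption for illustration; the real sub-HD resolution varies) and the PC at 1920x1080, resolution and frame rate alone eat most of the raw power gap before AA, AF or quality settings are counted:

```python
# Back-of-envelope arithmetic for the PS3 vs GTX 560 Ti comparison above.
# Resolutions are assumptions for illustration only.
console_pixels = 1024 * 600    # 614,400 pixels per frame
pc_pixels = 1920 * 1080        # 2,073,600 pixels per frame

pixel_ratio = pc_pixels / console_pixels
print(f"PC renders {pixel_ratio:.2f}x the pixels per frame")

# Fold in the ~40 vs ~100 fps figures from the post:
throughput_ratio = (pc_pixels * 100) / (console_pixels * 40)
print(f"...and {throughput_ratio:.2f}x the pixels per second")
# So a large share of the "6x stronger" GPU is already spent on
# resolution and frame rate, before higher AA/AF and quality settings.
```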
     
  19. Sebbo

    Sebbo New Member

    Joined:
    28 May 2006
    Posts:
    200
    Likes Received:
    0
    Getting rid of an API like DirectX so that developers code directly to the hardware is more likely to make games worse, IMO.
    Look better, sure, because developers can tap directly into the power of any GPU they wish. But, as the article touches on, developers would then face huge QA and stability issues. Having to code and optimise paths for each GPU you want your game to support, plus sorting out issues on as many configurations as you can test (which won't come close to the majority of configurations people will actually use to play the game), means far longer development, not just for the initial release but for the patches and updates needed to support hardware that became available while the previous release was in development. Because of all this extra time (and subsequently money), publishers will take fewer risks, which leads us back to dull, samey games and more focus on console titles.

    TL;DR: Sure, games will look great and vastly different, but they're going to end up playing the same (even more than they do now).
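    The multiplication behind that argument can be sketched in a few lines. The counts below are made up purely to illustrate how the test matrix grows once the developer, rather than the driver, owns each hardware path:

```python
# Illustrative (invented) numbers for how a direct-to-metal QA matrix
# grows compared with targeting a single API.
gpu_architectures = 8   # assumed distinct GPU architectures to support
driver_versions = 5     # assumed driver versions still in the wild
os_variants = 3         # assumed OS variants to validate on

# With one API, the developer validates one rendering path; most
# driver/hardware quirks are the vendors' problem:
single_api_paths = 1
single_api_configs = single_api_paths * os_variants

# Direct to metal: every architecture needs its own code path, and the
# developer now owns the driver and OS interactions too:
metal_paths = gpu_architectures
metal_configs = metal_paths * driver_versions * os_variants

print(f"single API: {single_api_configs} configs to validate")
print(f"direct to metal: {metal_configs} configs to validate")
```

    Even with these small invented numbers, the matrix grows 40-fold, and it grows again every time a new architecture or driver version ships.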
     