
Bits Interview: TWIMTBP, DX10 and beyond

Discussion in 'Article Discussion' started by Tim S, 26 Jun 2007.

  1. Tim S

    Tim S OG

  2. Ramble

    Ramble Ginger Nut

    Sounds like a rather exciting time to be a PC gamer.
    What a lucky *******.
     
  3. Hells_Bliss

    Hells_Bliss What's a Dremel?

    [drool] 9 million pixels [/drool]

    Nice article! I guess I'm going to have to buy Vista sooner than I thought.
     
  4. trig

    trig god's little mistake

    So, basically, tying into the dismal performance of the 8800 GTS on the DX10 version of CoH, he's saying that gameplay will be less affected FPS-wise in a game built for DX10 from the ground up?
     
  5. Tim S

    Tim S OG

    I think that's basically what he's saying, yes.

    Based on what I've seen of CoH and some of the other 'early' DX10 games, porting DX9 games to DX10 looks more like a case of ticking checkbox features than actually providing much benefit to end users. The real benefit will come where developers have decided very early on in the development cycle (relatively speaking - these games have all been in development for years) that they're going to jump to DX10 and create a version that takes advantage of a hell of a lot of the DX10 features.

    There are some things that I unfortunately can't discuss at the moment, but the reason outlined above looks like it's why GPG opted not to port SupCom to DX10. This isn't me having a go at the developers that have gone down the porting route - I applaud them for trying to push boundaries.

    I think there is also a bit of driver optimisation in the mix too (how much, I don't know), but Nvidia isn't really going to say that its DX10 drivers aren't performing at maximum potential. AMD might say that its drivers aren't performing as well as they could be, but that's because the hardware is still very new.
     
  6. Redbeaver

    Redbeaver The Other Red Meat

    "there's some life left in PC games"???

    OF COURSE THERE IS!! :D

    I want Crysis so bad I can taste it...
     
  7. Spaceraver

    Spaceraver Ultralurker

    Did you ask about power consumption??
     
  8. [USRF]Obiwan

    [USRF]Obiwan What's a Dremel?

    Loved the build-up of the article towards Crysis. lol, it almost made me cry with joy ;)
     
  9. Tim S

    Tim S OG

    I think Roy was actually worried that I wasn't going to ask about Crysis, and you could definitely tell from the tone of his voice that he's really excited about both Crysis and World in Conflict. I was talking afterwards with the Nvidia PR representative who was also on the call, and he said exactly the same to me.
     
  10. Guest-23315

    Guest-23315 Guest

    He seems like the kind of guy that cares about a game and how well it plays just as much as he cares about selling his hardware.

    Then again, he does basically get free GPUs.
     
  11. trig

    trig god's little mistake

    Well, good. Thanks, Tim... now if I could just sell my PC so I can afford to upgrade to DX10, I'll be Crysis-ready...
     
  12. devdevil85

    devdevil85 What's a Dremel?

    Crysis! Crysis! Crysis!
    Those graphics are seriously sick! I can't wait to get a new rig that can play that - especially WiC...
     
  13. pendragon

    pendragon I pickle they

    Thanks for the article, folks! :thumb: Good read.
     
  14. Woodstock

    Woodstock So Say We All

    Kidney for sale. Offers?
     
  15. samkiller42

    samkiller42 For i AM Cheesecake!!

    Cheers Tim and Roy.
    Crysis demo soon :thumb:
    I think I'm going to have to break my game demo rule, which is not to play the demo of a game that I'm eager to play - mind you, I broke it for C&C 3 :clap:

    Sam
     
  16. EQC

    EQC What's a Dremel?

    I'm going to ramble on a bit below, but here's the short version of my point: I'm wondering how long we'll be waiting for a 3840x2400 gaming-capable monitor to be readily available (for, say, the ~$2,500 price that consumer 30" monitors came out at). I'm guessing it'll be 2011 at least before we see such a monitor, primarily because of the cabling issue. Further explanation below...

    -- -- -- -- --

    Hmm... rather than wonder about the size of the monitor, I'm wondering about the cable needed to support this resolution - I believe single-link DVI officially maxes out at 1600x1200 at 60Hz, and dual-link DVI can only go up to 2560x1600 at 60Hz. I'm guessing there's another technology around the corner, but I'm also guessing we must be several years away from actually using it at such high resolutions. This last guess is based on how long it took us to switch from VGA to DVI, and then to eventually get DVI monitors capable of higher resolutions than VGA's maximum (i.e. above 2048x1536).
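    As a rough back-of-the-envelope check of those link limits, here's a minimal Python sketch (the ~10% blanking overhead is an assumption - real timings vary by standard):

[code]
# Approximate pixel-clock requirements vs. the DVI link limits above.
# Assumes ~10% blanking overhead on top of the active pixels (reduced-
# blanking timings are in this ballpark; full blanking needs more).

SINGLE_LINK_MHZ = 165        # max TMDS clock for single-link DVI
DUAL_LINK_MHZ = 2 * 165      # dual-link doubles the data channels

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.10):
    """Approximate pixel clock a mode needs, in MHz."""
    return width * height * refresh_hz * blanking / 1e6

for w, h in [(1600, 1200), (1920, 1200), (2560, 1600), (3840, 2400)]:
    clk = pixel_clock_mhz(w, h, 60)
    fits = ("single-link" if clk <= SINGLE_LINK_MHZ
            else "dual-link" if clk <= DUAL_LINK_MHZ
            else "beyond dual-link DVI")
    print(f"{w}x{h}@60Hz needs ~{clk:.0f} MHz -> {fits}")
[/code]

    By that maths, 3840x2400@60Hz needs roughly 600 MHz of pixel clock - nearly double what dual-link DVI carries - which is why I think the cable is the sticking point.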

    When DVI first came out, it was single-link only (IIRC), and that maximum resolution is actually SMALLER than what can be passed through a VGA cable (2048x1536). At the time, IIRC, LCDs maxed out at 20" and 1600x1200 - so putting a VGA port and a DVI port on a monitor meant either cable could support the monitor's full resolution. Later, we saw widescreen 1920x1200, which is fine for VGA, and I think some single-link DVI video cards support it too. Even later than that, we saw the debut of 30" 2560x1600 monitors - these can only be driven by dual-link DVI (and generally don't come with any other connections). So I'm thinking it was at least 3-4 years between when DVI first became visible on the market and when a consumer-grade monitor came out that finally reached beyond the limits of VGA. And even today, five years or so after I remember first hearing about DVI, the $1,200+ 30" monitors are well beyond what most people would be willing to spend...

    So, if a new cable capable of supporting 3840x2400 isn't even on the market yet, how long will it be before we see a monitor at this resolution? Are we going to go through another phase of dual cables, with monitors still maxing out at 2560x1600 but featuring a dual-link DVI connection in combination with some new technology? How long will that phase last before we see the first monitor that drops DVI and goes beyond 2560x1600? How long before we move from there to the much greater 3840x2400, which has more than twice the pixel count? And finally, how long after that before 3840x2400 monitors become "affordable" and "common" among gamers?

    -- -- -- -- --

    One last note: 3840x2400 is sort of an "ideal" resolution for HDTVs (or really, for an HDTV, we'd leave thin black bars at the top and bottom and use 3840x2160 pixels). It allows perfect integer scaling of 1920x1080p/i content using a factor of 2, and of 1280x720p content using a factor of 3, so you wouldn't need a "great" video scaler, just a "decent" one, to get a good picture in either case. Even if the scaler just "blew up" the picture without any interpolation, you'd still have a true, large-pixel version of 1080 or 720 video - much simpler than those strange 1366x768 HDTVs, and also simpler than scaling 720 lines up to 1080 (a factor of 1.5) or scaling 1080 down to 720.
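    A quick sanity check of those scale factors (a minimal Python sketch; the helper name is just for illustration):

[code]
# Which panels are exact integer multiples of 720p/1080p sources?
from fractions import Fraction

def integer_scale(panel, source):
    """Return the integer scale factor if both axes share one, else None."""
    fx = Fraction(panel[0], source[0])
    fy = Fraction(panel[1], source[1])
    return fx.numerator if fx == fy and fx.denominator == 1 else None

sources = {"720p": (1280, 720), "1080p": (1920, 1080)}
panels = {"3840x2160": (3840, 2160), "1366x768": (1366, 768)}

for pname, p in panels.items():
    for sname, s in sources.items():
        f = integer_scale(p, s)
        print(f"{sname} -> {pname}: {f}x" if f else f"{sname} -> {pname}: non-integer")
[/code]

    3840x2160 comes out as a clean 3x for 720p and 2x for 1080p, while 1366x768 has no integer factor for either - exactly why those panels need a proper scaler.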
     
  17. mardukph

    mardukph What's a Dremel?

    Nice read, keep it up.

    Hmm, 3840x2400 = a 2x2 setup of four 1920x1200 monitors in the interim, before single-panel monitors are available (and affordable) at that insane res =D
     
  18. mattthegamer463

    mattthegamer463 What's a Dremel?

    Will you be drooling when you get 2fps in any game at that resolution?

    I'm worried that all this stuff is being developed almost exclusively for people with $4,000+ rigs, and not even for people like me, who dropped a more-than-modest $2,500 on mine just three months ago. All I want is good FPS at medium-high settings. That's all I ask. Is it too much? Thank god I'm only on a 17" LCD for the time being; anything bigger would get terrible FPS at high settings.
     
  19. Joeymac

    Joeymac What's a Dremel?

    HDMI 1.3 can do 1440p and 48-bit Deep Colour over a single link, so a dual link (the Type B connector, wider than the usual ones) would have no problem with these resolutions and could still carry eight channels of audio. They aren't much in the wild yet, but they're compatible with dual-link DVI, so if someone started adding HDMI 1.3 Type B outputs to graphics cards, there wouldn't be a problem running existing monitors off them with a converter... even the 30-inchers.
    The competing cable standard to HDMI is DisplayPort, which Dell has shown off on some displays. It is also capable of these resolutions and can daisy-chain displays, but it has zero backwards compatibility with DVI monitors or HDMI TV sets while not really offering higher data rates than HDMI 1.3 (plus it has its own copy protection in addition to HDCP)... so there's not much point to it. As you mentioned, a PC would need to carry both technologies for output, which is expensive, etc. So hopefully it will die.
    Wide HDMI cables might be the quickest way forward.
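    To put rough numbers on that comparison, here's a minimal Python sketch using the commonly quoted raw link rates (the actual video payload is lower, since TMDS and DisplayPort both put 10 bits on the wire per 8 bits of data):

[code]
# Commonly quoted raw link rates for the cable standards discussed above.
# Actual video payload is ~80% of these figures (10 bits on the wire per
# 8 bits of pixel data on TMDS links, and 8b/10b coding on DisplayPort).

links_gbps = {
    "DVI single-link (165 MHz x 3 ch x 10 bits)": 165e6 * 3 * 10 / 1e9,
    "DVI dual-link": 2 * 165e6 * 3 * 10 / 1e9,
    "HDMI 1.3 single-link (340 MHz)": 340e6 * 3 * 10 / 1e9,
    "HDMI 1.3 Type B (dual-link)": 2 * 340e6 * 3 * 10 / 1e9,
    "DisplayPort 1.1 (4 lanes x 2.7 Gbps)": 4 * 2.7,
}

# 24-bit video payload for 2560x1600@60, assuming ~10% blanking overhead:
need = 2560 * 1600 * 60 * 1.10 * 24 / 1e9
print(f"2560x1600@60 needs ~{need:.1f} Gbps of payload")
for name, rate in links_gbps.items():
    print(f"{name}: {rate:.1f} Gbps raw")
[/code]

    On those numbers, single-link HDMI 1.3 (10.2 Gbps raw) and DisplayPort (10.8 Gbps raw) are pretty much neck and neck - the "not much point" argument above - while a Type B dual link would have bandwidth to spare for resolutions beyond 2560x1600.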
     
    Last edited: 27 Jun 2007
  20. Hells_Bliss

    Hells_Bliss What's a Dremel?

    Well, a lot of this technology won't be out for at least a year, so yes, it might be aimed at the $4k+ market at the moment, but prices drop quickly. Everything is getting bigger for cheaper, so once 30" screens are mass-produced and the market is saturated with larger sizes, I don't see why they'd be extremely expensive compared to now.

    Also, it'll get better than 2fps - they wouldn't sell it if it was that poor.
     