News ATI lays out DX10

Discussion in 'Article Discussion' started by WilHarris, 30 May 2006.

  1. WilHarris

    WilHarris Just another nobody Moderator

    Joined:
    16 Jun 2001
    Posts:
    2,679
    Likes Received:
    2
  2. TheEclypse

    TheEclypse What's a Dremel?

    Joined:
    11 Aug 2003
    Posts:
    407
    Likes Received:
    1
    I'm all for the faster cards bit, but not hotter - the X1800XL I have in my Shuttle keeps spinning up even when the computer is just sitting there doing nothing.
     
  3. Fr4nk

    Fr4nk Tyrannosaurus Alan !

    Joined:
    12 Mar 2005
    Posts:
    2,367
    Likes Received:
    2
    I couldn't care less about the heat and noise really; as long as the chip can withstand that heat, it will be fine. All this is very interesting - I like the idea of just having "shaders" and giving them different commands/processes.
    This could spell disaster for the unofficial "Omega" drivers :(

    -Fr4nk
     
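    The unified-shader idea discussed above can be sketched in miniature. This is a hypothetical illustration only, not ATI's actual architecture: the point is that one pool of identical ALUs takes whatever work is queued, vertex or pixel, so no unit type ever sits idle while the other is overloaded.

```python
# Hypothetical sketch of a unified shader pool (illustrative only --
# not ATI's real hardware design). A single pool of identical ALUs
# is handed whichever jobs are queued, vertex or pixel alike.

from collections import deque

def run_unified(alu_count, jobs):
    """jobs: deque of ('vertex' | 'pixel', workload) tuples."""
    completed = {'vertex': 0, 'pixel': 0}
    cycles = 0
    while jobs:
        # Each cycle, every ALU takes the next job regardless of type.
        for _ in range(min(alu_count, len(jobs))):
            kind, _work = jobs.popleft()
            completed[kind] += 1
        cycles += 1
    return completed, cycles

# A vertex-light, pixel-heavy frame: fixed vertex units would idle,
# but a unified pool keeps all 4 ALUs busy every cycle.
jobs = deque([('vertex', None)] * 6 + [('pixel', None)] * 10)
done, cycles = run_unified(alu_count=4, jobs=jobs)
# 16 jobs across 4 ALUs -> finished in 4 cycles
```

    The same scheduler loop handles both job types, which is the "one shader, different commands" idea in a nutshell.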
  4. sandys

    sandys Multimodder

    Joined:
    26 Mar 2006
    Posts:
    4,934
    Likes Received:
    727
    I hope that quote is wrong. Microsoft is not one for innovation, whereas the ability for GPU vendors to outdo each other on feature sets pushes the graphics market and games forward.
     
  5. Guest-16

    Guest-16 Guest

    No, it wants it to say "DirectX 10" (or WGF 2) and the customer to know exactly what they are getting: not HDR, no AA, HDR+AA, unified, non-unified, etc. etc.

    Innovation will come in the form of cooler, more efficient GPUs, I'd imagine, rather than extra features; however, DirectX always gets a few revisions when a new number hits. It'll be better for the consumer (upgrade cycles are currently insane) and for the companies which make games.
     
  6. Cthippo

    Cthippo Can't mod my way out of a paper bag

    Joined:
    7 Aug 2005
    Posts:
    6,785
    Likes Received:
    103
    I think this kind of puts nVidia behind the 8-ball. The new standard fits nicely with ATI's existing designs (their Xbox 360 work probably influenced the standard), and we all know that quad GPU is an evolutionary dead end, though a cool one. It seems to me that some of the heat issues will be offset by the next move to a smaller architecture.

    The bit about MS demanding drivers be WHQL certified is interesting, given that DX10 is a Vista feature and the beta testers have been whining about a massive lack of driver support in Vista. Perhaps MS is trying to get the hardware manufacturers to do their job for them, as usual.
     
  7. yahooadam

    yahooadam Ultra cs

    Joined:
    21 Mar 2006
    Posts:
    1,323
    Likes Received:
    0
    A single shader for the whole lot is a fantastic idea

    BUT WHY HOTTER?

    The current cards already burn their way through current heatsinks.

    WHY OH WHY aren't they doing what AMD and Intel are doing: improving efficiency and lowering heat output?

    This is getting ridiculous.

    My X1900XT is already ridiculous - the ATI "squirrel" fans are really, really loud

    and it will only get worse with a hotter GPU.
     
  8. Cobalt

    Cobalt What's a Dremel?

    Joined:
    24 Feb 2006
    Posts:
    309
    Likes Received:
    2
    The funny thing about the fans on the ATi cards is that if you take one out and just wire it up separately, it's not really very loud. Add in the restriction of the HS and the noise goes through the roof!
     
  9. Lazarus Dark

    Lazarus Dark Minimodder

    Joined:
    14 Apr 2006
    Posts:
    360
    Likes Received:
    0
    I was going to go phase change on my CPU when/if OCZ comes out with their low-cost units, but next-gen CPUs won't need it, so instead I will use the phase change on my DX10 card :p
     
  10. yahooadam

    yahooadam Ultra cs

    Joined:
    21 Mar 2006
    Posts:
    1,323
    Likes Received:
    0
    Man, it really sucks.

    Shame that a 7900GTX was £50 more and performs worse than the X1900XT.
     
  11. DXR_13KE

    DXR_13KE BananaModder

    Joined:
    14 Sep 2005
    Posts:
    9,139
    Likes Received:
    382
    Hotter and more power hungry? OMFG!!!! They must be joking - present graphics card technology is already enough to boil water and eggs and consumes more power than the rest of your house combined. Just freaking make it more like the new Intels: cooler, more efficient and a heck of a lot faster.
     
  12. zoot2boot

    zoot2boot What's a Dremel?

    Joined:
    9 May 2006
    Posts:
    75
    Likes Received:
    0
    More efficient but hotter ;) I like the sound of that... sounds like the next gen won't just be an incremental improvement the way the last three or four have been. Personally, the 'feature' of low power consumption/noise reads like 'blah blah blah, excuse for not pushing the envelope in the interests of cheaper, smaller chips to increase profit margins'. If the noise sucks on the X1900XTX in the mail to me, then I'll just stick a fairly cheap Zalman heatsink on it, which is near silent and increases the OC potential. The thing about stock heatsinks is they're made to a budget.
     
  13. yahooadam

    yahooadam Ultra cs

    Joined:
    21 Mar 2006
    Posts:
    1,323
    Likes Received:
    0
    Well, that's another £40 or whatever on top of your already £350+ gfx card;

    you're starting to talk about serious money.

    Why should you have to replace the stock HSF? The nVidia ones work very well now.

    Also, I'm not sure how much room Zalman gives for OCing,

    and don't forget, those Zalman ones don't cool the memory.
     
  14. zoot2boot

    zoot2boot What's a Dremel?

    Joined:
    9 May 2006
    Posts:
    75
    Likes Received:
    0
    But the nVidia design philosophy is short-sighted in my book. They said it themselves at the 7900GTX launch, paraphrasing: just do what is needed now for current games. That's why their chips are smaller and cooler. Personally, I think that as time goes by the performance delta between the X1900XT and the 7900GTX is only going to increase because of this.

    Last time I bought a Zalman heatsink it definitely increased the OC potential of my X800XT, and it came with RAM sinks which the fan blows over, where the stock heatsink doesn't.

    Spending a little more after the fact doesn't bother me. Where I am now, for the price difference between an X1900XTX and a 7900GTX you can pick up a Zalman and have a quieter, faster card for the same price.
     
  15. eddtox

    eddtox Homo Interneticus

    Joined:
    7 Jan 2006
    Posts:
    1,296
    Likes Received:
    15
    I'm sorry, I was under the impression that in order to increase efficiency the power input and the heat output have to decrease. Physics 101: efficiency (%) = useful output energy / total input energy * 100. Also, by making their cards run hotter, aren't they lowering OC potential? I remember people were using AMD Athlon laptop processors in desktop PCs because they ran cooler and therefore could be OC'd more. As for the WHQL-certified fiasco, I think it's a bad idea, as currently many smaller hardware manufacturers which produce budget equipment don't have WHQL certs. If these companies had to get WHQL certs for all their products it would drive their prices up. Yet another money-grabbing scheme by M$ (surprise surprise). IMHO

    - ed out
     
  16. Hamish

    Hamish What's a Dremel?

    Joined:
    25 Nov 2002
    Posts:
    3,649
    Likes Received:
    4
    lol, you people are talking like they're designing their new chips to be hot.
    They're designing them to be efficient so that they can then push them further; they end up being hot but insanely fast.
    Nothing stopping them releasing a downclocked one for laptops or whatnot that will run cool...
     
  17. dullonien

    dullonien Master of the unfinished.

    Joined:
    22 Dec 2005
    Posts:
    1,282
    Likes Received:
    29
    Everyone here would replace a stock Intel/AMD heatsink & fan on their processor - hell, how long has it been since any of you ran a stock one at all? It probably didn't leave the box, like mine. All the manufacturers need to do is make it work; they don't care about sound or being able to overclock.

    The cooler on my 6800GS lasted for one day before I ordered a Zalman one - money well spent in my opinion.

    Of course they cool the memory: the fan covers almost the whole card, and it's supplied with the required heatsinks to take advantage of that.
     
  18. Iago

    Iago What's a Dremel?

    Joined:
    4 Oct 2005
    Posts:
    202
    Likes Received:
    0
    Perhaps, but if you look at the market those parts are geared to, you'll see that by the time there's a significant performance delta, most high-end users will have already upgraded to DX10 parts or whatever is kicking by then.

    If enthusiasts still upgraded in 12-18 month cycles, ATI's strategy would be more reasonable, but with the current rhythm of upgrades, nVidia may do better. "Do what is needed for current games, because when next-gen games arrive, most high-end users will upgrade anyway"...
     
  19. zoot2boot

    zoot2boot What's a Dremel?

    Joined:
    9 May 2006
    Posts:
    75
    Likes Received:
    0
    efficiency (%) = useful output energy / total input energy * 100

    If you increase the amount of work done per unit of input energy, you increase the efficiency. If the chip ends up hotter too, that means you're doing more work per unit energy but also putting in more energy = lots more work done.
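    Putting some made-up numbers on that argument (purely illustrative, not real card specs): a chip can get more efficient and hotter at the same time if the power budget also grows.

```python
# Illustrative arithmetic only: higher efficiency plus a bigger power
# budget can mean MORE waste heat alongside much more useful work.

def useful_and_waste(input_watts, efficiency):
    """Split input power into useful work and waste heat."""
    useful = input_watts * efficiency
    waste_heat = input_watts - useful
    return useful, waste_heat

# Hypothetical old part: 100 W in at 50% efficiency.
old_useful, old_heat = useful_and_waste(100, 0.50)
# Hypothetical new part: 150 W in at 60% efficiency.
new_useful, new_heat = useful_and_waste(150, 0.60)
# Efficiency rose 50% -> 60%, yet heat rose 50 W -> 60 W,
# while useful work rose 50 W -> 90 W.
```

    So a "more efficient but hotter" next-gen chip is not a contradiction; it just means the envelope grew faster than the losses shrank.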

    Yeah, the nVidia approach is fine, but it's not the kind of thinking that's going to advance the gfx industry at the same rate the ATI approach will. ATI are already making unified pipes, building experience and tech, whereas nVidia are going to be n00bs again when DX10 comes around. I predict FX-style crapness again next year from nVidia.
     