
Hardware G80: NVIDIA GeForce 8800 GTX

Discussion in 'Article Discussion' started by Tim S, 8 Nov 2006.

  1. -EVRE-

    -EVRE- New Member

    Joined:
    12 May 2004
    Posts:
    372
    Likes Received:
    1
    HDMI?
    GX2 SLI still faster?
    Water cooling from BFG Koolance safe? i.e. copper getting friendly with the aluminium radiator.
    Any news/idea on what an 8800 GT will be?

    p.s. I gotta love my 7800s for a little longer. *big smile*
    and a 30" Dell on order for Dec 1 (gotta push my billing on it till Jan '07!!)
     
  2. WhiskeyAlpha

    WhiskeyAlpha New Member

    Joined:
    5 May 2006
    Posts:
    838
    Likes Received:
    4
    I agree with spec on this one.

    You just need to look at the bigger picture (pun fully intended :D)
     
  3. DeX

    DeX Mube Codder

    Joined:
    22 Jul 2002
    Posts:
    4,152
    Likes Received:
    3
    ATI's AF looks more blurred to me. It's clearly a compromise between blurriness and sharpness, with moiré becoming more obvious on the sharp lines and detail lost on the blurred ones. I'd have to say ATI's and NVIDIA's AF quality are pretty evenly matched. Perhaps there's not much more they can do to improve the quality, even with the G80's level of power.

    Good review Tim. It's good to have an explanation of the changes to the architecture as it helps you understand the benchmarks a lot better. Of course if you're not interested you can always just skip to the benchmarks.

    It's obvious that this card will need bigger and better games and heavy use of shaders to really put it to the test. I can't wait to see what developers do with this baby's power.
     
  4. Sparrowhawk

    Sparrowhawk Wetsander

    Joined:
    14 Feb 2004
    Posts:
    584
    Likes Received:
    1
  5. Techno-Dann

    Techno-Dann Disgruntled kumquat

    Joined:
    22 Jan 2005
    Posts:
    1,672
    Likes Received:
    27
    And will the retail version ship with an extra SLI bridge so two of them can be put in SLI, or will they just expect customers to find a second bridge somewhere?

    As far as I am concerned, it's too much money for too little improvement over the X1950 XTX, especially as I don't game any larger than 1280x1024, and don't plan on going widescreen. Besides, there's no way my current PSU and cooling could handle one of those monsters.

    It's a record-smashing piece of hardware, but I'd rather wait, thanks.
     
  6. Grinch123456

    Grinch123456 New Member

    Joined:
    9 Aug 2006
    Posts:
    99
    Likes Received:
    1
    Ouch! My life savings! <sarcasm>Time to drop out of college!</sarcasm>
     
  7. wharrad

    wharrad New Member

    Joined:
    26 Jul 2003
    Posts:
    869
    Likes Received:
    0
    That's crazy fast. Presumably over time the need for a large monitor to release the full potential will drop as game engines get more complex. Personally I'm going to wait for a DX10 part from ATI - not due to fanboy love... but you've got to wonder how much of the high price is due to it being up there on its own.


    As for the wattage... well yeah. But it's cold this time of year and I was going to pay to heat the house anyway... :)
     
  8. r4tch3t

    r4tch3t hmmmm....

    Joined:
    17 Aug 2005
    Posts:
    3,166
    Likes Received:
    48
    Heh, I usually do the exact opposite. I don't really care about benchmarks too much unless I want to compare a possible purchase.
    Excellent review as always :thumb:
     
  9. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    We measured total power consumption of the system, since it's the easiest thing to do. :)
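    Measuring at the wall folds in PSU efficiency, so figures like these are usually compared as deltas between otherwise-identical configurations. A minimal sketch of that arithmetic (the wattages and efficiency figure here are invented for illustration, not taken from the review):

    ```python
    PSU_EFFICIENCY = 0.80  # assumed efficiency; real PSUs vary with load

    def dc_draw(wall_watts, efficiency=PSU_EFFICIENCY):
        """Approximate DC power the PSU delivers from an AC wall reading."""
        return wall_watts * efficiency

    # Swapping only the graphics card in an otherwise identical system:
    # the difference in wall draw roughly isolates the cards' difference.
    wall_card_a = 320.0  # invented figure, watts at the wall
    wall_card_b = 285.0  # invented figure, watts at the wall
    card_delta = dc_draw(wall_card_a) - dc_draw(wall_card_b)
    print(round(card_delta, 1))  # 28.0
    ```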
     
  10. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    yes, it's very similar to what ATI does with Hierarchical Z already. :)
     
  11. IanW

    IanW Grumpy Old Git

    Joined:
    2 Aug 2003
    Posts:
    7,112
    Likes Received:
    788
  12. Mother-Goose

    Mother-Goose 5 o'clock somewhere

    Joined:
    22 Jul 2004
    Posts:
    3,890
    Likes Received:
    6
    That's the badgers nazzers, Tim! AND I now understand what the unified stuff is all about! Well written, good facts! I'm very intrigued to see ATI's response now, as I think it's amazing how close the X1950 XTX actually came to it (I know, at full res on a 30" it isn't close, but everywhere else it was a lot closer than I expected).

    Awesome Tim, just awesome. I forwarded it around the office. No one seems to care, although someone asked when it will be in a laptop... I've just stolen his lunch.
     
  13. Mother-Goose

    Mother-Goose 5 o'clock somewhere

    Joined:
    22 Jul 2004
    Posts:
    3,890
    Likes Received:
    6
    Then what you have is probably more than enough for you - and for most people, in all honesty. BUT at big resolutions you can still have everything on full; the X1950 XTX can't do that :)
     
  14. keir

    keir S p i t F i r e

    Joined:
    5 Oct 2003
    Posts:
    4,377
    Likes Received:
    48
    I'd like to see that in a mobo - it's mahoosive!
     
  15. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    Sorry, I forgot to come back to it - I've edited that bit accordingly. :worried:
     
  16. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    We're talking about NVIDIA's default AF quality on GeForce 8800 GTX. ATI's default AF quality is much worse than NVIDIA's GeForce 8800 GTX AF quality - it's closer to GeForce 7950 GX2 AF quality. With High Quality AF, it's about the same - some instances look better on NVIDIA hardware, and some look better on ATI hardware.

    I wanted to make some videos but unfortunately ran out of time after spending ~30 hours writing the article. :)

    I'm interested to see how High Quality driver mode affects things on GeForce 8800 GTX - I'm probably going to look into that one soon.
     
  17. Mother-Goose

    Mother-Goose 5 o'clock somewhere

    Joined:
    22 Jul 2004
    Posts:
    3,890
    Likes Received:
    6
    So there are different levels of anti-aliasing? I'm confused - I thought there were just the different multiples (2x, 4x, etc.), but there are different grades too (normal, high, etc.)?
     
  18. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    I'm going to be doing a much closer investigation into AA quality very soon.

    In a nutshell though, there's 2x, 4x and 8xQ (these are all full multi-sample AA modes), and there's 8x and 16x, which use the same colour/Z samples as standard 4xAA but with 8/16 coverage samples, meaning you get higher AA along the edges of polygons. Then there's the 16xQ mode, which uses the same colour/Z samples as 8xQ but 16 coverage samples, meaning you get higher-quality edges on rigid objects.

    The difference between 8x/16x and 8xQ/16xQ is that the Q modes have more colour/Z samples, meaning things like shadow edges are sampled better. With 16xQAA you'll get 16xAA on the edges of objects and 8xAA on things like shadows, while with 16xAA you'll get 16xAA on the edges of objects but only 4xAA on shadows, etc. :)
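    The mode breakdown above can be restated as a small lookup table. This is a hypothetical sketch that just encodes the sample counts as described in the post (not official NVIDIA documentation; the function name is my own):

    ```python
    # Each mode pairs colour/Z samples (which benefit everything, including
    # shadow and shader edges) with coverage samples (polygon edges only).
    AA_MODES = {
        # mode:  (colour/Z samples, coverage samples)
        "2x":   (2, 2),    # full multi-sample
        "4x":   (4, 4),    # full multi-sample
        "8xQ":  (8, 8),    # full multi-sample
        "8x":   (4, 8),    # 4x colour/Z + 8 coverage samples
        "16x":  (4, 16),   # 4x colour/Z + 16 coverage samples
        "16xQ": (8, 16),   # 8x colour/Z + 16 coverage samples
    }

    def edge_quality(mode):
        """Effective AA level on polygon edges vs. shadows/shader detail."""
        colour_z, coverage = AA_MODES[mode]
        return {"polygon_edges": coverage, "shadows_and_shaders": colour_z}
    ```

    So, for example, `edge_quality("16x")` shows why 16x is cheaper than 16xQ: both smooth polygon edges with 16 samples, but 16x falls back to 4x on shadows where 16xQ keeps 8x.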
     
  19. flabber

    flabber New Member

    Joined:
    10 Jan 2005
    Posts:
    122
    Likes Received:
    0
    Uhm, how can it be too fast for monitors with a lower resolution (like 1680x1050)?
    With a game like Crysis, wouldn't it be perfect to run it with this card, even if you "only" have a 20" monitor?

    My XFX 7900 GT runs like the wind, but I could definitely use some more speed. Since I don't want/need SLI (too expensive, and too power-hungry for me), a card similar to this one would be perfect, right?
     
  20. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    Of course it would... I guess the question you have to really ask is whether you could do with more speed... then you've got to ask yourself "how much more speed". Without doing any testing yet, I'd say the GTS is a better buy for those with 20" panels.
     