
Hardware Radeon HD 2600 XT vs. GeForce 8600 GT

Discussion in 'Article Discussion' started by Tim S, 14 Aug 2007.

  1. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    When you bought your new TV, did you buy an LCD or a plasma, and why? I believe you bought the plasma because the colours are more vivid and the blacks are better. I think you'll see a similar impact with Deep Colour and its support for "up to" 36-bit colour.

    The human eye is analogue and can therefore process an infinite colour palette. Really, then, the question isn't how many colours the eye can detect, it's how many colours the brain can detect. That, ultimately, depends on the individual.
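For a sense of scale in the bit depths being argued about here, a quick sketch of how many distinct colours each common framebuffer depth can encode (assuming the usual packed-RGB layout with bits split evenly across the three channels, no alpha):

```python
# Total representable colours at common framebuffer depths.
# Assumes bits are split evenly across the R, G and B channels.
for total_bits in (24, 30, 36):
    per_channel = total_bits // 3
    colours = 2 ** total_bits
    print(f"{total_bits}-bit ({per_channel} bits/channel): {colours:,} colours")
```

24-bit gives roughly 16.8 million colours; 30-bit Deep Colour jumps to about 1.07 billion, and 36-bit to about 68.7 billion.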

    In this report, the reporter says he was unable to see any difference above 30-bit colour. While that may be the case for him, what's to say he isn't colour blind?

    The same can be said of the human ear. Can someone with imperfect hearing hear noises at the same frequencies as someone with perfect, or near-perfect, hearing? No, of course they can't, but that doesn't mean there isn't sound (that other people can hear) at the frequencies the person with imperfect hearing misses.
     
  2. Hugo

    Hugo Ex-TrustedReviews Staff

    Joined:
    25 Dec 2006
    Posts:
    1,384
    Likes Received:
    19
    Saying Deep Colour is pointless is like saying Lossless audio formats are pointless. I can tell the difference between FLAC and 320kbps MP3 and Rich would probably agree that the former sounds superior, but many would argue MP3 offers "all the detail you need."

    The difference between "standard" 24-bit and Deep Colour in terms of colour depth should be like the change from 16-bit to 32-bit, for those of us who remember that transition (a lot, I hope, given the nature of the site). OK, so you may not be able to tell at any given moment that you're getting a slightly different shade of green than if you were watching a 24-bit source, but your overall perception will improve.
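Where the extra depth tends to show up in practice is smoother gradients rather than "new" colours: a full-width grey ramp has more pixels than an 8-bit channel has levels, so levels repeat and form visible bands. A toy illustration (the 2560-pixel width is just an example, borrowed from the 30in monitor mentioned below):

```python
# Why extra bit depth reads as smoother gradients: when a ramp is wider
# than the number of grey levels, each level spans several pixels and
# forms a visible band. Screen width here is illustrative.
width = 2560  # pixels across a 2560 x 1600 panel
for bits_per_channel in (8, 10, 12):
    levels = 2 ** bits_per_channel
    px_per_band = max(1, width // levels)
    print(f"{bits_per_channel} bits/channel: {levels} grey levels, "
          f"~{px_per_band} px per band on a {width}px ramp")
```

At 8 bits/channel a full-width ramp repeats each level for about 10 pixels, which is where banding comes from; at 12 bits/channel there are more levels than pixels.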

    Now I'm off to play Counter-Strike 1.3 at 800 x 600 on a 30in 2560 x 1600 monitor (because no-one can see the difference in dot pitch anyway) with 16-bit colour...
     
  3. Da Dego

    Da Dego Brett Thomas

    Joined:
    17 Aug 2004
    Posts:
    3,913
    Likes Received:
    1
    Actually, archangel, though your comment on 16 to 24+ may seem valid, the comparison doesn't hold. It comes down to statistical understanding and Z-scores. Differences between two and three standard deviations, for example, are far less detectable than those between one and two. Or, in this case, it's more like comparing 2-2.5 against 2.5-3.
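To put rough numbers on that diminishing-returns point, here is the fraction of a normal distribution that falls between two multiples of the standard deviation, using the standard result that the probability within k standard deviations is erf(k/√2):

```python
import math

# Fraction of a normal distribution lying between k1 and k2 standard
# deviations of the mean (counting both tails symmetrically).
def frac_between(k1, k2):
    within = lambda k: math.erf(k / math.sqrt(2))
    return within(k2) - within(k1)

print(f"1-2 SD: {frac_between(1, 2):.1%}")  # ~27.2% of observations
print(f"2-3 SD: {frac_between(2, 3):.1%}")  # ~4.3%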

    As Tim mentioned, I did buy a plasma because the "colors are brighter." In that respect, he's dead on. However, a lot of that has to do with the brightness of each pixel independent of the next, and the ability to protect each against "bleed" from those neighbouring it.

    There are quite a few studies that (if anyone is truly bored) I can throw out here, as I did my homework before stating such bold things. However, I think maybe a look into digital colour and depth as a whole is in order, as too much discussion here is kind of crapping on this thread.

    Would people be interested in this? Because I'd be happy to do one.

    The way I read Tim's thoughts, the ideas of true vs. deep colour are little more than a side note to the generally poor performance of these cards; what's important is that these cards don't even give you the option if you DO want it. That, on top of all the other performance issues compared to both current and last-generation hardware, makes both of these cards ones to avoid.
     
  4. Renoir

    Renoir What's a Dremel?

    Joined:
    19 Jul 2006
    Posts:
    190
    Likes Received:
    0
    Brett, I'd love an article on the subject. It's clear that it's an important aspect of video for the future one way or another so it'd be good to have something that cuts through the marketing crap and gets to the nuts and bolts of the situation.
     
  5. [cibyr]

    [cibyr] Sometimes posts here

    Joined:
    30 Nov 2003
    Posts:
    749
    Likes Received:
    1
    I thought ATi didn't include dedicated MSAA hardware, instead doing it with programmable shaders. Since DX10 requires the ability to do custom AA, ATi chose to leave out the MSAA hardware to make room for more shaders, while nVidia went with fewer shaders but kept the MSAA hardware. End result: nVidia kicking ATi's ass in almost everything with AA, except Call of Juarez, which uses custom AA.

    I could be wrong though, please correct me if I am :)
     
  6. Woodstock

    Woodstock So Say We All

    Joined:
    10 Sep 2006
    Posts:
    1,783
    Likes Received:
    2
    Any article you guys are prepared to write, I'm happy to read...
     
  7. Meanmotion

    Meanmotion bleh Moderator

    Joined:
    16 Nov 2003
    Posts:
    1,652
    Likes Received:
    19
    Nope, you're spot on. In a number of ways ATI has been commendably forward-thinking with its inclusion of a tessellation engine and pure shader-based MSAA. Unfortunately it hasn't quite got the balance right and performance is simply not good enough, at least with current games. As developers move towards features like tessellation and custom AA, ATI's current offerings should fare better than nVidia's. The only trouble is that by the time those games start to arrive, new cards from nVidia should be hitting the shelves.
     
  8. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    Well, that's not been confirmed by AMD and the company still lists "Programmable MSAA Resolve" as a feature of the HD 2000-series ROP hardware. This alone suggests that there is MSAA resolve hardware available, but it's just not working.

    As stated above, AMD's briefings say there is a programmable MSAA resolve unit in each ROP partition, which suggests that there is hardware-based MSAA resolve. However, the MSAA performance deficit suggests that it isn't working in any driver available at the moment.

    AFAIK, only the CFAA filters should use the shaders to enhance the anti-aliasing quality. The standard box filter shouldn't need shader-based MSAA resolve if there is a dedicated (and working) MSAA resolve unit in the ROPs. It's listed as a feature of R6xx's ROP hardware, so the only assumption one can make is that it isn't working.
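For reference, the "box filter" resolve discussed above is just a per-pixel average of the sub-samples; whether it runs on dedicated ROP hardware or on the shader core, the arithmetic is the same. A toy sketch (the 4x sample count and colour values are illustrative, not R6xx's actual sample pattern):

```python
# Toy box-filter MSAA resolve: the output pixel is the unweighted mean
# of its N sub-samples. Dedicated resolve hardware performs exactly this
# averaging; a shader-based resolve computes the same result on the
# shader units instead.
def box_resolve(samples):
    """samples: list of (r, g, b) tuples for one pixel."""
    n = len(samples)
    return tuple(sum(channel) / n for channel in zip(*samples))

# 4x MSAA example: an edge pixel half-covered by white geometry over black
pixel = box_resolve([(255, 255, 255), (255, 255, 255), (0, 0, 0), (0, 0, 0)])
# -> (127.5, 127.5, 127.5), a mid-grey edge pixel
```

The CFAA filters mentioned above go beyond this by weighting samples (including neighbouring pixels' samples), which is why they genuinely need the shader core.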
     
  9. Renoir

    Renoir What's a Dremel?

    Joined:
    19 Jul 2006
    Posts:
    190
    Likes Received:
    0
    An article explaining all the audio implications of high-def DVDs and XP/Vista would be much appreciated, as the situation seems very confusing.
    Do you mean that the early cards physically didn't include the crypto ROM, or that nVidia didn't require it to be activated by the card maker? I ask because ISTR Rich mentioning a while back that HDCP costs money per implementation and not just for the keys themselves - e.g. nVidia would pay to have the crypto ROM in the chip, but Asus, say, would also have to pay a fee if they wanted to implement the feature. Could you shed some light on that?
     
  10. falconsport

    falconsport What's a Dremel?

    Joined:
    16 Aug 2007
    Posts:
    1
    Likes Received:
    0
    hello all,
    I've read the review and all of your posts, and I've just become more confused. I'm an average medical student with average computer knowledge, and I want to buy a new graphics card to replace my old, now-dead, two-year-old 6600 GT.

    I want a mid-range graphics card that will still be usable for at least two years (by which time everything will likely be DirectX 10 based). I'll use it for work (writing reports, reading ebooks, etc.), listening to music, playing videos, and games at medium-high quality at 1024 or 1280, and my budget limits my choice to an 8600 GT, a 2600 XT or a lower card. I only have an Antec TP 430W power supply. My system is an AMD 64 3000+ (Socket 939) with 1GB of RAM, two hard disks and two optical drives.

    So, which card would you suggest?

    Thanks in advance....
     
  11. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    Hey man, welcome to the forums, and sorry for the slight delay in responding to your queries. Unless you're going to get an HD DVD / Blu-ray drive for your PC, I'd go with the 8600 GT if you had to choose between the two today.

    You could also get away with an X1950 Pro if you're staying on Windows XP, as there's no sign of DX10 being supported there - you'd have to get Vista as well. I don't think DX9 games will die out for a long time, to be honest - DX9 will be the fallback graphics mode for a long while, just as DX8 was, and it wasn't all that long ago that DX8 support was dropped. I reckon there's at least another three years of DX9 support.

    Hope this helps,
    Tim
     