Discussion in 'Article Discussion' started by Tim S, 5 Aug 2005.
I guess I’m not enough of a game graphics connoisseur because those screens all look the same to me…
/me wonders where the other thread went
I've played with my monitor brightness and I can see that the images on the left are lighter than the images on the right on pages 2 & 3, while page 4 shows that the lighting on the right doesn't have as large a range of colours as the screens on the left.
In short - there's more light on the left on P2 & P3. Page 4 - I artificially adjusted the images on the right to try and get as close as possible to the background lighting on the left images in an attempt to see the differences between the two High Dynamic Range Lighting implementations.
Oh, ok. I took a second look and I think I see the difference now. Pretty detailed stuff.
There's a big difference in the images, but the best way to illustrate it, I think, was the Photoshop part.
Take an image from each card and make the light sources equal in color and intensity in Photoshop. Then you can see how different one is from the other. In ATI's case, to make things equal to NVidia's bright spots, the whole image had to be much brighter, illustrating a smaller range of colors. To get a white equal to the white from NVidia, black was no longer black, but more like a flat, dark grey.
It's like having one spectrum that goes from 0-100, and another that goes from 20-80. Overall, doesn't look like that much of a difference. But to try and make that 80 turn to 100, you have to make the 20 turn to 40...a BIG difference. All due to the lack of precision.
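The arithmetic in that analogy can be sketched in a few lines (my own toy illustration, not anything from the article): brightening the narrow 20-80 range until its white matches the wide range's 100 drags the blacks up with it.

```python
def brighten(value, offset=20.0):
    """Lift every value by a flat brightness offset, as in the
    Photoshop comparison: the whole image gets brighter at once."""
    return value + offset

# Pushing the narrow range's white (80) up to 100...
print(brighten(80.0))  # 100.0 -- whites now match
# ...drags its black (20) up to 40, so black is no longer black.
print(brighten(20.0))  # 40.0
```

The offset is applied uniformly, which is exactly why the narrower range ends up looking like flat, dark grey instead of true black.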
Problem with hardware used.
If you're going to compare image quality, shouldn't the comparison also be done on similarly priced/ranged cards?
ie x850 xt vs a 6800 Ultra?
The comparison between a x850 and a 7800 is similar to comparing an nvidia 5900 Ultra with a x800 xt. Different generations at different price ranges.
Your article is subject to criticism on this matter.
The image quality is essentially the same on both 6800 Ultra and 7800 GTX without Anti-Aliasing enabled. The thing that improved inside the GeForce 7800 GTX is that it is much more efficient at processing floating point blends, meaning that HDR is actually playable at a reasonable resolution.
I disagree with what you are saying - the 'best playable' image quality delivered by a single reference-clocked 7800 GTX is comparable to a 6800 GT SLI in this title. However, there were no direct comparisons made to the 7800 GTX in terms of playability so I don't see where the problem lies...
If we want to get really technical, the Radeon X850 XT PE came to market 7 months after the GeForce 6800 Ultra, and the GeForce 7800 GTX was released 7 months after the Radeon X850 XT PE...
To further expand on this, ATI's HDR implementation doesn't seem to make use of a tone mapping pass. If you disable the tone mapping pass on a Radeon X850 XT PE, the image quality does not change at all. However, if you disable tone mapping on a GeForce 6/7, you will notice a difference and the image will look very close to what ATI are achieving with the Shader Model 2.0 path that they've helped Ubisoft to code.
The reason behind this, I believe, is that ATI are using a 16-bit integer buffer, which gives an HDR range of 256:1 at 8-bit precision, while a 16-bit floating point blend gives an HDR range of approximately 2,000,000,000:1 at 10-bit precision.
Basically, the tone mapping pass used with 16-bit floating point blending turns a 2,000,000,000:1 HDR range into a 256:1 range so that the monitor can display the image. Thus the tone mapping pass makes absolutely no difference to the final output scene on a Radeon X850 XT PE, because the image is already at 256:1. You could call the implementation a low dynamic range technique, because it is certainly limited by its 'HDR' range of colours.
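A crude sketch of what a tone mapping pass does (this is my own toy operator for illustration, not the game's actual shader): linearly squeeze the scene's full range down into the 0-255 a monitor can display. An image that is already limited to 256:1 passes through unchanged, which is why disabling the pass would change nothing on the Radeon.

```python
def tone_map(pixels, white_point=None):
    """Toy tone map: pick the scene's white point and linearly
    squeeze [0, white] into the displayable [0, 255]."""
    white = white_point if white_point is not None else max(pixels)
    scale = 255.0 / white
    return [min(255.0, p * scale) for p in pixels]

hdr = [0.0, 50.0, 255.0, 40000.0]   # FP16-style scene with a huge range
ldr = [0.0, 50.0, 128.0, 255.0]     # already limited to 256:1

print(tone_map(hdr))  # bright values compressed to fit the display
print(tone_map(ldr))  # identical to the input: the pass is a no-op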
The generation doesn't have anything to do with this article, as long as they have the same options. The X850XT PE doesn't have any significant SM advances over a 9800Pro, and the ATi SM2.0 integration into this game was the entire focus of this article.
It had nothing to do with performance, or the amount of AA/AF that could be added to the image; it was completely about the addition of SM2.0 functionality into a particular game title, and the comparison of it against an nVidia card with SM3.0 -- something ATi has yet to add support for in their GPUs. The 7800GTX has SM3.0, the X850XT PE has SM2.0.
Also bear in mind that the X850XT PE is ATi's highest model graphics card atm, and the 7800GTX is nVidia's highest model graphics card atm, making them the perfect cards to compare, as they are the best each company currently has to offer.
Basically, your post was useless imo, because the article was well thought-out and compared the two offerings well....
Welcome to Bit
I think the article is well written too. I just feel any comparison between hardware has to be done with some controls, and price range is one of them.
This allows people 1) to calculate fps vs $$ and 2) compare image quality on products they are considering.
When consumers are in the market for a card, there's a $30-50 range that they look at, not $100+ between an X800 XT and a 7800. Although you can infer SM3.0 is better than SM2.0 from your article, it's not certain that it carries over to the 6800 Ultra. I'll have to Google other sites to double-check. Still, I suspect a 6800 Ultra has better image quality.
In a few months, try a comparison between a 6800 Ultra vs a R520 and you may see comments about that.
Let's go back to the 9700 pro and GF 4200 then... There were 4 months when the 5800 was in limbo.
And btw, the post wasn't useless. The pecking order for determining the truth is hardware manufacturers, journalists, general public. At each stage questions are expected, and people should be glad to answer them unless they're hiding something.
I'll reiterate, the article was good. I'm just checking to see if it can be better.
I can understand what you are saying regarding the frames per second per dollar. However, there were no comparisons made between the playability of HDR on the Radeon X850 XT PE and the GeForce 7800 GTX - the image quality settings were exactly the same in both cases to make the comparison as close as possible.
It would not have mattered if I had used a GeForce 6600 and compared it to Radeon X850 XT PE because, as long as the settings and resolution were the same, the quality of the output would have been the same. Comparing 6800 Ultra to R520 would not be valid in this case, because R520 will be a Shader Model 3.0 architecture with Floating Point Blending and it will not use Splinter Cell: Chaos Theory's Shader Model 2.0 path. However, R520 vs 6800 Ultra might be a valid comparison, as both are 'first implementations' of Shader Model 3.0.
The article was not designed as a playability comparison.
I think this is a little bit off-topic, but I couldn't find anywhere else to post. Anyway, it's just a stupid question regarding Shader Model 2.0 in Chaos Theory.
I own a 9800 PRO and I've used the "check for updates" option in the game to download the 1.00 to 1.05 patch. After the installation I enter the settings menu but the game won't allow me to change from SM 1.0 to SM 2.0.
Is the game using it anyway without me having to manually select the option in the settings menu? (The option is not available; it just says SM 1.0 and won't let me click there.)
Thanks very much, sorry for my english, hope you can help me out.