When you say stare at the sun then look at the ground, that is exactly what I have done (as that should be the most extreme retinal exposure). I have tried every combination where it is set to application controlled and where I have set it myself, and still the same result. I know it doesn't sound right and is supposedly physically impossible, but that is what it seems: it appears to be a lot smoother around the edges of ladders and pipes, for example. I know that when I ran it at 1280x1024 it was impossible to use AA with HDR, but at a lower resolution of 1024x768 it will run and appear to enable HDR and use AA at the same time, even though that is (like you say) impossible. I have run and used both Bloom (on an ATI card) and HDR (on an NVIDIA card) and seen the difference between the two (using a 9800 Pro and 6600 GTs), yet at the lower resolution it seems to have the effect of AA whilst using HDR. I'm not trying to start an argument; I'm just saying, for information purposes, what is happening, as I have played around with all the combinations of resolutions, drivers, settings, and different graphics cards. I'm just really confused.
Understandably so. If you're sure you're noticing a difference between bloom and HDR and still have AA on, then you have quite an anomaly on your hands. The only thing I could think of was that you were experiencing a nice bloom effect (which I would expect). But with HDR, when you look at the sun and then at the ground, the ground should be dark, almost with a brown tint to it, then fade back up to the proper (overbright) exposure after a couple of seconds. If you're getting this effect, then I have no clue. I agree with bigz... can you post some screenshots? Can you take them at the same places in the game that he did during his reviews? It might make comparison a little easier.
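The adaptation effect described above (dark after looking at the sun, then fading back up over a couple of seconds) is usually modelled as a simple exponential drift of the exposure value toward a target each frame. This is an illustrative Python sketch of that general idea, not any particular engine's code; the function name and the `speed` constant are made up for the example.

```python
import math

# Illustrative sketch (not any engine's actual code) of eye adaptation:
# after staring at the sun, exposure is clamped down, then recovers
# toward the scene's proper exposure over a couple of seconds.

def adapt_exposure(current, target, dt, speed=1.5):
    """Move current exposure toward target exponentially over time step dt."""
    return current + (target - current) * (1.0 - math.exp(-speed * dt))

exposure = 0.2            # clamped down after staring at the sun
target = 1.0              # proper exposure for the ground
for frame in range(60):   # ~2 seconds at 30 frames per second
    exposure = adapt_exposure(exposure, target, dt=1.0 / 30.0)

print(round(exposure, 3))  # close to 1.0: the scene has faded back up
```

The exponential form means the exposure moves quickly at first and then eases in, which is roughly what the "fade back up" in the game looks like.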
It is all quite strange! I'll see if I can get some screenshots for you. I'm quite busy at the moment (painting and decorating a whole house on my own in 7 days!). I will need to set up an image hosting account too, otherwise I would have posted some with my original post.
If you are struggling, email them to me: tim.smalley@bit-tech.net. I'm sure I can find somewhere to put them...
I tried benchmarking with HDR last night in Far Cry. It looks very good, but you can tell it wasn't natively implemented, unlike SC:CT, because some of the scenes look wrong, especially lit buttons: the volcano level button looks like it has a furnace behind it or something. I wonder if Valve will code "HDR" for 24-bit only, thereby screwing NVIDIA owners? And I wonder how it'll look, given it's not natively implemented, like Far Cry's.
It's natively implemented in the Source engine, AFAIK, whereas HDR in CryEngine was a bolt-on. NVIDIA hardware can handle 24-bit floating point because it just runs it at FP32, as it does throughout Half-Life 2. HL2 uses baseline DX9-spec precision, which is 24-bit floating point. DirectX 9.0c includes FP16 blending as part of the specification, but it doesn't include partial precision (i.e. FP16 without blending).
Natively built into the Source engine?? So they just haven't allowed us to enable it yet, but it is there?
OK. So why didn't they enable it if the Source engine can do it? Because ATI couldn't do it, so they were going to re-release it once ATI could do FP16 for HDR? Also, since ATI can obviously do FSAA, and HDR uses the AA bit of the pixel processing, do ATI do FSAA differently (FP24) or something? Does that make it better/worse? Just different?
ATI's current hardware cannot do floating point blending, but there are HDR implementations that don't require floating point blending and work on ATI's hardware. Valve didn't release it earlier because I think they wanted to 'sell' the game again, with HDR... if you get me. They sell the game once without HDR, and then more people jump on. There is already an HDR 'hack' available that someone wrote. I've got it somewhere on my hard drive, but I've not installed it yet.
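One common way to get HDR without floating point blending, as mentioned above, is to store HDR colour in an ordinary 8-bit RGBA texture with a shared exponent in the alpha channel (the "RGBE" trick, similar to the Radiance format). This Python sketch illustrates the general packing idea; it is not Valve's actual implementation, and the function names are made up for the example.

```python
import math

# Illustrative sketch of RGBE packing: HDR colour in an 8-bit RGBA
# texture, with a shared exponent in the fourth channel. Not Valve's
# actual implementation; just one way HDR can work without FP blending.

def rgbe_encode(r, g, b):
    """Pack an HDR RGB triple into four 8-bit values."""
    m = max(r, g, b)
    if m < 1e-32:
        return (0, 0, 0, 0)
    e = math.ceil(math.log2(m))          # shared exponent for all channels
    scale = 255.0 / (2.0 ** e)
    return (int(r * scale), int(g * scale), int(b * scale), e + 128)

def rgbe_decode(r8, g8, b8, e8):
    """Recover the (approximate) HDR triple from the packed bytes."""
    scale = (2.0 ** (e8 - 128)) / 255.0
    return (r8 * scale, g8 * scale, b8 * scale)

# A value far above 1.0 (e.g. the sun) survives the round trip:
packed = rgbe_encode(8.0, 4.0, 0.5)
print(rgbe_decode(*packed))
```

The trade-off is that you can't hardware-blend these packed values directly, so such schemes typically decode in the shader instead, which is why they run on hardware without FP blending.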
If it's a real HDR hack, please can you link me to it? But if it's the bloom thing I have: http://halflife2.filefront.com/file/HalfLife_2_R8_HDR_Bloom_MiniMod;38224 then don't bother linking it.
Yeah, I totally get you. They know HL2 will sell regardless, then they re-sell it a year later when they are due to release their first add-on. I suppose it also spread out a load of tech support calls and allowed more people to invest in more powerful hardware. Also, I thought the whole point of DX9 was that EVERYTHING was FP. So if ATI hardware can't do FP-something, then it isn't proper DX9?
No, you're getting mixed up. 24-bit floating point precision is a DirectX 9.0 specification, and 16-bit floating point blending is a part of DirectX 9.0c. However, 16-bit floating point precision, or partial precision, is not part of DirectX 9.0, which is why the GeForce FX is not a 'true' DirectX 9.0 part: it mainly uses partial precision due to speed limitations in the architecture.
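The practical difference between these formats comes down to mantissa width: FP16 (s10e5) keeps a 10-bit mantissa, ATI's FP24 (s16e7) a 16-bit mantissa, and FP32 (s23e8) a 23-bit mantissa. This Python sketch (the helper function is made up for illustration) rounds a value to each mantissa width to show how much detail each format keeps.

```python
# Illustrative sketch of FP16 vs FP24 vs FP32 precision: round a value
# to 10, 16, and 23 mantissa bits respectively and compare the results.
# The helper is a simplified model (no denormals, no exponent clamping).

def round_to_mantissa(x, bits):
    """Round x to a float with the given number of mantissa bits."""
    if x == 0.0:
        return 0.0
    sign = 1.0 if x > 0 else -1.0
    m, e = abs(x), 0
    while m >= 2.0:          # normalise m into [1, 2)
        m /= 2.0
        e += 1
    while m < 1.0:
        m *= 2.0
        e -= 1
    step = 2.0 ** -bits      # smallest mantissa increment
    m = round(m / step) * step
    return sign * m * (2.0 ** e)

value = 1.0 + 1.0 / 3.0      # an "awkward" intensity that can't be stored exactly
for name, bits in (("FP16", 10), ("FP24", 16), ("FP32", 23)):
    print(name, round_to_mantissa(value, bits))
```

FP16 lands noticeably further from the true value than FP24 or FP32, which is the kind of error the "partial precision" debate on the GeForce FX was about.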
Can somebody explain how they did the tech demos of HDR for Half-Life 2 before Pixel Shader 3.0 and DirectX 9.0c? I believe it was done on a 9700 Pro or a Ti 4600.
Well, back to Far Cry... is it abnormal if I notice a marked *decrease* in IQ (especially regarding water on beaches) when setting HDR to 7? I really have to test that out more... but I swear my framerate went up and the image quality went down. Maybe maxing out all settings (save AA/AF; good luck playing FC at 1600x1200 4xAA 8xAF smoothly on any computer) is the automatic equivalent of HDR 11?
Chances are that even if it were possible, the cards just wouldn't have the horsepower to do it at high resolutions anyway. My SLI 6800GTs (@ Ultra, 3000+ @ ~2550MHz) are really getting owned by SC:CT; enabling SM3.0 really does them in (I have no idea if that enables HDR as well, I haven't played around with the graphics settings too much). Even at 1600x1200 0xAA 0xAF (or maybe 2xAF, I forget) I'll still have some vaguely choppy moments. If they were *good*, they'd code it so you could have SLI cards use AFR to get an effect like that, alternating one frame with FSAA and the next with HDR. If you were running at a high enough framerate you might not notice it flickering between FSAA and HDR, and it would look like both were on.
Valve probably won't release their HDR level until ATI releases their next board, which will be able to support HDR.