I've been very pleased with my Dell 2405 since I bought it almost 5 years ago, but one thing that always bugged me was fine banding on greyscale gradients... well, any gradient really. I just assumed this was the way it was, and forgot about it.

However, after doing some research I've come across something disturbing. I saw LOTS of forum posts regarding the new Dell U2410 also showing banding. Hmm, I thought, this can't be right. It turns out that when uncalibrated it's fine... but as soon as you calibrate it, you get banding. As it transpires, this is NOT a fault of the screen, but of the GPU. Most GPUs offer a 10-bit LUT for VGA output, but only an 8-bit LUT for DVI output! So the calibration corrections get squeezed down to 8 bits, and that produces banding.

I tested this. Used my 2405 over a VGA cable: NO banding whatsoever. Back to DVI and the banding is back. While on DVI I cleared all monitor profiles, went into the Nvidia control panel, and manually adjusted gamma. When gamma is centred, no banding... as soon as I adjust gamma, banding starts to appear.

So there you have it. All video cards are crap... unless they have a 10-bit or higher LUT for DVI output, which none have, apparently. I am wondering, though: what about HDMI? Basically, it would appear that most of these newer IPS monitors, while being great, will band badly over DVI... but it's not the screen's fault. Looks like I'll be going back to VGA. No way am I putting up with banding on a new monitor.

Can anyone tell me: does the GTX 295 have a 10-bit LUT for DVI output? Can't find this info anywhere. I've come to the conclusion that unless you have a high-end graphics card like a Quadro instead of a gaming card, there's no point getting an expensive monitor, because you just can't seem to get rid of this damned banding unless you have a genuine 10-bit LUT for digital output. My 2405 calibrated beautifully with a delta E of less than 0.5.
Until I can get hold of an HP LP2480zx for less than a grand, or at least a decent panel with a hardware LUT built in, I'll not bother upgrading. I'd like the wider gamut of the U2410, but as the banding will be just as bad... forget it.
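To make the mechanism concrete, here's a minimal Python sketch (my own illustration, not anything from a driver) of what happens when a gamma adjustment is baked into an 8-bit LUT: a 256-entry greyscale ramp goes in, but fewer than 256 distinct levels come out, so some adjacent greys collapse together and other levels get skipped — which is exactly what a visible band is.

```python
# Hypothetical sketch: a gamma tweak applied through an 8-bit LUT.
# Not real driver code -- just the arithmetic of the quantisation.

def gamma_lut(gamma, bits=8):
    """Build a gamma-adjustment LUT at the given bit depth."""
    top = 2 ** bits - 1
    return [round(((i / top) ** (1.0 / gamma)) * top) for i in range(top + 1)]

identity = gamma_lut(1.0)   # gamma slider centred: every level survives
adjusted = gamma_lut(1.2)   # slider nudged: adjacent entries start to collide

print(len(set(identity)))   # 256
print(len(set(adjusted)))   # fewer than 256 -> visible bands in a gradient
```

The identity ramp keeps all 256 greys distinct, which is why the gradient looks smooth with the gamma slider centred; any adjustment at all makes the rounded table merge some neighbours and skip other output codes.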
I am not sure I understand. If you're complaining that the U2410 (and other high-class monitors) doesn't show certain colours as smooth but rather shows a type of dithering, that's because your monitor is set to the "Adobe RGB" or "sRGB" colour profile. When set to either, the monitor will not try to produce a colour it can't do, or one that falls outside the supported colour space. Remember that Adobe RGB and sRGB are 12-bit colour profiles, if I recall correctly... and the monitor has an 8-bit panel. If you want that effect gone, you'd better spend over $2000 for a 22-inch where the panel is 12-bit.

The fix to your problem is to not pick the Adobe RGB or sRGB colour profile. What I did in my case is calibrate the Custom profile to match Adobe RGB. This way I get the same great colours (not a 100% match, but close — I did it by eye, switching between profiles continuously until I got a near-perfect match). When I need colour accuracy, I use Adobe RGB or sRGB depending on the purpose.

Only high-end Quadros (and the equivalents from ATI) offer more than 8-bit-per-channel colour output.
- HDMI is DVI with digital audio passing through.
- DisplayPort is the same as HDMI without royalty fees, and with a better plug.

The reason the effect you see disappears with VGA is that the monitor goes (it's designed this way): "Oh, you're using VGA... therefore you don't care one bit about colour reproduction, as you convert digital to analogue and then I'll have to convert it back to digital. Also, converting the signal adds input lag... and now you want colour accuracy, that's even more input lag, and this is not a TN panel, so even more input lag... you know what, screw colour accuracy so that you don't have this crazy input lag."

NOW, if you really mean banding, like you see steps in your gradients, then that is a GPU driver bug or Windows.
1. Make sure you have at least Vista on your system.
2. Make sure you don't have a low-end GPU.
3. Install the latest drivers for your GPU (make sure you UNINSTALL the old drivers, restart your computer, and use CCleaner to clean the registry (to remove any leftover configuration from the previous driver) FIRST, and THEN install the latest drivers).

I don't have that banding problem on any of my computers, including my laptop, which has a Quadro NVS 160M.
Exactly. I am not using my eyes... I'm using a colorimeter, and when you profile a monitor that does NOT have a hardware LUT you have problems, because the monitor has a 12-bit software LUT... but the graphics card only has an 8-bit LUT in hardware. Result? Banding. There is NO way around this. The Dell monitor, like many others, including mine, is absolutely fine when NOT calibrated, but when you DO calibrate... banding. It's nothing to do with sRGB or Adobe RGB; it's because you're using the GPU's 8-bit LUT.

(sigh)... Win7 64-bit, GTX 295, latest drivers. The problem is that the colour information is being handled by the 8-bit LUT of the GPU. The only way to avoid banding is to use a monitor with a hardware LUT for calibration. No, you don't have the banding... because you have not calibrated it. Despite its narrower gamut, I think I'll go for the NEC monitor. See this thread.

Also, try this. Set this image as your wallpaper (this is how I noticed the problem). Notice the light area to the left of the car? It should be smooth. Go into your Nvidia control panel > Display > Adjust desktop color settings, click "Use Nvidia settings", and adjust gamma manually. Notice the banding starting to appear? That's because you're using the GPU's 8-bit LUT to send information to the monitor's 12-bit software LUT... result: banding. The same thing happens when you calibrate a monitor: you're using the GPU's 8-bit LUT to modify the gamma, amongst other settings. You have no banding because you're not doing that. If I disable my custom ICC profile I get no banding either... but then my colour and contrast are crap.

The only way to avoid this is to use a card with a true 10-bit LUT, like a high-end Quadro or Parhelia card... or use a monitor with a dedicated 10- or 12-bit hardware LUT... or use the VGA output.

[edit] I'm not slagging off the monitor... I'm bitching about why high-end desktop GPUs only have an 8-bit LUT for digital output!
Because of this, I'll have to compromise on gamut and get the NEC screen, or up my budget to at least £1800 for the HP LP2480zx.
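For what it's worth, the same arithmetic shows why a genuine 10-bit (or higher) LUT on the digital output would fix this: writing the correction into more output levels than input levels means no two input greys ever land on the same output code. A rough Python comparison (again just my own illustration, not vendor code):

```python
# Hypothetical sketch: the same gamma correction, written into LUTs of
# different output precision.

def gamma_lut(gamma, in_bits=8, out_bits=8):
    """Gamma-correction LUT: in_bits input levels -> out_bits output levels."""
    in_top = 2 ** in_bits - 1
    out_top = 2 ** out_bits - 1
    return [round(((i / in_top) ** (1.0 / gamma)) * out_top)
            for i in range(in_top + 1)]

# 8-bit in -> 8-bit out: levels collapse and gaps open up (banding)
print(len(set(gamma_lut(1.2, out_bits=8))))    # fewer than 256
# 8-bit in -> 10-bit out: every one of the 256 greys stays distinct
print(len(set(gamma_lut(1.2, out_bits=10))))   # 256
```

A monitor with its own 12-bit hardware LUT achieves the same thing from the other end: the GPU sends an untouched 8-bit ramp, and the correction is applied at higher precision inside the monitor, so no grey levels are lost on the wire.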
In your case, you got the wrong stuff... you need a high-end Quadro and a high-end LCD monitor, as you are a professional. The Dell/HP IPS/PVA panels are high-end consumer monitors... you know, the ones for people who enjoy colours but don't need critical colour accuracy, like myself.
No I didn't. I also use this computer for everything else, including watching HD video, editing, and gaming. What use would a Quadro card be to me? Just because I'm a photographer doesn't mean I have to use this machine ONLY for editing images. I don't regard an IPS panel with 96% Adobe RGB coverage as a consumer panel at all! Why would a consumer need 96% of Adobe RGB?

We've had this discussion before: unless it's calibrated, using a wide-gamut monitor will result in inaccurate colours. I do think it's extremely odd that there are these H-IPS panels with massive colour gamuts and NO hardware LUT, though... I mean, thinking about it, they are only really for bragging rights and e-peen... after all, as I've just discovered, they're actually useless for high-end work. If you don't do high-end work, why do you need 96% of Adobe RGB? It's a beautiful screen... I just don't get who it's aimed at. It shouldn't really be you, as someone with no interest in colour accuracy shouldn't be using a wide-gamut display... and it's not me, because it doesn't have a LUT built in. Who the hell is it for? My issue here is the lack of a 10-bit LUT on 3D graphics cards. Here's me thinking the posterisation of gradients was just a shortcoming of my 5-year-old panel, when after all it's not.

Anyway, I've decided. NEC LCD2690WUXI for me. The only screen with a decent gamut, a hardware 12-bit LUT, and a decent price... or just bite the bullet, get the U2410, and accept that I'll perhaps get banding... dunno. In reality, few photographs display banding, as it's only an issue on absolute, mathematically linear gradients, and as I don't do CAD work or use Illustrator for anything, that's never going to happen in a photograph. Food for thought.