Discussion in 'Article Discussion' started by bit-tech, 5 Sep 2019.
Yet they have still not removed the utterly pointless and actively counterproductive DisplayHDR 400, 500 and 600 ratings. Ratings that specify performance identical to an SDR panel, because that is all that displays conforming to those standards are: SDR displays that accept HDR inputs but cannot display them. No less of a lie than selling a 720x576 panel as "1080p" because it can accept a 1920x1080 input signal.
I understand an organization that puts out standards, but not one that puts out standards no one will use because it is in it for profiteering rather than making standards that every company can freely use. It seems counterproductive to create standards and then say "but you have to pay us to certify your equipment", because companies will just create 'open source' standards and ignore the pig, and everyone will use different standards, making standards completely useless to begin with.
It is a bit like Dolby coming up with more standards that are just blatant rip-offs of ideas that are already out there, renamed so they can profit off free and open source material without giving credit to the real creators, just to stuff more money into the Incorporated theft machine.
And I just wanted to add that this is the exact reason 3D TV sank like the Titanic. Everyone had their own standard and it was basically a battle of the patents rather than everyone following a good, freely usable standard for compatibility's sake.
On the flipside, a standard without any enforcement mechanism to ensure actual compliance is not worth the small square of cardboard the logo occupies on the box.
Everyone agreed on the same interconnect standards barring the handful of very early sets (the same problem as early HDTV before HDMI). Everyone had a different implementation of the final display side (sequential with shutter glasses, interlaced or chequerboarded polarisers, autostereoscopic filters, etc.), but if you have a 3D BD it will play on any 3D BD player on any stereoscopic TV with an HDMI 1.4 or higher input.
Stereoscopic TV didn't catch on because of technical limitations that made it uncompelling for home TVs. Correct IPD separation (on the order of 60mm) meant you lost ~120mm of stereo image width regardless of TV size (so a 42" TV lost 13% of its usable screen width), meaning you either had non-stereo images at the edges, a cropped image, or the wrong IPD (anything from poor stereo separation to outright headaches). You also did not have the guarantee of near-planar viewing that you do in a cinema, so apparent IPD would further shift as you sat off-axis, and the lack of a controlled viewing environment meant you would invariably end up with reflections on the screen that had similar Z-depth to the on-screen images but the opposite separation, introducing stereo fighting. Add to all this the tide of post-process stereo upconversions that flooded the market alongside a handful of films actually shot for stereo viewing, and there was little chance for any actual benefits to surface above the sea of crap.
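For anyone who wants to check the maths above, here is a quick sketch of the screen-width loss, assuming a 16:9 panel (the 60mm IPD figure and 42" example are from the post; the function name is just for illustration):

```python
import math

def screen_width_mm(diagonal_inches: float, aspect_w: int = 16, aspect_h: int = 9) -> float:
    """Width of a display from its diagonal, assuming the given aspect ratio."""
    diagonal_mm = diagonal_inches * 25.4
    # width / diagonal = aspect_w / sqrt(aspect_w^2 + aspect_h^2)
    return diagonal_mm * aspect_w / math.hypot(aspect_w, aspect_h)

ipd_mm = 60.0               # typical interpupillary distance
lost_width_mm = 2 * ipd_mm  # stereo overlap lost, one IPD at each edge

width = screen_width_mm(42)  # the 42" TV from the example
print(f'42" width: {width:.0f} mm, lost fraction: {lost_width_mm / width:.1%}')
```

A 42" 16:9 panel is about 930mm wide, so a fixed 120mm loss works out to roughly 13% of the usable width, matching the figure quoted, and the fraction only shrinks as the TV gets bigger.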
I think it is actually useful for consumers. That is like saying that Intel should remove the i3 and i5 labels and brand everything below i7 as Pentium.
Just like the Bronze and Silver ratings on a PSU, it is an easy way to compare two monitors.
No, it would be like Intel selling ARM CPUs under the i3 brand. The DisplayHDR 400-600 ratings are NOT HDR - in any way, shape, or form - in the same way 720x576 is not 1920x1080. They do not achieve a High Dynamic Range.
If they called it "DisplaySDR" or similar that would be a useful measure, but calling something "HDR" that isn't HDR is no good.
I was not aware that HDR had an agreed-upon definition. Anyone can make up their own standard. If you don't like the label, ignore it.
Labelling is beneficial for consumers that don't want to try to compare detailed reviews from different sites.
Those two statements are contradictory: if a label is false, then a consumer cannot use it to compare two items. For example, when HDTV sets started becoming available, there were plenty of SDTVs falsely labelled "HD" because they could process a 1080p input signal without displaying an error. They weren't showing an HD image in any way, shape, or form and you were not getting an HDTV, but if you went by the label you would think you were.
False labelling is not beneficial, and is instead actively harmful.
I thought they were labelled "HD ready"?
HD is not a fair comparison because it is an industry standard, and resolution isn't difficult to measure or explain to a consumer.
HD = 1920x1080 or 1080p
What is HDR? HDR10, HDR10+, Dolby Vision, HDR 1000?
HDR400, HDR600, HDR1000 and HDR1400 are much easier for consumers to understand.
I hate buying a TV because I don't know what I'm paying for when I go for a series 4-9 or whatever a brand wants to use to price-bracket. At least this gives some indication.
One day, maybe we'll get screen size & the resolution produced - along with panel type - as a linear progression... or is that too Utopian?
Honestly, picking a monitor is terrible ...
resolution, screen size, panel type, brightness, refresh, response, coating, contrast, colour accuracy, HDR certifications, backlighting, VESA support, stand adjustment, connectivity, freesync and G-sync.
Making things more complicated, it is difficult to compare TN, VA, IPS, Quantum Dot, OLED and plasma panels, and some TN panels can look really good as well.
I bought my dell because it was 27" 1440p IPS and a good local deal (used).
I couldn't find a site that compares all monitors and reviews are inconsistent across sites. I think it is by far the most complicated component choice in a build.
Yet there was plenty of mislabelling regardless.
Except when the panel actually displayed 1280x720. Or 720x576.
It is defined by a similar standards body to the one behind HDTV (which also encompasses more than just 1920x1080: you've got colourspace and other things to conform to as well).
In addition to things like resolution, colourspace, and bit-depth, the most important HDR requirement that many displays fail to deliver is the dynamic range itself (ironic for the moniker High Dynamic Range): either a peak brightness of over 1000 cd/m2 and a black level less than 0.05 cd/m2 (contrast ratio > 20,000:1) for FALD backlight or multilayer LCDs, or a peak brightness of over 540 cd/m2 and a black level less than 0.0005 cd/m2 (contrast ratio > 1,080,000:1) for OLED or other direct emissive displays.
A standard LCD panel without a FALD backlight (which is what will inevitably be in an HDR400-600 monitor) will achieve 1000:1 for IPS to 3000:1 (maybe) for VA, or even less if someone is brazen enough to sell a bog-standard TN panel as 'HDR'. With a dynamic range that could be between 15% and 0.1% of a true HDR display's, these quite clearly are not HDR.
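To put numbers on that, a quick sketch working through the contrast ratios above (the peak-brightness and black-level thresholds are the ones quoted in the post; the typical IPS/VA figures are the rough ones mentioned):

```python
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Static contrast ratio: peak white over black level (both in cd/m2)."""
    return peak_nits / black_nits

# HDR thresholds quoted above
fald_hdr = contrast_ratio(1000, 0.05)    # FALD / multilayer LCD target
oled_hdr = contrast_ratio(540, 0.0005)   # OLED / direct-emissive target

# Typical non-FALD LCD panels
ips = 1000.0   # ~1000:1
va = 3000.0    # ~3000:1 at best

print(f"FALD HDR target: {fald_hdr:,.0f}:1")        # 20,000:1
print(f"OLED HDR target: {oled_hdr:,.0f}:1")        # 1,080,000:1
print(f"VA vs FALD target: {va / fald_hdr:.0%}")    # 15%
print(f"IPS vs OLED target: {ips / oled_hdr:.2%}")  # 0.09%
```

That is where the "between 15% and 0.1%" range comes from: the best case (VA against the FALD target) manages 15% of the required contrast, and the worst case (IPS against the emissive target) manages roughly a thousandth of it.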
Doesn't matter if it's easy if it's wrong.
HD for TVs was always 'accepted' as 1080p... anything that could only do (effectively) 1080i was labelled "HD Ready" to differentiate.
Sadly it was the other way around: there were a million and one "HD" and "HDTV" logos flying around which were applied to any and every TV that could accept an HD signal regardless of actual panel resolution. "HD Ready" only mandated 1280x720 support (it didn't even require 1080p reception capability, only 1080i) and had no requirements to avoid overscan or scaling, or to avoid framerate downsampling (e.g. dropping 60Hz down to 30Hz), things we would today take for granted.
If you wanted to find a TV that was "actually HD" as we know it today (1920x1080, 1:1 pixel aspect ratio, 1:1 pixel display, refresh rate at the same frequency as source rate, etc) then at best you had the "HD Ready with 1080p in a tiny subtitle" logo and had to hope it had actually been applied correctly, and was not just the "HD Ready" logo that just happened to be sitting above a completely unrelated "1080" on the marketing material.