Discussion in 'Article Discussion' started by Tim S, 27 Feb 2007.
I think you're looking for "whiter"
Rule #1: don't proof news stories after 11pm Timmy!
Definitely filed under smart business strategy.
This will also be a bonus for consumers, as Dolby has a history of successfully spreading technologies and of rapidly bringing their prices down over short periods.
I wonder if it's related to the 3D Cinema stuff Dolby announced AGES ago: http://blog.wired.com/gadgets/2006/08/dolby_goes_3d.html?entry_id=1531278
BrightSide do have High Dynamic Range projector technology too...
Bluephoenix makes me happy. I certainly hope that Dolby successfully gets this tech on the market fast and cheap.
I only thought about this a few days ago and wondered when this technology was going to hit the mainstream. This is a smart move by Dolby and I look forward to seeing what they can achieve with this acquisition. 28 million is a small price to pay for technology with this much potential.
I was wondering what the hold-up was; it would be nice to see this stuff in person.
I'd say it isn't, at least until we get theater-quality sound from cell phones!
Sweet. When will we see the front page article on this?
Good on Dolby. All I can say is "bring it on".
There's a whitepaper on how it works, but we haven't seen a demo yet. I believe the results aren't quite as good as the LCD TV (how could they be?), BUT it still promises to be light-years ahead of current projectors, which are notorious for rubbish black-level performance.
All I can say is that I want one.
How long will it take for licensing and production?
Will we see these out by the Holiday season?
I would gladly exchange a major extremity for a true Hi-Def home theater with 7.1 surround sound and HDR projector.
You namedropper you
Ooooh, can't wait for my 1080p HDR set. Hopefully within 12 months now
I think within 12 months is perfectly reasonable, though at what price point, I daren't guess. 1080p is already a reality, albeit a fairly pricey one. BrightSide's HDR IMLED technology has nothing to do with the LCD panel itself, so it can not only work with any existing 720p / 1080p panel, but might almost be considered a "drop-in" solution.
Of course, there are the mechanics (and economics!) of the backlight itself, but with Samsung already demoing what we believe to be the same tech, time to market and price point are really just a question of marketing strategy.
They're definitely going to appear at the high end first and then trickle down across the range. Eventually CCFL backlights will be a thing of the past and we might then see static LED backlight models and IMLED models....
Samsung certainly aren't going to release a display that requires water cooling for its backlight. That certainly can't be helping the release date!
I'd like a Samsung monitor with IMLED, WQUXGA, DisplayPort, HDMI, VGA, Component, and a high colour gamut.
The latest generation doesn't require water cooling. In the past 18 months there have been a lot of improvements in the efficiency of the super-bright white LEDs used; the latest generation is twice as bright at half the power, that kind of thing.
By the time this stuff goes mainstream, it shouldn't be any harder to cool than, say, a regular plasma screen (which often has active air cooling).
Me too. And I'm not paying over £100.
Joking aside, white LEDs help broaden the colour gamut, so that is pretty much a given.
Why would you want VGA on that next-next-gen monitor? That's like wanting a PC that's compatible with a 9600 baud modem, isn't it? I'd just have a bank of HDMI and DisplayPort connectors, and maybe Component for XBox 360.
WQUXGA - had to Wikipedia that one. 3840 x 2400! I guess you'll be wanting quad cryogenically cooled GeForce 9900 GTX cards to power it, then.

Incidentally, I think you'll need more bandwidth than HDMI or DisplayPort can afford - the highest HDMI spec (1.3) only stretches to 10.2Gbps, and DisplayPort offers four 2.7Gbps channels (10.8Gbps total). At a resolution of 3840 x 2400 x 24 bpp, that only gives you enough for 46fps, not even allowing for audio bandwidth and overheads. Not good. Dual-link HDMI, anyone? Add to that the fact you'll want more colour depth for HDR (say 36 bpp, maybe 48), and you're reducing that fps still further.

I'd guess you're also pushing the limits of component and VGA, though I don't really understand analogue video well enough to say for sure.
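The bandwidth arithmetic above can be sanity-checked with a quick sketch. The figures are the ones quoted in the post (HDMI 1.3 at 10.2Gbps, DisplayPort at 4 x 2.7Gbps, a 3840 x 2400 WQUXGA frame); the function name and structure are just for illustration.

```python
# Back-of-the-envelope check of the link-bandwidth maths above.
# Ignores audio bandwidth, blanking intervals, and protocol overhead,
# so real-world figures would be lower still.

def max_fps(link_gbps, width, height, bpp):
    """Max uncompressed frame rate a link of the given bandwidth can carry."""
    bits_per_frame = width * height * bpp
    return link_gbps * 1e9 / bits_per_frame

for name, gbps in [("HDMI 1.3", 10.2), ("DisplayPort", 4 * 2.7)]:
    for bpp in (24, 36, 48):
        print(f"{name} at {bpp} bpp: {max_fps(gbps, 3840, 2400, bpp):.1f} fps")
```

At 24 bpp this lands right around the 46fps quoted for HDMI 1.3, and pushing the colour depth to 36 or 48 bpp drops it well below anything usable.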
Maybe it will be out in time to play Duke Nukem Forever.
Sweet. 1080p is getting quite reasonable now - you can get a 40" 1080p Sony 40W2000 for not much over £1200, which is quite amazing. I don't think I'll bother with an LCD until I can get one in that range. The whole idea of 1366 x 768 confuses me - surely that means that all HD content (720p / 1080i / 1080p) has to be scaled to get a full frame image? Don't understand the point. Why is that, and not 1280 x 720, the standard res for a 720p LCD?
In technical terms related to the actual panel, true, though of course there are complications with the display driver circuitry, because there's an interrelationship between what the panel does and what the backlight does, to get an even, consistent image across the screen. I guess it needs more horsepower to do this at 1080p than (say) 720p. That said, given all the signal processing / image enhancement features modern LCDs already sport, I doubt it will be too taxing.
I've been wondering the same thing for years. Interestingly, I can remember when there were lots of 1280x720 TVs. Then, suddenly, there were a few 1280x768 TVs -- I assumed the 16:10 ratio happened to be friendlier for PC inputs, given that "768" and "1280" were already parts of standard computer resolutions (1024x768 and 1280x1024). Then, soon after that, they jumped up to 1366x768 -- I kind of assume that was to get rid of the "black bars" that a 16:10 TV uses to display a 16:9 image, because 1366x768 is back to a 16:9 ratio.
Yeah... I've never really had a chance to compare image quality on a "768p" set vs a 720p set, but I can't imagine the scaling is very accurate, given the strange ratio. 720p and 1080i/p scale between each other with a factor of 1.5, which shouldn't be too bad... but the ratio between 768 and 720 is 1.0666..., which seems like it's going to show more artifacts.
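The ratio argument above is easy to make concrete: write each scaling factor as a reduced fraction, and the denominator tells you how many source rows it takes before the interpolation pattern repeats. This is just a sketch of that reasoning, not anything a real scaler chip necessarily does.

```python
# Compare the scaling ratios discussed above as reduced fractions.
from fractions import Fraction

print(Fraction(1080, 720))  # 3/2  -> every 2 source rows map cleanly to 3 panel rows
print(Fraction(768, 720))   # 16/15 -> pattern only repeats every 15 source rows,
                            #          so far more rows need blended interpolation
```

A 3/2 ratio means half the output rows can be taken straight from the source; with 16/15, almost every output row is a blend of two source rows, which is where the extra softness and artifacts come from.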
I'm kind of thinking nowadays that perhaps "Average Joe" thinks more pixels is always better... so a 1366x768 set will always sell better than a 1280x720 set sitting next to it. So manufacturers can't just go back to the format that makes sense without risking sales.