
Bits DisplayPort: A Look Inside

Discussion in 'Article Discussion' started by Tim S, 22 Oct 2007.

  1. GoodBytes

    GoodBytes How many wifi's does it have?

    Joined:
    20 Jan 2007
    Posts:
    12,300
    Likes Received:
    709
    Still 60Hz... why not 85Hz or 150Hz like analogue? My CRT works fine with those! (I love you, CRT!)
    It's just a slightly improved connector over DVI... I am totally against it; it will just create complications for the average user...
    They will buy an LCD from the store, go to plug it in, and poof, no slot!

    Sure, the salesman can inform him or her, but they will go "No DisplayPort, give me the normal stuff". So then it won't be adopted.
     
  2. Anakha

    Anakha Member

    Joined:
    6 Sep 2002
    Posts:
    587
    Likes Received:
    7
    60Hz is because that's the refresh rate the cones in the eye see at (the colour-receiving ones). However, CRTs are (supposed to be) run at 72Hz+ because the rods (the black/white, peripheral-vision parts of the eye) see at the faster rate.

    LCDs, however, don't have a visible "refresh" (as they don't rely on persistence of vision like CRTs), so they don't need the higher refresh rate to comfort the rods.

    However, I don't think DisplayPort, or any other new display cabling format, will really take off until there's one cable that carries everything (including power) to give true plug-and-play convenience (as well as cutting down on clutter). Until then, there's little incentive for Joe Consumer to choose the new standard over the old ones that have worked for them 'til now.
     
  3. GoodBytes

    GoodBytes How many wifi's does it have?

    Joined:
    20 Jan 2007
    Posts:
    12,300
    Likes Received:
    709
    About the Hz - right, I know that, but I saw an LCD that allows you to set the screen to 75Hz (a choice of 60 or 75 in the Windows panel). This was at a store, and I did not look in more detail at the difference it offers. So I THOUGHT that it could MAYBE help reduce ghosting... I think there will always be ghosting as long as it refreshes at 1ms white-black-white.
    Some people say that right now you don't see anything (I don't know, as I don't have an LCD), but it reminded me of the time before LCDs... people were saying that a 75Hz refresh rate was more than enough to not see flickering... but it's SOOO NOT true. 85Hz is the point where you don't see a difference, and even then... 150Hz - NOW it looks like an LCD or any solid object on your desk. If you've still got a high-end CRT, try 800x600 @ 150Hz if you can... it's niiiiiiceee. It makes things BIG (of course), but it's nice.


    But then you will need a more powerful PSU, as it will be used to power up the monitor.
    Moreover, if you have a multi-function screen... then you won't be able to use it unless your computer is turned on. Maybe you and I leave our computers on 24/7... but the average person does not. But good thought.
     
    Last edited: 23 Oct 2007
  4. EQC

    EQC New Member

    Joined:
    23 Mar 2006
    Posts:
    220
    Likes Received:
    0
    I'm excited about DisplayPort... but I'm disappointed that it won't be able to beat DL-DVI's resolution in its first generation. That's just sad.

    Also: it sounds like DisplayPort is going to go through multiple versions too, right? Correct me if I'm wrong...but if I buy a first-generation DisplayPort graphics card, it'll only do about 2560x1600. But, the future generations in a year's time will be able to put out a higher resolution...and perhaps there will be a few generations before they max-out DisplayPort?

    Isn't this going to be very similar to DVI vs. DL-DVI then? In a few years, are we going to have graphics cards and monitors with ports that look basically the same (ie: DisplayPort 1.1 vs 2.0 vs 3.0), but the graphics card's transmitter just can't push enough pixels to drive the monitor at spec?

    If they've already got plans to build in new capabilities, I wish they'd just come to market with them in the first place instead of going through multiple versions. I'm kinda mad now that I can't take my "old" single-link DVI card and get a monitor at 1920x1200 or higher (I think mine's old enough that it only officially supports 1600x1200 at 60Hz). I don't game...so I don't want to spend money on a new GPU...all I really need is a new DVI transmitter on the video card.

    HDMI is going the same way, too -- it's only been on the market for a couple years, really, but it's gone through a few upgrades in capabilities already.

    I can see needing to upgrade the spec every 5-10 years when some new/unexpected/huge-leap feature or capability is needed...but planning to upgrade something as simple as a video connector specification every 18 months implies one of two things:

    1) a planned, money-grubbing upgrade cycle
    or
    2) Manufacturers think that everybody's a gamer...and if the GPU can't push a game at a high resolution, then there's no reason for the card to support it. I wish manufacturers would figure out that some of us don't actually game...but we still dream about big monitors. I don't care if I can't run a game at a huge resolution, but I'd like my video card to be able to run windows at the resolution of top-end monitors in a few years.

    Personally, I'd gladly pay an extra $50 for a graphics card just for the assurance that it'd be able to handle running Windows on a top-of-the-line monitor in 5 years...and I'm a "$50 graphics card" kind of guy, so that means I'm willing to pay double for some future proofing.

    As it stands now, if I want to upgrade to a monitor larger than 1600x1200, I can get dual-link DVI now... but that's basically obsolete with DisplayPort around the corner. So, I can wait a few months for DisplayPort... but it'll probably cost a premium at first, and its first generation will be obsolete and limit my monitor-buying options in another year. Just based on the principle of this situation, I'm willing to hang on to my little LCD until the potential for a real major upgrade is truly there... maybe it'll be worth it for me when DisplayPort offers me 3840x2400 @ 60Hz with good enough color to watch some upscaled HD movies... but I'm still wondering why manufacturers can't/won't deliver that capability right away.
     
  5. Aankhen

    Aankhen New Member

    Joined:
    15 Oct 2005
    Posts:
    406
    Likes Received:
    0
    Needs moar UDI. :(
     
  6. [USRF]Obiwan

    [USRF]Obiwan New Member

    Joined:
    9 Apr 2003
    Posts:
    1,721
    Likes Received:
    5
    Why on earth do they want to display such high resolutions? I can barely see the tiny desktop fonts on my 20" LCD @ 1650 native. Better to make displays that increase in size when increasing the res. A friend has a brand-new 17" laptop with a native 1920x1200 res, and it's impossible to do anything at that res, so it's running at 1280 so he can actually read what he's typing in his Word document without sticking his nose to the screen.
     
  7. Xir

    Xir Well-Known Member

    Joined:
    26 Apr 2006
    Posts:
    5,251
    Likes Received:
    88
    Hmmm just a quick question.

    On my LCD, I can't see my BIOS when using a DVI connection; I've got to use a DVI-to-VGA converter, then use the VGA port on the monitor. *don't know why, but it sucks*
    Is this very abnormal? If not, would DP solve this... or would I have to keep an old monitor around just in case? ;-)
     
  8. Veles

    Veles DUR HUR

    Joined:
    18 Nov 2005
    Posts:
    6,188
    Likes Received:
    34
    Didn't you read the article? That's exactly what it's not. I don't really see any logical reason why people are against displayport.
     
  9. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    It's all about change and people fear change... there's obviously more to it than that though. ;)

    I thought about bringing UDI into it, but since Intel's heavily behind DisplayPort, it didn't warrant bringing it up.
     
  10. will.

    will. A motorbike of jealousy!

    Joined:
    2 Mar 2005
    Posts:
    4,461
    Likes Received:
    20
    I like it. VGA and DVI cables are a pain to plug in and remove because of those stupid, always-hard-to-get-to thumbscrews. No one's hands are that small, dammit!
     
  11. fluppeteer

    fluppeteer New Member

    Joined:
    23 Oct 2007
    Posts:
    8
    Likes Received:
    0
    Ooh - one of my favourite topics.

    To dispel a few myths:

    DisplayPort does not, necessarily, increase the bandwidth significantly over existing formats. The maximum bandwidth supported over DisplayPort (there are two options, each running 1-4 lanes), 360MHz, is slightly higher than the alleged limit of dual-link DVI (330MHz), and of the type A HDMI 1.3 (340MHz). A type B (dual-link) HDMI 1.3 connector can provide 680MHz of pixel data, although admittedly these are rare in the wild.

    There is no bandwidth limit specified in DVI. The only mention in the specification is that the switch from single link to dual link has to happen at 165MHz. Many cheap single-link devices can't reach a compliant 165MHz signal (e.g. nVidia's internal TMDS transmitter on the 6800 series). Because most dual-link devices have two symmetrical TMDS transmitters, it's assumed that they'll reach at least 165MHz each, and therefore the limit will be 330MHz; this is the specified limit of Silicon Image's most commonly-used dual-link DVI chips. It's completely legal to run dual-link DVI at 400MHz, and many single link devices could reach over 180MHz (given a good device on either end).

    DVI is, indeed, limited either to 24bpp at dual-link pixel rates, or 48bpp at single-link rates (although under these circumstances "single link" doesn't necessarily mean <165MHz). HDMI can run at 30, 36 and 48-bit (at least in YUV). DisplayPort makes better use of the available bandwidth, but not significantly so.

    3840x2400 monitors are not "coming soon". They've been around since 2001 (as the IBM T221, ViewSonic 2290b, and others). I have one, and it runs off a dual link + a single link DVI connector (or two dual links, or four single links). DisplayPort does not have sufficient bandwidth to drive one at full (48Hz) frame rate. Incidentally, somewhere around 35Hz, motion starts appearing smooth - cinema screens refresh at 24Hz (with a double exposure), and few people complain that there's judder. CRTs need a higher refresh rate to avoid flicker, not to make motion smooth (and a high refresh rate allows smoother matching up with arbitrary game frame rates). LCDs need to avoid ghosting, but don't really need ultra-high frame rates to do it; overdrive and black-frame insertion try to get around this problem.
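    To put rough numbers on that (a back-of-envelope sketch of my own - the 12% blanking overhead is an assumption in the spirit of CVT reduced blanking, not a spec figure):

```python
# Approximate pixel-clock demand of a T221-class 3840x2400 panel,
# versus the commonly cited 330MHz dual-link DVI ceiling.

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.12):
    # Blanking overhead is assumed; real timings vary by monitor and standard.
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

needed = pixel_clock_mhz(3840, 2400, 48)
print(f"3840x2400 @ 48Hz wants roughly {needed:.0f} MHz of pixel clock")
print("dual-link DVI's usual ceiling: 330 MHz -> multiple links required")
```

    Even at the panel's modest 48Hz, the demand lands well beyond what one dual link can carry, which is why it takes the dual-link-plus-single-link combination described above.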

    1920x1200 monitors have been available for years. The average screen size is increasing, but the average dot pitch is increasing with it - 20" 1600x1200 panels used to be common, but now only a few people (Lenovo, Eizo) make 22" UXGA screens, with most in 24" and some up to 27". Given how long 2048x1536 CRTs were available (and the occasional LCD, too), the push above WUXGA seems to have limited consumer support.

    DisplayPort is not "compatible with DVI, with an adaptor" (at least, a non-trivial one). There's provision in the spec for allowing the DVI signal to be sent over a DisplayPort cable. This saves you the cost of the connector, at the cost of needing an adaptor instead - you still need all the electronics for both formats. In contrast, HDMI is a superset of DVI, and the dongles are purely mechanical.

    Upscaling video content to a higher resolution will not make the display look noticeably better - if the information isn't there, it's very hard to reconstruct it without glitches. Ideally, what's wanted is a high-resolution Gaussian blur to remove the objectionable pixel structure. A CRT, in fact. Displays such as the CMO 3840x2160 56" panel, sold by Westinghouse, Barco, et al. do have the advantage that they can use integer scaling to display both 1080p and 720p. No consumer distribution format is likely to make use of this resolution in the near future, although Sony et al. do have 4096x2160 cinema projectors. >24bpp at 1920x1080p (as provided by either dual-link DVI or HDMI 1.3) is likely to suffice for a long time.

    >24bpp is of limited use to computers. I'll admit that I don't know whether Aero under Vista can be persuaded to run a 30bpp display, but almost every desktop on the planet is only 24bpp. Yes, there are advantages to calibrating LUTs to choose *which* 24bpp you have, but high-end monitors have internal calibration tables, and graphics cards since AVIVO have been able to dither a 30-bit output down to 24bpp. Photoshop, the most obvious target for getting the colours right, dithers anyway, and is certainly a 24bpp app; a CMS application with a decent profile is relatively immune to poor calibration. If the display runs at less than full bit depth internally, it's likely that the various dithering schemes will clash - you're better off being able to drive the display at 1:1 (one reason why I hate the way some calibrators want to change the LUT on an LCD, even if you're only after a profile).
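    For anyone wondering what "dither a 30-bit output down to 24bpp" means in practice, here's a toy per-channel illustration (my own sketch, not AVIVO's actual algorithm):

```python
import random

random.seed(0)  # deterministic for the demo

def dither_10_to_8(value10):
    # Each 8-bit step spans four 10-bit steps; carry the remainder as a
    # probability of rounding up. This is naive random dithering.
    base, frac = divmod(value10, 4)
    return min(base + (1 if random.random() < frac / 4 else 0), 255)

# A 10-bit value of 513 sits a quarter of the way between 8-bit levels
# 128 and 129; averaged over many frames, the output approximates 128.25.
samples = [dither_10_to_8(513) for _ in range(10000)]
print(sum(samples) / len(samples))
```

    The eye integrates over successive frames, so the dithered output reads as an intermediate shade - that's the trick that lets a 24bpp link approximate 30-bit gradients.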

    Obiwan: high resolution can mean lots of small text, or very smooth large text. It does mean the rectangular pixels are less visible. Some of us like our resolution. My mobile phone (Toshiba G900) has >300ppi, and is very legible. Nobody's forcing you to switch to higher resolution, but if the market is full of 1680x1050 panels, those of us who want enough real estate to see what we're doing are suffering. I have five monitors on my desk, and I could still do with more space.

    DisplayPort is a nice standard, and getting more bandwidth from fewer wires is a good thing, but any nominal cost benefit is going to be outweighed by consumer lack of confidence and the need to support multiple formats on every device (it'd be suicide to remove support for HDMI for a long time). It adds little, if anything, over a type A HDMI 1.3 connector, and less over a dual-link DVI-I connector (which at least contains VGA support and a robust connector). I'd have nothing against it if it had appeared first, but whether it's worth the damage to the display industry (already in the consumer dog house because of the multiple HD resolutions, refresh rates, disc formats, and the HDCP debacle) is highly doubtful - it just doesn't offer enough to compensate for the confusion, IMHO.

    DisplayPort seems to be there because VESA needed to produce something, and Dell felt the need to be a leader in some form of technology. It'll probably happen, but not for the right reasons. I'm kind of hoping that everyone will ignore it until it goes away.

    There's more on the topic at VeritasEtVisus.com, if anyone's interested. (Disclaimer: commercial newsletter; I contribute, but I'm not paid to.) I have no vested interest in the display industry, either for or against DisplayPort - I just don't want to be paying for an unnecessary standard, as a consumer.
     
  12. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    Hello fluppeteer, thanks for signing up to comment.

    I don't believe the article claims that DisplayPort offers a significant amount of bandwidth over DVI, or that it can drive displays at resolutions higher than 2560x1600 - that's the maximum 16:10 resolution supported by DisplayPort v1.1. Gen 2 is where VESA expects to move to 3840x2400 @ 60Hz with 24bpp.
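    For what it's worth, the v1.1 figures do work out as a back-of-envelope check (lane count and rate are from public DisplayPort 1.1 summaries; the 5% blanking overhead is my assumption, since DisplayPort's packetised link keeps blanking cheap):

```python
# DisplayPort 1.1 link budget: 4 lanes x 2.7 Gbit/s with 8b/10b coding.
LANES = 4
LANE_RATE_GBPS = 2.7
CODING_EFFICIENCY = 0.8  # 8b/10b: 8 payload bits per 10 line bits

payload_gbps = LANES * LANE_RATE_GBPS * CODING_EFFICIENCY  # 8.64 Gbit/s

def stream_gbps(width, height, refresh_hz, bpp=24, blanking=0.05):
    # 5% blanking overhead is an assumption for a packetised link.
    return width * height * refresh_hz * bpp * (1 + blanking) / 1e9

print(f"link payload: {payload_gbps:.2f} Gbit/s")
print(f"2560x1600 @ 60Hz, 24bpp: {stream_gbps(2560, 1600, 60):.2f} Gbit/s")
print(f"3840x2400 @ 60Hz, 24bpp: {stream_gbps(3840, 2400, 60):.2f} Gbit/s")
```

    2560x1600 fits with headroom; 3840x2400 @ 60Hz clearly doesn't, hence the Gen 2 bandwidth bump.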

    On that point, you also mentioned 3840x2400 displays - you're right that they have been readily available for a long time and that they can be driven via three DVI links (DL+SL DVI)... but what you didn't mention is that the displays you've listed are all aimed at professionals and are not designed to be affordable for the gamers who bought one of the "affordable" 30" / 2560x1600 displays released in the past couple of years (obviously, that's a very small market).

    Bit-tech's readers are predominantly enthusiasts and gamers and it's from that perspective that I looked at DisplayPort and current display technology. Obviously, current 3840x2400 displays are not suitable for gamers; instead, they're designed for medical imaging, photoshop work and other tasks that would benefit from not only high resolutions, but also great image quality - response times aren't important (and no, I don't believe in the millisecond myth - below about 16ms on a good monitor, I cannot detect ghosting or 'input lag').

    Windows is inherently limited to 24bpp for the time being, and will be until Microsoft either releases a patch or releases a(nother) new operating system. In some ways I was surprised that Vista didn't offer higher colour depths, but then there's no point without support from the other side. All of the Radeons since R520 have been able to output 30bpp over DVI and as you say can dither to 24bpp, but we're still waiting for industry support from the other side.

    Between bit-tech and TrustedReviews, we've had our fair share of HDMI-equipped televisions and consumer electronics devices going through our offices and we have certainly had problems with HDMI connectors falling out of devices as soon as you move them. DisplayPort doesn't differ a great deal from HDMI 1.3 at the higher level (obviously deep down there are quite stark differences), but there are known royalty fees associated with it and on that point, I'm interested to know why you believe that DisplayPort isn't royalty free (I saw it was mentioned in one of your newsletters).

    I cannot see HDMI disappearing and given its prevalence in the CE market, it would be suicide as you rightly say. However, it was never designed with PC displays in mind - it was designed for HDTVs and the convergence happens when you mix PCs with the home theatre. It's a moving target at the moment and there's a lot of murky water... I think we're going to have to wait and see how good or bad DisplayPort's backwards compatibility is, because I think that, if anything, is going to be its major stumbling block in the industry.
     
  13. iwod

    iwod New Member

    Joined:
    22 Jul 2007
    Posts:
    86
    Likes Received:
    0
    Wasn't one of the main advantages of DisplayPort the optional optical cabling, which in theory allows much longer cable runs as well as much higher bandwidth?
     
  14. walle

    walle Well-Known Member

    Joined:
    5 Jul 2006
    Posts:
    1,640
    Likes Received:
    44
    Seriously now...I'm scared ;)
     
  15. fluppeteer

    fluppeteer New Member

    Joined:
    23 Oct 2007
    Posts:
    8
    Likes Received:
    0
    Hi Tim,

    Sorry to put words into your mouth about bandwidth; there's been a lot of FUD about DisplayPort's performance in the past (mostly before HDMI 1.3 and the days of common dual-link DVI), so I may be seeing it where it doesn't exist! However, without the ability to send more data (resolution, bit depth) to the monitor, DisplayPort's advantages are negligible. The latching connector is better than HDMI's, although probably not so nice as the OpenLDI connector on the 1600SW; I'm concerned over its physical strength and cat-proofness, and I don't buy that the size of the connector makes much difference to a 42" HDTV. DisplayPort possibly has a better DRM mechanism, but has to support HDCP as well anyway for compatibility with AACS. The claim that DVI requires internal conversion within a flat panel is fallacious, so the benefits over LVDS are questionable. That any connector can have 1-4 lanes and one of two speeds (although I believe the cables are all four-lane) slightly scuppers the universal compatibility claim (although we'll have to see real devices), and talk of sending data over fibre optics also applies to DVI (see Gefen), along with Firewire, PCI-e, etc. It's not *worse* than HDMI, but any new standard should be worth the cost of implementing it, and I've not seen evidence of that yet.

    I'm glad to hear that a second generation of DisplayPort will have a bandwidth boost to give it some benefit on high resolution displays. I'm concerned that this will give a second bout of incompatibility and confusion, though - I'd be happier if, rather than going to DisplayPort while there's little benefit to doing so, there was a push to get the next generation, with some real improvements, out as soon as possible.

    The 48Hz T221s (3840x2400 screens) are also used for film editing, so moderately decent refresh is vaguely important - hence the jump to film-friendly 48Hz over the original devices' 41Hz. I know nVidia have mentioned them in a gaming context; maybe such devices are going to be appearing again soon, and I have my fingers crossed, although I prefer the resolution at the current 22" size rather than as a 30" desk-hog. A nine megapixel monitor puts serious strain on the graphics card, though, and requires correspondingly greater effort in the game design if you don't just want to see perfectly sharp triangle edges and smeared textures - I doubt we're at the stage where it's worth game developers aiming so high, yet, even with procedural content. I'll believe it when 30" screens are more common.

    Re. Windows and colour depth - Direct3D, of course, has supported 30-bit content for a long time. Since Aero should be rendered entirely under Direct3D, in theory it ought to be possible to persuade it to run in 30-bit mode, although how one would present it with a 30-bit window is another matter. Here's hoping it'll get there eventually - it might matter for video playback.

    As far as royalties are concerned - I'll have to paraphrase a bit here (I'm just a contributor, and it's really the editor of the newsletters who has the beef with DisplayPort), but my understanding is that no guarantee has been made that the DisplayPort standards group won't charge for any proprietary technology. More importantly, DisplayPort's royalty situation assumes that HDCP isn't being used, and for so long as HDCP seems to be a requirement for both HD-DVD and Blu-ray (even though it's trivially crackable and just stops home users from being able to, e.g., botch their own image correction software into a HTPC - but that's another rant), I would expect most systems to have HDCP. HDMI is subsidised if combined with an HDCP licence anyway; the licence cost per unit is negligible compared even with the physical cost of the connector.

    I'd like to see monitor manufacturers and graphics card manufacturers putting pairs of HDMI 1.3-grade (340MHz) TMDS transmitters on their devices, and when shared with dual-link DVI, making the HDMI type B connector appear in the wild (preferably via a dongle from a high frequency dual-link DVI-I connector, retaining RAMDACs for VGA output). I'm quite happy to see DisplayPort do well, but I'd like to see the switch to the new standard giving something worthwhile for us consumers - given the cost of extra connectors and cables, and the inconvenience of compatibility issues. I'll support any step forward to an improved video format, I'm just concerned that DisplayPort, as currently presented, seems to be a step sideways. It's quite possible that I'm just paranoid, though, and that it'll all be wonderful really. :)
     
  16. stoff3r

    stoff3r New Member

    Joined:
    20 Nov 2006
    Posts:
    185
    Likes Received:
    0
    Text size in Windows is really not a problem, since you can choose "extra large fonts" in the display properties, right?

    When I got my new monitor, a Samsung 226BW, I was sort of disappointed that I could see the pixels if I looked at it from 20-30cm. As I see it, my icons on the quick launch etc. don't have many pixels to work with. I'm guessing 20x20 pixels, and this makes them blocky. This is not a problem for me, but I would like a bit better dpi next time.
     
  17. fluppeteer

    fluppeteer New Member

    Joined:
    23 Oct 2007
    Posts:
    8
    Likes Received:
    0
    Windows (even Vista) doesn't scale everything perfectly, but yes, you can. You can also set the "dpi" settings in the display settings dialogue, but that doesn't necessarily work properly either (depending as much on applications as Windows itself). Vista claims to make a point of variable dpi support. Fortunately, many apps let you scale the window content dynamically, which is usually what you really want.

    Having peered at them in a magnifier, the quick launch icons appear to be 16x16 (I probably should have known this). I'm fairly happy with the ppi of my 1600SW (17", 1600x1024), and much more so of my T221 (204ppi). 1680x1050 made a little sense to me as a lowish 19" resolution - at least it's not 1280x1024 - but at 22" it's way too big for my tastes. Thank goodness for Lenovo (re-)introducing WUXGA 22" screens, which might not look completely out of place next to the average 2048x1536 19" CRT...

    The slow move to WUXGA, and the increase in screen size rather than resolution, suggests that the 165MHz DVI limit (1600x1200, or 1920x1200 with reduced blanking) is not the limiting factor in the desktop monitor market. If DisplayPort gives us all 2560x1600 screens - although preferably smaller than 30" ones - then it has my blessing. Given how long dual-link DVI took to take off (and other than the 30" screens and Crossfire to CRTs, it's still rarely used) I'm remaining cynical. The 30" screens are a long way short of the 330MHz dual-link DVI "limit", btw, so maybe another resolution bump will happen at some point.
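    The 165MHz figure is easy to sanity-check against published CVT timing totals (the htotal/vtotal values below are taken from the standard CVT tables; treat the results as approximate):

```python
# Why "1920x1200 with reduced blanking" squeezes under single-link DVI's
# 165MHz switch-over point while conventional blanking doesn't.

def dotclock_mhz(htotal, vtotal, refresh_hz):
    return htotal * vtotal * refresh_hz / 1e6

# CVT-RB 1920x1200 @ 60Hz: 2080 x 1235 total pixels
reduced = dotclock_mhz(2080, 1235, 60)
# Conventional CVT 1920x1200 @ 60Hz: roughly 2592 x 1245 total
conventional = dotclock_mhz(2592, 1245, 60)

print(f"reduced blanking: {reduced:.1f} MHz, conventional: {conventional:.1f} MHz")
```

    Shaving the blanking intervals is what buys 1920x1200 its place on a single link; with conventional blanking the same mode would need dual link.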
     
  18. EQC

    EQC New Member

    Joined:
    23 Mar 2006
    Posts:
    220
    Likes Received:
    0
    I think, in Windows, if you want to run a high resolution but still have visible text, all you have to do is right-click the desktop --> Properties --> Settings --> Advanced --> General --> DPI Setting. Set the DPI to a higher value (if your monitor is small with a high resolution) to make the text larger but still utilize the "native resolution" of your LCD.

    High resolutions are good for certain things -- like writing computer programs...so you can see more of the code on-screen without having to scroll. Even working in Excel, it can sometimes be much nicer with more on-screen all at once.

    Doing any task that requires you to run multiple programs in multiple windows, being able to "tile" the windows on screen instead of having them one-behind-another can be a boon for productivity.

    It's even good for working with or just viewing pictures -- with everybody taking 5 megapixel photos these days, it'd be nice if you could see the whole photo all at once without shrinking it down.
     
  19. Veles

    Veles DUR HUR

    Joined:
    18 Nov 2005
    Posts:
    6,188
    Likes Received:
    34
    Very true, I don't even bother with them, most times it just means I can't unplug the cable because the damn screws are too fiddly.
     
  20. Max Spain

    Max Spain New Member

    Joined:
    18 Jul 2007
    Posts:
    43
    Likes Received:
    0
    According to a quick web search, that appears to be incorrect. 1 2 According to those links, while DisplayPort already had DisplayPort Content Protection (optional?), the new "feature" in the 1.1 spec was to add HDCP :clap: I love all the "trust" these computer hardware manufacturers have in their customers.
     