
Bits DisplayPort: A Look Inside

Discussion in 'Article Discussion' started by Tim S, 22 Oct 2007.

  1. rhuitron

    rhuitron Bump? What Bump?

    Joined:
    15 Aug 2006
    Posts:
    125
    Likes Received:
    0
    That shot with the 5 monitors running off the single laptop was just SWEET!
    I should command such power! Soon, my friends. Soon.
     
  2. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    Content protection is an optional part of the spec - DPCP isn't compatible with AACS, afaik, which is why HDCP needed to be rolled in for our friends in the recording industry.
     
  3. fluppeteer

    fluppeteer New Member

    Joined:
    23 Oct 2007
    Posts:
    8
    Likes Received:
    0
  4. Edvuld

    Edvuld New Member

    Joined:
    2 Aug 2005
    Posts:
    230
    Likes Received:
    0
    I don't think the audio will pose a problem. For example, I have a native HDMI connector on my graphics card, and I only need to connect the SPDIF pinout on the mobo to the SPDIF pins on the graphics card for it to send the sound. No bandwidth wasted :thumb:

    Might be a whole other thing with HDCP content though, as I guess the sound is "unprotected" ?
     
  5. fluppeteer

    fluppeteer New Member

    Joined:
    23 Oct 2007
    Posts:
    8
    Likes Received:
    0
    AFAIK it *is* protected. Audio in HDMI is transmitted over the same wires as the video signal, during what would otherwise be blanking times (which is why you can't mix the audio and video with an external connector like you can with SCART); DisplayPort also uses the same wires. HDCP just negotiates a key and then XORs in a pseudo-random bit stream. There's some synchronisation to allow the key negotiation to reset periodically, but I'm not aware of the video being treated separately from the audio.
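    The XOR principle itself is easy to sketch. Here's a toy version in Python (Python's seeded PRNG stands in for HDCP's proprietary cipher, so this is an illustration of the idea, not the real algorithm): both ends derive the same keystream from the negotiated key, and XOR is its own inverse, so the sink decrypts by repeating the sender's operation.

```python
import random

def keystream(key: int, n: int) -> bytes:
    """Toy pseudo-random byte stream seeded with the negotiated key.
    (Real HDCP uses its own cipher, not Python's PRNG.)"""
    rng = random.Random(key)
    return bytes(rng.randrange(256) for _ in range(n))

def xor_stream(data: bytes, key: int) -> bytes:
    """Encrypt or decrypt: XOR with the same keystream is its own inverse."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

# Video samples and audio samples ride the same encrypted link.
link_data = b"pixel data + audio packet"
cipher = xor_stream(link_data, key=0x1234)
assert xor_stream(cipher, key=0x1234) == link_data  # round-trip recovers both
```

    Because audio packets are just more bytes in the same stream, they get the same protection as the pixels.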

    HTH.
     
  6. Edvuld

    Edvuld New Member

    Joined:
    2 Aug 2005
    Posts:
    230
    Likes Received:
    0
    Thanks for pointing that out. I'm "glad" their HDCP works as intended. *cough cough*
     
  7. fluppeteer

    fluppeteer New Member

    Joined:
    23 Oct 2007
    Posts:
    8
    Likes Received:
    0
    :) Glad to be of service. Whether you can route the audio out of your system through a completely separate system is another matter - I don't think I've seen many encrypted means of getting the audio into the graphics card (other than integrating the audio *in* the graphics card - obviously the definition of a "sound card" gets a bit hazy at this point).

    Personally, I've always preferred the idea of putting up with plugging two cables in, since I'd not expect my speaker system to have much to do with my TV - but I probably shouldn't judge until I can afford a lot of HDMI (or DisplayPort) kit.
     
  8. completemadness

    completemadness New Member

    Joined:
    11 May 2007
    Posts:
    887
    Likes Received:
    0
    I really would rather have screws on them. I've never had a VGA or DVI cable fall out, nor have I had the heads break on me or anything - they just work.

    All these clippy things are a pain in the ass, plus the clips can get broken off or easily damaged, and they just don't do as good a job (also, the clip should be mandatory in the VESA spec, jeez)
     
  9. GoodBytes

    GoodBytes How many wifi's does it have?

    Joined:
    20 Jan 2007
    Posts:
    12,300
    Likes Received:
    709
    the standard rectangular USB plugs hold in nicely, as they are long. That seems to be the case for this new plug too (from the picture), and it doesn't have any screws or clip.
     
  10. completemadness

    completemadness New Member

    Joined:
    11 May 2007
    Posts:
    887
    Likes Received:
    0
    yeah but USB plugs have 4 large pins, and i think it's held in well by the grip between the metal shell and the plastic "bar" that comes out

    this is going to be more similar to HDMI, and there have been tonnes of complaints about that already
     
  11. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    Awesome :D

    I would say that it is more of a combination of the two - it's got a similar pin count to HDMI, but it looks very similar to USB and of course it's got the optional securing clips too. Hopefully they'll be adopted more readily than the SATA cables/connectors that had clips. :)
     
  12. stoff3r

    stoff3r New Member

    Joined:
    20 Nov 2006
    Posts:
    185
    Likes Received:
    0
    clips should be mandatory, yes. whenever my computer wouldn't start i jiggled the SATA cables and bang, it started every time.

    one of my mates was over with his old computer, so i could install new components for him. inside it i found a SATA cable with a clip on it. i haven't found any of these in norwegian hw-stores so i offered him money for it, but he wanted to keep it when he saw me staring at it with big wet eyes.
    also, my teacher says professional PA equipment without clips/screw-on connectors is useless, since there's no room for mistakes in a production, and cables lie everywhere on the floor, easily stumbled over. this is also one of the reasons the adoption rate for new equipment is so slow in the professional segment, as the new standards SUCK in practical use.
     
  13. Hamish

    Hamish New Member

    Joined:
    25 Nov 2002
    Posts:
    3,649
    Likes Received:
    4
    no but how many times have you unscrewed a vga/dvi cable and had the nut it screws into come off as well, or had to unplug one in a really awkward position and spent 10 minutes trying to undo it
    the screws are certainly secure but holy **** are they annoying :p
     
  14. Sol Badguy

    Sol Badguy New Member

    Joined:
    29 Oct 2003
    Posts:
    104
    Likes Received:
    0
    from article:
    "I don't think the technology is going to be the ideal solution for gaming or watching HD movies, as it’s using a selective compression technique that works well when there isn’t a lot of information constantly changing on the display (i.e. when you’re in Windows).
    ...
    "

    although i liked the idea that connecting multiple monitors would be a simple task, i think i'll be sticking with dvi/hdmi
     
    Last edited: 25 Oct 2007
  15. Cupboard

    Cupboard I'm not a modder.

    Joined:
    30 Jan 2007
    Posts:
    2,148
    Likes Received:
    30
    would that mean you could have a little plug in the back of the laptop so that, if you unplugged it, you could use the laptop display as a separate screen, or use the connector from the laptop screen to drive an external display?

    Hmm.... my 3.5 Y.O. Acer laptop has DVI (single-link only; it has the long bar-like hole, but not the 4 pins that go around it) and it's not like it was ever that high end.

    Maybe i should upgrade then... my cheap V7 TFT only has VGA lol. (It was cheap, works and is a huge improvement over my laptop, why should I complain?)

    Also, will it suffer from the problem that VGA cables have that with a slight nudge you get a blue/purple/...-tinged screen?

    Speaking as an amateur with some experience, this is a bit of a mixed blessing. On one hand you don't want things to come unplugged easily (especially the cables on radio mics... argh!), but I have had people fall over the cables of a sound desk I was using (they shouldn't have been there) and bend the input channel panel really quite a lot (on this desk each channel has a little rectangle with all of its I/O on it). It still works, though, fortunately.
     
  16. fluppeteer

    fluppeteer New Member

    Joined:
    23 Oct 2007
    Posts:
    8
    Likes Received:
    0
    Generally, IMHO, you either want a fair bit of bandwidth per display (in which case running a PCI Express extender and lots of graphics cards, or xdmx, or - at a push - multiple USB graphics cards seems a better approach) or you're probably not running many displays (and quad-head graphics cards and/or TripleHead2Go type solutions are useful). Selective redraw and compression is usually, IMHO, a bad idea - it makes the screens into expensive thin clients, and always has limitations when you actually want the whole display running at full chatter (why shouldn't you be able to play a video across ten screens?)
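    For what it's worth, selective redraw is roughly a dirty-tile scheme: compare the new frame against the last one and only send the tiles that changed. A toy Python sketch (tile size and frame layout invented for illustration) shows why it works for a static desktop and collapses for video, where every tile changes every frame:

```python
TILE = 8  # illustrative tile size; real link protocols differ

def changed_tiles(prev, curr, width, height):
    """Return (x, y) of TILE x TILE tiles that differ between two frames.
    Frames are flat row-major lists of pixel values."""
    dirty = []
    for ty in range(0, height, TILE):
        for tx in range(0, width, TILE):
            for y in range(ty, min(ty + TILE, height)):
                row = y * width
                if prev[row + tx:row + min(tx + TILE, width)] != \
                   curr[row + tx:row + min(tx + TILE, width)]:
                    dirty.append((tx, ty))
                    break  # tile already known dirty; skip its other rows
    return dirty

w, h = 32, 16
desktop = [0] * (w * h)
after_click = desktop[:]
after_click[0] = 255            # one pixel changed: one dirty tile to send
print(changed_tiles(desktop, after_click, w, h))

video_frame = [1] * (w * h)     # full-motion video: every tile is dirty,
print(len(changed_tiles(desktop, video_frame, w, h)))  # so nothing is saved
```

    When every tile is dirty, the "compression" sends the whole frame anyway, which is exactly the full-chatter case (video across ten screens) where these thin-client schemes fall over.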
     
  17. fluppeteer

    fluppeteer New Member

    Joined:
    23 Oct 2007
    Posts:
    8
    Likes Received:
    0
    Interesting question. One other thing I'd like is to be able to drive laptop panels from a desktop (without resorting to xdmx/remote desktop/MaxiVista) - I'd far rather have a 15.4" WUXGA screen on my desk than a 24" monitor, but I can't get one without paying for the laptop. Whether the internal connection actually has a standard socket is another matter. Is there any benefit to running DisplayPort as an internal connection rather than DVI? I've yet to be convinced, and LVDS seems to have held on quite a long time if wire count is an issue.

    If we got to do it all over again, maybe DisplayPort would be better - and the arguments for using it internally are, for me, slightly better than the arguments for using it as an external connection - but I remain to be convinced. Not that I'm really an authority.

    There have been a few laptops (especially those with workstation graphics) with DVI for a long time - more, if you count docking ports - although Apple have had a near monopoly on dual-link. There's always the various Digital Tigers kit and the VTBook, if you don't mind add-ons (and you can run a VGA to DVI-D converter). The DVI connector is, admittedly, a little large for the average laptop. Incidentally, DVI without the four pins and the cross bar at the top is DVI-D (no analogue, won't work with a VGA adaptor); dual link fills in the missing two rows in the middle of the rectangular block of digital pins. Interesting that Acer wouldn't support DVI-I, though.

    A digital signal is much nicer to handle electronically, but actually - at least at low resolutions - I'd not belittle the modern VGA decoder. The signal itself is pretty shocking (especially if you've got dodgy cables), but even cheap devices have half decent signal processing that can get rid of most of the artifacts. Depends how you're driving it, obviously - a Matrox card at VGA and a short cable really ought to look fine, whereas elderly integrated motherboard graphics at UXGA over a physical VGA switch and a cable extender is going to be a bit dodgy. Some people like being able to fiddle with the contrast and colour balance on the monitor, which many DVI connections don't let you do (because you shouldn't need to). I'm not suggesting we go back to VGA, but remember how many 400MHz 10-bit RAMDACs there are on graphics cards while we all sing the praises of the latest digital technology (there's a reason single-link DVI took a long time to catch on).

    Black is the new purple.

    It'd be nice to have a connector that can hold its own weight in the socket (DVI to VGA dongles, this means you; precious few are S-shaped so they don't stick out so much), but I have to agree about the failsafe unplugging strategy. Back in the day, "don't screw in the cables because you'll catch your feet on them under the desk" was a mantra - and one argument in favour of a CRT is that it's quite hard to pull them off a desk accidentally. With screw-in connectors, you at least have the choice of *not* screwing them in, or choosing which of a chain of dongles should be the break point. Apple's magnetic power adaptor is genius - not that I own one.

    I've always felt that the wireless display is the way forward for avoiding this. By which I mean projectors, obviously, not the "suck up all the wireless bandwidth in a hundred foot radius and broadcast your screen to the CIA/MI5" non-line-of-sight variants which come up from time to time.
     