
Displays DVI vs HDMI vs Displayport

Discussion in 'Hardware' started by Jimmy6, 31 Aug 2011.

  1. Jimmy6

    Jimmy6 What's a Dremel?

    Joined:
    19 Jun 2011
    Posts:
    89
    Likes Received:
    3
    Hi,

    I have a DELL 2408 LCD monitor that I have never used until now. I am about to plug it into a new GTX570 card, but am unsure which input to use. I have the choice of DVI, HDMI and Displayport. Which is best to use, or are they all the same? I know you can use higher resolutions on Displayport, but my monitor is only 1920x1200, so that's not a good enough reason. I know that HDMI can carry audio, but that's also irrelevant to me.

    If it doesn't matter which input I use, why include them all?

    Cheers,

    Jimmy.
     
  2. Cei

    Cei pew pew pew

    Joined:
    22 Mar 2008
    Posts:
    4,714
    Likes Received:
    122
    Either DVI or DP, doesn't matter.

    They include them all because different people need specific ports, so it doesn't shut out prospective buyers.
     
  3. faugusztin

    faugusztin I *am* the guy with two left hands

    Joined:
    11 Aug 2008
    Posts:
    6,953
    Likes Received:
    270
    For 1920x1200 without audio it's irrelevant which one you choose. Pick based on which cable you already have or which connector you like most. It's a beauty contest in your case :D.
     
  4. Parge

    Parge the worst Super Moderator

    Joined:
    16 Jul 2010
    Posts:
    13,022
    Likes Received:
    618
    Because some devices only output via one of them, or you might have more than one device.
     
  5. GoodBytes

    GoodBytes How many wifi's does it have?

    Joined:
    20 Jan 2007
    Posts:
    12,300
    Likes Received:
    710
    DVI
    -> Max resolution - single link version: 1920x1200 @60Hz
    -> Max resolution - dual link version: 2560x1600 @60Hz
    -> Max wire length recommended - 5m
    -> Max wire length - 25 feet (~7.6m)
    -> Video only
    -> HDCP compliant
    -> Ensures color accuracy from the output source.
    -> Max color depth: 8-bit per channel (16,777,216 colors)
    -> Issues and limitations: short cables, difficult to plug and unplug, limited number of plug/unplug cycles before breakage.
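    A quick sanity check on that single-link limit (a rough sketch; the CVT reduced-blanking frame totals of 2080x1235 are assumed figures, and single-link DVI tops out at a 165 MHz pixel clock):

```python
# Does 1920x1200@60Hz fit within single-link DVI's 165 MHz pixel clock?

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame (incl. blanking) x refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# Assumed CVT reduced-blanking frame totals for 1920x1200: 2080 x 1235
clock = pixel_clock_mhz(2080, 1235, 60)
print(f"1920x1200@60 needs {clock:.1f} MHz")      # ~154.1 MHz
print("fits single link (165 MHz)?", clock <= 165)
```

    So 1920x1200@60 only squeezes into a single link thanks to reduced blanking; with standard blanking you'd need dual link.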

    HDMI
    -> Max resolution (v1.0): 1920x1080
    -> Max resolution (v1.4): 4096x2160 @24fps (the higher the frame rate, the lower the max resolution; 3D also reduces the resolution, of course)
    -> Max wire length: 15m
    -> Video and audio (v1.0)
    -> Video, audio and Ethernet (v1.4)
    -> HDCP compliant
    -> Small and compact
    -> Max color depth: 8-bit per channel (16,777,216 colors)
    -> Issues and limitations: limited number of plug/unplug cycles before breakage, no latching to keep the plug in place during movement, royalty fees ($0.04 per cable and port plus a $10,000 annual fee for cable and port manufacturers, possibly more for larger producers), and does not ensure color accuracy from the output source.

    DisplayPort
    -> Max wire length (v1.0 and 1.1) recommended - 3m
    -> Max wire length (v1.0 and 1.1) - 33m with an active cable
    -> Max wire length (v1.2) recommended - 15m
    -> Max wire length (v1.2) - 100m with an active cable
    -> Max resolution (v1.0 and 1.1): 2560x1600 @60Hz
    -> Max resolution (v1.2): 3840x2160 (4K) @60Hz
    -> Video only (v1.0)
    -> Video and audio (optional) (v1.1)
    -> Video, audio (optional) and auxiliary data (optional) (v1.2). The aux channel can carry almost anything: Thunderbolt on Apple computers (yes, that is why it can output a display; it's not Thunderbolt at work, just DisplayPort), touch-panel data, USB, Ethernet, or a combination, as long as it stays within the bandwidth limits.
    -> Ability to daisy-chain up to 4 monitors at 1920x1200 from one GPU port (v1.2)
    -> HDCP compliant
    -> DisplayPort Content Protection compliant
    -> Ensures color accuracy from the output source.
    -> No royalty fees
    -> Low EMI (important in medical facilities and other sensitive environments)
    -> Low RFI (important in medical facilities and other sensitive environments)
    -> Consumes less power than LVDS (the internal link that laptop panels use), making it an excellent choice for laptops; it also consumes less power when driving an external display from a laptop via DisplayPort.
    -> Can drive display panels directly, eliminating scaling and control circuits and allowing for cheaper, slimmer displays.
    -> Max color depth: 10-bit per channel (1,073,741,824 colors)
    -> Self-latching connector (screw-less design)
    -> Supports the most plug/unplug cycles.
    -> Small and compact, like HDMI
    -> Bi-directional data communication between source and monitor.
    -> Fully backward compatible with HDMI and DVI (VGA can be supported, but that is GPU specific)
    -> POTENTIAL: 3D support at 120Hz per eye (240Hz total)
    -> Very low latency (lower than HDMI and DVI)

    There you go! :D

    Basically use DisplayPort whenever you can, else DVI, else HDMI.
    DisplayPort is slowly replacing DVI.
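    If you want to see where those color counts and the daisy-chain limit come from, here's a rough sketch (the ~17.28 Gbit/s DP 1.2 payload figure is an assumption based on the HBR2 link rate):

```python
# Colors at a given bit depth: 2^bits choices for each of R, G and B.
def colors(bits_per_channel):
    return (2 ** bits_per_channel) ** 3

print(colors(8))    # 16777216 (DVI/HDMI)
print(colors(10))   # 1073741824 (DisplayPort)

# Uncompressed video payload in Gbit/s, ignoring blanking intervals.
def stream_gbps(width, height, refresh_hz, bits_per_channel):
    return width * height * refresh_hz * bits_per_channel * 3 / 1e9

# Four daisy-chained 1920x1200@60 monitors on one DP 1.2 link:
total = 4 * stream_gbps(1920, 1200, 60, 8)
print(f"{total:.2f} Gbit/s needed vs ~17.28 Gbit/s available")
```

    Four 1920x1200 streams come to roughly 13.3 Gbit/s, comfortably inside a single DP 1.2 link, which is why the daisy-chain works.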
     
    Last edited: 31 Aug 2011
  6. Jimmy6

    Jimmy6 What's a Dremel?

    Joined:
    19 Jun 2011
    Posts:
    89
    Likes Received:
    3
    Thanks guys, that answers my questions.
     
  7. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,993
    Likes Received:
    712
    But the GTX570 doesn't have DisplayPort.....
     
  8. [PUNK] crompers

    [PUNK] crompers Dremedial

    Joined:
    20 May 2008
    Posts:
    2,909
    Likes Received:
    50
    my Asus DCU2 does?
     
  9. Cei

    Cei pew pew pew

    Joined:
    22 Mar 2008
    Posts:
    4,714
    Likes Received:
    122
    Depends on the card maker. I've noted that NVIDIA cards are harder to find with DisplayPort though.
     
  10. MjFrosty

    MjFrosty Minimodder

    Joined:
    3 Aug 2011
    Posts:
    871
    Likes Received:
    23
    My 580GTXs don't. Does it give me a better image quality?


    Didn't think so...

    Wouldn't mind thunderbolt on my motherboard mind you.
     
  11. GoodBytes

    GoodBytes How many wifi's does it have?

    Joined:
    20 Jan 2007
    Posts:
    12,300
    Likes Received:
    710
    Since the GeForce GTX 200 series, the GPU has supported DisplayPort with 10-bit color output. Now it's just a question of whether the manufacturer of the card wants to implement DisplayPort.

    As mentioned, most Nvidia cards don't have DisplayPort... My only guess is that manufacturers don't want to bundle an adapter to convert DisplayPort to DVI; or, when they have billions of DVI ports in stock, why not use them on the board, since DisplayPort is still found only on the more decent monitors (which most people/gamers don't have); or maybe it's simply not encouraged, as the Nvidia reference card doesn't use it.
     
  12. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,993
    Likes Received:
    712
    Show off :p

    I soooo want to use DisplayPort. DVI is so ancient and big. I hope Nvidia's 600 series cards will be better; rumour has it they are looking at an Eyefinity competitor.
     
  13. Rustypouch

    Rustypouch What's a Dremel?

    Joined:
    5 Feb 2011
    Posts:
    71
    Likes Received:
    4
    I have DisplayPort on my monitor but MiniDisplayPort on my Graphics Card. This is a slight ball ache. I can get a big, chunky adaptor for £10/£15 (My monitor came with a DisplayPort to DisplayPort cable, Graphics Card came with no adaptors) or buy a MiniDisplayPort to DisplayPort cable for £5 or so. It isn't a lot of money I know, but am I really going to notice any difference if I make the switch from DVI to DisplayPort?
     
  14. GoodBytes

    GoodBytes How many wifi's does it have?

    Joined:
    20 Jan 2007
    Posts:
    12,300
    Likes Received:
    710
    If you don't use or work with 10-bit software, or work with a computer placed far away from the monitor where you'd need a very long DVI cable, then no. It's one of those things that's nice to have, but not a must at all.

    Besides, your GPU doesn't support 10-bit color to begin with. So it's a really big no to your question. Stay with DVI.

    Keep the DisplayPort on your monitor for a future laptop, to get a good digital signal instead of VGA. VGA at 1920x1200 is, to be honest, pushing it (you start seeing blurry text and/or static, depending on your environment and interference levels).
     
    Rustypouch likes this.
