
News AMD, Intel drop D-Sub support

Discussion in 'Article Discussion' started by CardJoe, 10 Dec 2010.

  1. Fizzban

    Fizzban Man of Many Typos

    Joined:
    10 Mar 2010
    Posts:
    3,334
    Likes Received:
    118
    I've not used the VGA connection since the end of 2004. I'm more concerned with them phasing out DVI quite honestly, as I still use that. I do have a DVI - HDMI adapter somewhere though, so all is not lost!
     
  2. HourBeforeDawn

    HourBeforeDawn a.k.a KazeModz

    Joined:
    26 Oct 2006
    Posts:
    2,637
    Likes Received:
    6
Yeah, I'm okay with it as well.
     
  3. Eiffie

    Eiffie New Member

    Joined:
    11 Oct 2010
    Posts:
    364
    Likes Received:
    2
It's been a good five years since I've used anything with a VGA connection in my home setup; everything I use now is either DVI or HDMI. I'd be more than happy to see the VGA connection go out the window and stick with the latter two. I can see how some people might be upset by this news, but to be honest it's not a very big deal as far as I see it. Others might have a different opinion, but 2015 is still a ways off; there's still time to buy your precious VGA kit before it's phased out for good, if that's your thing.
     
  4. supermonkey

    supermonkey Deal with it

    Joined:
    14 Apr 2004
    Posts:
    4,955
    Likes Received:
    201
    There is a difference between devices designed to support an HDMI specification, and the cables that carry the signal. Read here for additional information, especially the page about HDMI 1.4 and the page that talks about what the different specifications mean.

    The current HDMI 1.4 specification supports 4K resolution, which means any device that is HDMI 1.4 compliant can output a 4K picture - far beyond 1080 resolution. All HDMI cables are built to the same specification (with the exception of the added Ethernet and Audio return channel introduced in Specification 1.4), so they will all transmit a 4K signal. If you're not sure whether or not you need an Ethernet or Audio return channel, then you don't need one.
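As a rough sanity check on those figures, here's a back-of-envelope sketch, assuming 24-bit colour and a typical ~20% blanking overhead (the 10.2 Gbit/s TMDS figure and 8b/10b coding overhead are from the published HDMI spec):

```python
# Back-of-envelope video bandwidth check: does a given mode fit in
# HDMI 1.4's payload budget? Assumes 8-bit RGB (24 bpp) and roughly
# 20% blanking overhead on top of the visible pixels.

def video_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24,
                         blanking_overhead=1.20):
    """Raw pixel-data rate in Gbit/s, including blanking intervals."""
    bits_per_second = width * height * refresh_hz * bits_per_pixel
    return bits_per_second * blanking_overhead / 1e9

# 10.2 Gbit/s of TMDS throughput, minus 8b/10b coding overhead (x 8/10)
HDMI_1_4_DATA_GBPS = 8.16

for w, h, hz in [(1920, 1080, 60), (3840, 2160, 30), (3840, 2160, 60)]:
    need = video_bandwidth_gbps(w, h, hz)
    ok = "fits" if need <= HDMI_1_4_DATA_GBPS else "does NOT fit"
    print(f"{w}x{h}@{hz}Hz needs ~{need:.1f} Gbit/s -> {ok} in HDMI 1.4")
```

Run the numbers and you'll see 4K fits at 30 Hz in HDMI 1.4, but a 4K 60 Hz stream would not; that's the sense in which the 1.4 spec "supports 4K".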
     
  5. l3v1ck

    l3v1ck Fueling the world, one oil well at a time.

    Joined:
    23 Apr 2009
    Posts:
    12,945
    Likes Received:
    17
    On Intel boards it's through a 3rd party chip. Intel southbridge chips haven't supported IDE for a while.
     
  6. Isitari

    Isitari Member

    Joined:
    6 May 2009
    Posts:
    345
    Likes Received:
    20
Oh dear. Every single school I've worked in for the last 5+ years runs only off VGA projectors. Even the schools with newer equipment could only run them off DVI at a push, certainly not HDMI or DisplayPort.

There may be trouble ahead, especially with the budget cuts.
     
  7. Delphium

    Delphium Eyefinity enabled

    Joined:
    18 Mar 2007
    Posts:
    1,406
    Likes Received:
    35
This is not such an issue really with the use of DisplayPort, as there are DP-to-VGA/DVI/HDMI adaptors available; I use a couple myself to power projectors in the living room from my 5870E6 card, which has nothing but Mini-DP connections.

In the DisplayPort 1.2 spec there is 17.28 Gbit/s (720 MHz) of available bandwidth, able to provide a 4k x 2k picture, 7.1 audio, Ethernet, power and USB, not forgetting the ability to daisy-chain (or split via a hub) four independent monitors @ 1080p off a single port on the PC or laptop.
It is also a royalty-free connector.
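A quick sketch of that daisy-chain claim, assuming 24-bit colour and ~20% blanking overhead per stream (the 17.28 Gbit/s payload figure is DP 1.2's four-lane HBR2 rate after 8b/10b coding; real MST packing adds a bit more framing overhead on top):

```python
# Does four independent 1080p60 streams fit in DisplayPort 1.2's
# multi-stream (MST) payload budget?

DP_1_2_PAYLOAD_GBPS = 17.28  # 4 lanes x 5.4 Gbit/s, after 8b/10b coding

def stream_gbps(width, height, refresh_hz, bpp=24, blanking=1.20):
    """Approximate per-stream data rate in Gbit/s, blanking included."""
    return width * height * refresh_hz * bpp * blanking / 1e9

total = 4 * stream_gbps(1920, 1080, 60)
print(f"4 x 1080p60 ~ {total:.1f} Gbit/s of {DP_1_2_PAYLOAD_GBPS} Gbit/s")
```

Four 1080p60 streams come to roughly 14.3 Gbit/s, comfortably inside the 17.28 Gbit/s budget, which is why the hub/daisy-chain arrangement works.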

    So really there is no worry in most cases as there are cheap enough adaptors should they be needed.

On the flip side: as an audio-visual engineer I have done a number of auditorium, cinema, flight-sim and many other installs where we would work from a drum of VGA cable and solder on our own connectors, often because it's easier to draw bare cable through thick walls, tight corners and spaces, especially up and down inside walls. The newer digital connectors are a bit more tricky to solder due to the tight pin spacing, and although we often switch to using Cat5 baluns instead, that starts to push the price up again.
     
  8. l3v1ck

    l3v1ck Fueling the world, one oil well at a time.

    Joined:
    23 Apr 2009
    Posts:
    12,945
    Likes Received:
    17
I wonder if Nvidia will still have VGA on some low-end cards? They'd have a captive market of people who don't want to replace expensive projectors etc.
     
  9. llamafur

    llamafur WaterCooled fool

    Joined:
    27 Jul 2009
    Posts:
    859
    Likes Received:
    21
    Finally, took them long enough. Now do away with IDE.
     
  10. Anakha

    Anakha Member

    Joined:
    6 Sep 2002
    Posts:
    587
    Likes Received:
    7
    As a thought, isn't the LCD panel on every single laptop out there connected using LVDS? How's that going to work, then?
     
  11. Krayzie_B.o.n.e.

    Krayzie_B.o.n.e. New Member

    Joined:
    2 Aug 2009
    Posts:
    427
    Likes Received:
    6
IDE is gone; most new mobos are SATA-only, but you can still buy mobos with IDE if you must have it.

VGA should have been gone when HDMI and DVI were released.
     
  12. PingCrosby

    PingCrosby New Member

    Joined:
    16 Jan 2010
    Posts:
    392
    Likes Received:
    7
    D-Sub? Bloody ell, I thought it stood for something else, I'd better get that graphics card out the bath.
     
  13. wuyanxu

    wuyanxu still wants Homeworld 3

    Joined:
    15 Aug 2007
    Posts:
    10,590
    Likes Received:
    231
Come on, get rid of DVI while we're at it. From my understanding, LVDS is still needed for DVI and HDMI.

DisplayPort is the way forward.

Be honest here: which one would you rather use, one that plugs straight in, or one that requires you to do up two fiddly thumb screws? Graphics cards often end up with loose screws after a couple of uses anyway.
     
  14. NethLyn

    NethLyn Member

    Joined:
    24 Apr 2009
    Posts:
    971
    Likes Received:
    17
My first ever PCI-E graphics card, a Radeon, came with a DVI-to-HDMI converter back in 2008, so it's been on the cards for a long time. Since it's an HDMI 1.3 port on the monitor and god knows what on the GPU, I'm perfectly happy with DVI for the minute, as the cable was in the box, gratis. That AMD press release is all about trying to flog new cables yet again.

    VGA/D-SUB is what, 15-20 years old at least? By all means phase it out, but since every single card across all 4 buses has had this interface and both AMD/Intel were happy to put it on their integrated GPU boards depending on price up to early this year, it's a "zombie" that will be kept alive through conversion - there isn't going to be a mass junking of early TFTs that were VGA only for a start.
     
  15. tad2008

    tad2008 New Member

    Joined:
    6 Nov 2008
    Posts:
    332
    Likes Received:
    3
Seeing VGA finally relegated to PC history will be one of the truly historic moments for some of us.

The loss of DVI won't be such a bad thing, as I don't think it ever took off as much as could have been hoped. Hearing of Intel's UDI as a replacement just muddies the waters for the consumer, and it's not something I will ever care to use.

For me HDMI is the way to go, and having PCs properly incorporate audio into the HDMI stream would be a marvel in itself. For those who prefer to keep their audio and video separate, DisplayPort already offers the next best alternative.
     
  16. Aracos

    Aracos New Member

    Joined:
    11 Feb 2009
    Posts:
    1,338
    Likes Received:
    47
TBH I don't think it's fair to say DVI requires the screws. I never use mine because I'm always swapping DVI between my Xbox and PC, and it never falls out unless I actually move the cable. I'd still rather have DisplayPort, but TBH I'd rather Nvidia got off their arse and supported it before the others get phased out.
     
  17. j_jay4

    j_jay4 Member

    Joined:
    23 Apr 2009
    Posts:
    515
    Likes Received:
    14
Surely the vast majority of desktop owners are using graphics cards with DVI and VGA connectors, and monitors with the same. This is therefore eventually going to mean a graphics upgrade that requires a new monitor. I'm not gonna be happy to fork out more money to replace a 23-inch monitor that's perfectly adequate, as monitors tend to stay around much longer than computer upgrades.
     
  18. Iorek

    Iorek New Member

    Joined:
    18 Jul 2006
    Posts:
    63
    Likes Received:
    0
While I can't complain about old interfaces being phased out, some of us still use them. TVs tend to have VGA rather than DVI; granted, there are more mobos now for media PCs with integrated HDMI, so this is less of an issue. But why should we be forced to upgrade monitors? If it's not broken... I'm still using my old 1280x1024 Samsung from about six years ago; it works fine and I see no reason to upgrade. Yes, in that time I've had several graphics cards, but I feel that monitors and TVs are definitely designed to last much longer than the hardware that drives them.
     
  19. wuyanxu

    wuyanxu still wants Homeworld 3

    Joined:
    15 Aug 2007
    Posts:
    10,590
    Likes Received:
    231
I think the point is not to force people to buy new monitors; the point is to phase out this old technology. AMD and Intel are two of the biggest players in producing graphics processors, and their decision will mean boards produced without a D-sub/VGA interface.

If you have an old display that you still want to use, it's all possible through DisplayPort; you just need an adaptor.

Continuing on the subject of adaptors: why is DVI still around? DisplayPort with a passive adaptor can drive any average monitor. I really wish these technology companies could be as brutal as Apple: just release an HTPC motherboard one day with two sockets, HDMI + DisplayPort, then bundle adaptors to DVI and sell active adaptors to VGA. It would make everything soooo simple.

Now, with D-sub/VGA being phased out, DVI will be installed across corporate projectors, and that technology is also due to be phased out by DisplayPort in the coming years. Why not just go straight to DP?



To be brutally honest, I don't like the fact that monitors are dropping D-sub support. I wish those analogue ports would be around forever, because they're so easy to drive: any microcontroller with some resistors can generate a VGA signal, while DVI/DP/HDMI all involve a lot of protocol complication.
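That "easy to drive" point is concrete: the whole analogue VGA "protocol" is two sync pulses plus three colour levels out of a resistor ladder. As a sketch, the standard VESA 640x480@60 timing works out from a 25.175 MHz pixel clock like this:

```python
# Standard VESA 640x480@60 VGA timing: a microcontroller only has to
# hit these counts to produce valid hsync/vsync, with RGB fed through
# a simple resistor DAC.

PIXEL_CLOCK_HZ = 25_175_000
H = {"visible": 640, "front": 16, "sync": 96, "back": 48}   # pixel clocks
V = {"visible": 480, "front": 10, "sync": 2,  "back": 33}   # lines

h_total = sum(H.values())            # 800 pixel clocks per scanline
v_total = sum(V.values())            # 525 lines per frame
line_rate_hz = PIXEL_CLOCK_HZ / h_total
frame_rate_hz = line_rate_hz / v_total

print(f"line rate  ~ {line_rate_hz / 1e3:.2f} kHz")   # ~31.47 kHz
print(f"frame rate ~ {frame_rate_hz:.2f} Hz")          # ~59.94 Hz
```

Compare that to TMDS encoding, HDCP and link training on the digital connectors and you can see why hobbyist hardware clings to VGA.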


So I'd say: drop all the other ports and standardise on DisplayPort for computers and HDMI for TVs, but keep VGA/D-sub on new monitors.
     
  20. Picarro

    Picarro New Member

    Joined:
    9 Jun 2009
    Posts:
    3,331
    Likes Received:
    134
I can't fathom why we don't just have one standard for AV equipment. Something like an HDMI/DP *******-child. It would make everything so much easier.
     