PSU PCIe power not connected: any problems for an installed graphics card?

Discussion in 'Hardware' started by danlevi, 20 Nov 2011.

  1. danlevi

    danlevi What's a Dremel?

    Joined:
    16 Oct 2009
    Posts:
    20
    Likes Received:
    0
    Hi, I've a very quiet HTPC, based around a Gigabyte MA78GM-S2H mobo (ATI chipset).

    I have added a HD6850 with a view to 3DTV and gaming.

    The HD6850 is quite noisy, so I'll be looking for a quiet 3rd party cooler, but also I want to switch the PCIe power to it on and off so I can boot the PC with the HD6850 on or off.

    The mobo has integrated graphics, which is fine for BluRay playback.

    I plan to use dual graphics connection to the same TV and select the source on the TV depending on what I'm doing.

    So for general films or music playback I'll use integrated graphics.
    For gaming or 3D related tasks I'll use the HD6850.

    I've measured the unconnected PCIe plug from the PSU with my multimeter and the three 12v pins have continuity (when the PSU's on) so I think they're from a common rail. I plan to fit a single-pole rocker switch to the side of the case and feed the PCIe 12v through the rocker switch.

    So my query is, assuming there's nothing glaring obviously foolish about the above that I've missed, would a PCIe graphics card installed in a mobo without the PCIe power connected cause any hardware damage, like trying to draw too much power from the PCIe slot?
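    (For anyone who wants the numbers, here's a quick back-of-the-envelope power budget sketched in Python. The 75W slot limit is from the PCIe spec; the ~127W load and ~19W idle figures for the HD6850 are my assumptions from published reviews, not measurements.)

```python
# Rough power budget for running an HD6850 from the PCIe slot alone.
# Figures are approximate: the slot limit comes from the PCIe spec,
# the card's draw is the commonly quoted board power.
SLOT_LIMIT_W = 75    # max a PCIe x16 slot may deliver (12V + 3.3V combined)
CARD_LOAD_W = 127    # assumed HD6850 board power under full load
IDLE_W = 19          # assumed HD6850 idle draw

shortfall = CARD_LOAD_W - SLOT_LIMIT_W
print(f"At full load the card needs ~{shortfall}W more than the slot can supply.")
print(f"At idle (~{IDLE_W}W) the slot alone is "
      f"{'within' if IDLE_W <= SLOT_LIMIT_W else 'over'} the slot's limit.")
```

    So on those assumed figures the card can idle off the slot, but a full gaming load is well past what the slot is allowed to deliver.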
     
  2. tehBoris

    tehBoris What's a Dremel?

    Joined:
    30 Jan 2011
    Posts:
    616
    Likes Received:
    25
    The motherboard will probably still try to use that graphics card. The card will then either not work at all, tell you to plug the power in, or appear to work and then crash (or even damage itself) when the graphical load increases and it can't draw enough power.
     
  3. danlevi

    danlevi What's a Dremel?

    Joined:
    16 Oct 2009
    Posts:
    20
    Likes Received:
    0
    The mobo BIOS has an option to specify the primary boot-screen monitor, which I've set to on-board.

    Currently, if I leave the graphics card powered, GPU-Z only starts to register GPU load if I connect a monitor to it; when the monitor is disconnected the load drops right off.

    I've had the on-board running at 70% load (Blu-ray playback) while the graphics card isn't outputting to a monitor.
     
  4. Deders

    Deders Modder

    Joined:
    14 Nov 2010
    Posts:
    4,053
    Likes Received:
    106
    Up to 75W (the 12V supply plus all of the 3.3V supply) will still be delivered to the card via the PCIe slot itself, so a switch on the cable alone is not the answer.
     
  5. danlevi

    danlevi What's a Dremel?

    Joined:
    16 Oct 2009
    Posts:
    20
    Likes Received:
    0
    If I don't connect the PCIe power, though, GPU-Z can't even recognise that there's a card installed?
     
  6. SuicideNeil

    SuicideNeil What's a Dremel?

    Joined:
    17 Aug 2009
    Posts:
    5,983
    Likes Received:
    345
    Is that with or without the monitor/TV connected to the card though?
     
  7. danlevi

    danlevi What's a Dremel?

    Joined:
    16 Oct 2009
    Posts:
    20
    Likes Received:
    0
    'tis without
     
  8. SuicideNeil

    SuicideNeil What's a Dremel?

    Joined:
    17 Aug 2009
    Posts:
    5,983
    Likes Received:
    345
    Try it with: if the card isn't being used and isn't fully powered, then it may well not show up in GPU-Z, as you found. If, however, it's required to do some work and output a picture to a connected monitor, that's when the interesting/bad things may happen, because it isn't being powered properly.
     
  9. Yslen

    Yslen Lord of the Twenty-Seventh Circle

    Joined:
    3 Mar 2010
    Posts:
    1,966
    Likes Received:
    48
    Would it not be easier to use the graphics card for everything and manually lock the fan speed at a low number when you're not using it for anything demanding? Just remember to set it back to auto when it has to work.

    Sent from Bittech Android app
     
  10. noizdaemon666

    noizdaemon666 I'm Od, Therefore I Pwn

    Joined:
    15 Jun 2010
    Posts:
    5,755
    Likes Received:
    563
    Or alter the fan speed curve in something like Afterburner so it runs at a very low fan speed (within reason) when not under heavy load.
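    A fan curve like that is just a piecewise mapping from GPU temperature to fan duty; here's a minimal sketch of the idea in Python (the breakpoints are illustrative values I made up, not Afterburner's defaults):

```python
# Linear interpolation between (temperature, fan %) breakpoints --
# the same shape of curve you'd draw in Afterburner's editor.
CURVE = [(30, 20), (50, 20), (70, 50), (85, 100)]  # (deg C, duty %), illustrative

def fan_duty(temp_c):
    """Return fan duty (%) for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    # Find the segment containing temp_c and interpolate linearly.
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(40))   # flat, quiet region at low temperatures
print(fan_duty(80))   # ramping up under load
```

    The flat section at the low end is what keeps it quiet for HTPC duty; the steep ramp only kicks in when the card is actually working.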
     
  11. LennyRhys

    LennyRhys Oink!

    Joined:
    16 May 2011
    Posts:
    6,149
    Likes Received:
    584
    Would it not be much simpler just to replace the fan on the HD6850 with a silent 120mm fan (if it will fit)?

    Most graphics cards come with a plastic shroud on the cooler which actually hampers cooling performance. When I had my GTX470 I removed the plastic shroud and stock fans, and with a 120mm fan on it the card ran both quieter and cooler.

    (photo: GTX470 with the shroud and stock fans removed and a 120mm fan fitted)
     
  12. Tangster

    Tangster Butt-kicking for goodness!

    Joined:
    23 May 2009
    Posts:
    3,085
    Likes Received:
    151
    Looks a lot cooler as well. :rock:
     
  13. danlevi

    danlevi What's a Dremel?

    Joined:
    16 Oct 2009
    Posts:
    20
    Likes Received:
    0
    Just tried it with the HDMI connected, but no "interesting/bad things" happened.
    I think this proves my theory about the power switch.

    On reflection, though, I'm not going to get a 3DTV just yet, so the new card can go in my main PC.

    Also, on my current TV the VGA output gives a much better resolution on the screen than the HDMI output does (on the Windows desktop).

    Good tip about the plastic shroud, but doesn't the shroud design try to blow the hot air out through the PCI backplate? Replacing it with a quiet fan would give the case more hot air to exhaust.
     
  14. dunx

    dunx ITX is where it's at !

    Joined:
    1 Sep 2010
    Posts:
    463
    Likes Received:
    13
    Yet! The PCI-E cable to the GPU is rated much higher than your PCI-E slot (150W vs 75W, IIRC).

    You risk the GPU pulling too much current through the mobo and frying the 24-pin Molex power connector's pins, IMHO.
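    Rough numbers behind that worry, sketched in Python (the card's draw and the per-pin rating are my assumptions: ~127W is the commonly quoted HD6850 board power, and Mini-Fit Jr contacts are typically rated around 6A; a standard 24-pin ATX connector carries 12V on two pins):

```python
# If the card tried to pull its full load through the slot, all of that
# 12V current would arrive via the motherboard's 24-pin connector.
CARD_LOAD_W = 127.0    # assumed HD6850 board power under load
RAIL_V = 12.0
PINS_12V = 2           # 12V pins on a standard 24-pin ATX connector
PIN_RATING_A = 6.0     # assumed typical Mini-Fit Jr contact rating

total_amps = CARD_LOAD_W / RAIL_V
per_pin_amps = total_amps / PINS_12V
print(f"~{total_amps:.1f}A total on 12V, ~{per_pin_amps:.1f}A per pin "
      f"(rated ~{PIN_RATING_A:.0f}A each)")
```

    On those assumptions each 12V pin would be running close to its rating from the graphics card alone, before the rest of the board's 12V load is counted.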

    dunx
     
    Last edited: 3 Dec 2011
