
Graphics Solution to dual-monitor setup and graphics card clocks

Discussion in 'Hardware' started by faugusztin, 29 Jan 2011.

  1. faugusztin

    faugusztin I *am* the guy with two left hands

    Joined:
    11 Aug 2008
    Posts:
    6,953
    Likes Received:
    270
    Hi,

    As many of you know, NVIDIA cards run at 3D clocks when you use two displays on one card. In my case it's a big difference (51/101/135MHz @ 0.96V vs 750/1500/1950 @ 1.0V), which makes the card's idle temperature rise from 44C with a single-monitor setup to 55C with a dual-monitor setup. That brings increased noise too, and the noise is what bothers me most, not the temperature itself.

    Because there is no way to lower the clocks and temperature, I'm looking for different solutions. The questions are:
    1. If I add another NVIDIA card (something low-end like an NV8400), will it still force 3D clocks on the main card or not?
    2. Does anyone have experience with DisplayLink adapters? How much CPU performance is lost when I use the secondary display just for basic stuff like console output, IRC & IM chat, and downloader apps? The CPU is a Core i5 750, btw.
    3. Any other ideas on how to solve this problem?
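    (For anyone who wants to watch the clocks change as displays are plugged in, here's a minimal sketch polling nvidia-smi from Python. Assumption: your driver's nvidia-smi supports --query-gpu, which only newer driver packages do.)

    Code:
    # Minimal sketch: poll current GPU clocks and temperature via nvidia-smi.
    # Assumes nvidia-smi is on the PATH and supports --query-gpu.
    import subprocess
    import time

    FIELDS = "clocks.gr,clocks.mem,temperature.gpu"

    while True:
        out = subprocess.check_output(
            ["nvidia-smi", "--query-gpu=" + FIELDS, "--format=csv,noheader"],
            text=True,
        )
        print(out.strip())  # one line per GPU, e.g. "51 MHz, 135 MHz, 44"
        time.sleep(5)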
     
  2. IvanIvanovich

    IvanIvanovich будет глотать вашу душу.

    Joined:
    31 Aug 2008
    Posts:
    4,870
    Likes Received:
    252
    Perhaps look into USB monitors? If you're just going to use the 2nd screen as a static display for info apps, they may be a good solution for you. There are several models available from Samsung and others, ranging from 7" to 24", that use USB cable(s) for both video data and power. These don't use the GPU at all - they use the CPU and software instead. You can probably find one in about the same price range as a video card, and possibly even sell off your current 2nd monitor to offset the cost.
     
  3. sb1991

    sb1991 What's a Dremel?

    Joined:
    31 May 2010
    Posts:
    425
    Likes Received:
    31
    Unless you're limited to running a single Nvidia card, the best solution is either to buy a second card or to get an AMD one - the newer ones will run three (or even more) monitors while remaining very cool and quiet. I'm pretty sure the clocks will stay at normal levels if you have two cards each connected to one monitor, though you'll increase temperatures/noise just by having the second card in there.
     
  4. faugusztin

    faugusztin I *am* the guy with two left hands

    Joined:
    11 Aug 2008
    Posts:
    6,953
    Likes Received:
    270
    USB displays = DisplayLink adapters. That's why I asked about the performance loss when using them.

    Today I tested the clocks with a secondary card (a GT240) driving the secondary monitor, and found the answer to my 1st question - the GTX570 nicely downclocked when the second display was connected to the second card.

    And funnily enough, a friend is selling his passive XFX 8400GS (which fits the colour scheme of my case too), so the problem is solved. Of course, having lower clocks with one card would be awesome, but since NVIDIA doesn't make that possible, this is the second-best solution.
     
  5. xinaes

    xinaes What's a Dremel?

    Joined:
    17 Jan 2011
    Posts:
    103
    Likes Received:
    2
    I never knew about this. This is bad news for me since I too use dual screens & want a nice quiet machine.

    I have a spare passive 8600GT knocking about (it might need a cap soldering on...). I really hope it's not going to be necessary to run a whole extra GPU just to keep the noise down, though - it seems like such a waste of power, quite apart from anything else. Also, I'd want to share GL contexts between windows of the same app on different displays; I'm not sure what the implications are there...
     
  6. faugusztin

    faugusztin I *am* the guy with two left hands

    Joined:
    11 Aug 2008
    Posts:
    6,953
    Likes Received:
    270
    It's in the release notes, unfortunately. See page 29:
    http://uk.download.nvidia.com/Windows/266.58/266.58_Win7_WinVista_Desktop_Release_Notes.pdf

    I'm pretty sure that last statement is a lie - or even if they do need higher clocks, they surely don't need to run at full clocks all the time.
     
  7. xinaes

    xinaes What's a Dremel?

    Joined:
    17 Jan 2011
    Posts:
    103
    Likes Received:
    2
    Last edited: 29 Jan 2011
  8. faugusztin

    faugusztin I *am* the guy with two left hands

    Joined:
    11 Aug 2008
    Posts:
    6,953
    Likes Received:
    270
    The 8400GS is going to use 19W at idle (and that is what it will be doing 24/7). Not the lowest figure, but probably still a lot less than the difference between 51/101/135 and 750/1500/1950.
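    (Back-of-envelope, assuming it really does sit at 19W around the clock - just the arithmetic, no electricity price assumed:)

    Code:
    # Yearly energy used by a card idling at 19 W, 24/7.
    IDLE_WATTS = 19
    HOURS_PER_YEAR = 24 * 365

    kwh_per_year = IDLE_WATTS * HOURS_PER_YEAR / 1000
    print(round(kwh_per_year, 1))  # ~166.4 kWh per year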
     
  9. *brian*

    *brian* What's a Dremel?

    Joined:
    23 Jul 2010
    Posts:
    47
    Likes Received:
    1
    I never knew this either. I have two (non-identical) graphics cards and three displays. One card (the one with two displays) does run a little on the warm side when folding, so I think I'll try moving the second display from that card to the other card and see if that helps.
     
  10. faugusztin

    faugusztin I *am* the guy with two left hands

    Joined:
    11 Aug 2008
    Posts:
    6,953
    Likes Received:
    270
    127W with the GTX570 driving one display (downclocked to 51/101/135), 8400GS running but disconnected.
    139W with the GTX570 driving one display (downclocked to 51/101/135) and the 8400GS driving the second.
    178W with the GTX570 driving both displays (running at 750/1500/1950 all the time).

    That is a 39W difference between one card and two cards driving the two displays, and a 51W difference between one and two displays on the GTX570 alone. Those 51W mean extra heat to dump, which means noisier cooling.

    And all I did was unplug the DVI cable or change where it was plugged in.
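    (The same numbers as a trivial sanity check - the values are my wall-socket measurements above:)

    Code:
    # Wall-power deltas from the measurements above (watts at the socket).
    single_display = 127  # GTX570 downclocked, 8400GS running but unused
    two_cards      = 139  # GTX570 downclocked + 8400GS driving display 2
    one_card_dual  = 178  # GTX570 at full 3D clocks driving both displays

    print(one_card_dual - two_cards)       # 39 W saved by using two cards
    print(one_card_dual - single_display)  # 51 W cost of the second display
    print(two_cards - single_display)      # 12 W overhead of the 8400GS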
     
  11. Parge

    Parge the worst Super Moderator

    Joined:
    16 Jul 2010
    Posts:
    13,022
    Likes Received:
    618
    I'd forgotten about this.

    I noticed this in a BIG way when I had a GTX 480. It was far louder and hotter when I connected a second monitor - so much so that I had to remove the second monitor whenever I wasn't using it - and, to be honest, I was glad to see the back of that problem.

    I have an ATI card now. I really don't see why Nvidia's cards need to raise the clock speed in order to drive a second monitor.
     
  12. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,992
    Likes Received:
    711
    I remember my ATI 5870 went to low 3D clocks (400MHz core) when a 2nd monitor was connected, so the nVidia driver readme statement wasn't a lie.

    Although that was driving 2560x1440 + 1920x1200 - a pretty impressive number of pixels.

    The solution is to run a 2nd card. Any old dedicated 2nd card will be better than a USB adaptor. USB adaptors are horrible and should be killed off - the guy sitting opposite me at work has one, and it's unbelievably slow.
     
  13. faugusztin

    faugusztin I *am* the guy with two left hands

    Joined:
    11 Aug 2008
    Posts:
    6,953
    Likes Received:
    270
    Low 3D clocks are still low 3D clocks, not full 3D clocks - that is the main issue. My GTX570 pushes the full 750/1500/1950 at 3D core voltage from the moment I connect the secondary display to it. As you can see, even with the 12W overhead of the 8400GS, the two-card setup still draws 39W less than one card driving both displays.
     
  14. smc8788

    smc8788 Multimodder

    Joined:
    23 Apr 2009
    Posts:
    5,974
    Likes Received:
    272
    FWIW, it's not just Nvidia cards - AMD cards are the same. My 6870 still runs at full 3D speeds (900MHz) when two displays are connected, and only clocks down to 300MHz with one display connected.
     
  15. faugusztin

    faugusztin I *am* the guy with two left hands

    Joined:
    11 Aug 2008
    Posts:
    6,953
    Likes Received:
    270
    The 5870 did this at 400/1200 dual-display clocks instead of its 157/300 single-display clocks (which caused the green-dots issue on my HD5870, btw). The issue with NVIDIA is that they use full 3D clocks (at least in my case they do), and that makes a huge difference.
     
  16. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,992
    Likes Received:
    711
    Yes, you said that above smc's reply. And smc is saying the new generation of ATI cards also uses full 3D clock speeds - which again confirms that nVidia's statement was not a lie.

    Just get a 2nd card and you should be fine.
     
  17. faugusztin

    faugusztin I *am* the guy with two left hands

    Joined:
    11 Aug 2008
    Posts:
    6,953
    Likes Received:
    270
    As you can read above, I already got one (an XFX 8400GS).
     
  18. xinaes

    xinaes What's a Dremel?

    Joined:
    17 Jan 2011
    Posts:
    103
    Likes Received:
    2
    Would it not be possible to use the IGP for the secondary display, assuming the motherboard chipset supports it?
    Edit: just realised faugusztin already addressed that on the SPCR thread.
    As I've said, though, I'd favour a solution that kept GPU resources common to all displays.
     
    Last edited: 1 Feb 2011
  19. faugusztin

    faugusztin I *am* the guy with two left hands

    Joined:
    11 Aug 2008
    Posts:
    6,953
    Likes Received:
    270
    Additional info:
    http://www.legitreviews.com/article/1461/19/

    Even if two monitors run at the same resolution, it still matters whether they are the same model from the same manufacturer. If they are the same model from the same manufacturer (for example, two HP LP2475w panels), the card downclocks. If the resolution or the display model differs (for example one HP LP2475w and one BenQ T2210HD, as in my case), the card runs at 3D clocks.
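    (In code terms, the rule seems to boil down to something like this. Just a sketch of the behaviour I'm observing - not NVIDIA's actual driver logic, and the field names are made up:)

    Code:
    # Sketch of the observed downclocking rule - not actual driver logic.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Display:
        manufacturer: str
        model: str
        width: int
        height: int

    def will_downclock(a: Display, b: Display) -> bool:
        # Observed: idle clocks only when both displays are the same
        # model from the same manufacturer at the same resolution.
        return a == b

    hp = Display("HP", "LP2475w", 1920, 1200)
    benq = Display("BenQ", "T2210HD", 1920, 1080)
    print(will_downclock(hp, hp))    # True  - two identical HPs downclock
    print(will_downclock(hp, benq))  # False - mixed setup stays at 3D clocks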
     
  20. xinaes

    xinaes What's a Dremel?

    Joined:
    17 Jan 2011
    Posts:
    103
    Likes Received:
    2
    Hmmm. I wonder if, for example, a ViewSonic VP2365wb would play nicely with my NEC EA231WMi? Same panel...
     
