Hi, as many of you know, NVIDIA cards run at 3D clocks when you use 2 displays on one card. In my case it's a big difference (51/101/135MHz@0.96V vs 750/1500/1950MHz@1.0V), which makes the card's idle temperature rise from 44C with a single monitor setup to 55C with a dual monitor setup - that brings increased noise too, and the noise is what bothers me most, not the temperature itself. Because there is no way to lower the clocks and temperature, I'm looking for different solutions. The questions are:
1. If I add another NVIDIA card (something low-end like an NV8400), will it still force 3D clocks on the main card or not?
2. Do any of you have experience with DisplayLink adapters? How much CPU performance is lost when I use the secondary display just for basic stuff like console output, IRC & IM chat, and downloader apps? The CPU is a Core i5 750, btw.
3. Any other ideas how to solve this problem?
Perhaps look into USB monitors? If you're just going to use the 2nd screen as a static display for info apps, it may be a good solution for you. There are several models available from Samsung and others, ranging from 7" to 24", that use USB cable(s) for both video data and power. These don't use the GPU at all; they use the CPU and software instead. You can probably find one in about the same price range as a video card, and possibly even sell off your current 2nd monitor to offset the cost.
Unless you're limited to running a single NVIDIA card, the best solution is either to buy a second card or to get an AMD one - the newer ones will run 3 (or even more) monitors while remaining very cool and quiet. I'm pretty sure the clocks will stay at normal levels if you have two cards, each connected to one monitor, though you'll increase temperature/noise just by having the second card in there.
USB displays = DisplayLink adapters. That's why I asked about the performance loss when using them. Today I tested the clocks with a secondary card (GT240) for the secondary monitor and found the answer to the 1st question - the GTX570 nicely downclocked when the second display was connected to the second card. And funnily enough, a friend is selling his passive XFX 8400GS (which fits the color scheme of my case too), so the problem is solved. Of course, having lower clocks with one card would be awesome, but since NVIDIA doesn't make that possible, this is the second best solution.
I never knew about this. This is bad news for me since I too use dual screens & want a nice quiet machine. I have a spare passive 8600GT knocking about (might need a cap soldering...). I really hope it's not going to be necessary to run a whole extra GPU just to keep noise down though. Seems like such a waste of power quite apart from anything else. Also I'd want to share GL contexts between windows of the same app on different displays, not sure what the implications there are...
It's in the release notes, unfortunately. See page 29: http://uk.download.nvidia.com/Windows/266.58/266.58_Win7_WinVista_Desktop_Release_Notes.pdf I'm pretty sure the last statement is a lie - or even if they do need higher clocks, they surely don't need to run at full clocks all the time.
Hmmm. Am I right in thinking this is an aspect in which AMD is superior? Might mean it's time for a bit of a rethink... p.s. I started a thread about this on SPCR http://www.silentpcreview.com/forums/viewtopic.php?f=19&t=61404
The 8400GS is going to use 19W at idle (and that's what it will draw 24/7). Not the lowest figure, but probably still a lot less than the difference between 51/101/135 and 750/1500/1950.
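To put that constant 19W in perspective, here's a quick sketch of what it adds up to over time (pure arithmetic on the idle figure quoted above; it assumes the machine really does run around the clock):

```python
# Idle draw of the 8400GS quoted above (watts), assumed constant 24/7
idle_w = 19

kwh_per_day = idle_w * 24 / 1000
print(round(kwh_per_day, 3))        # 0.456 kWh/day
print(round(kwh_per_day * 365, 1))  # 166.4 kWh/year
```

Still well under what the main card burns when forced to full 3D clocks for a second display.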
I never knew this either. I have two (non-identical) graphics cards and three displays. One card (the one with two displays) does run a little on the warm side when folding so I think I'll try swapping the second display on that card to the other card and see if that helps me.
127W with the GTX570 driving one display (downclocks to 51/101/135), 8400GS installed but disconnected. 139W with the GTX570 driving one display (downclocks to 51/101/135) and the 8400GS driving the second. 178W with the GTX570 driving both displays (runs at 750/1500/1950 all the time). That is a 39W difference between one card and two cards driving two displays, and a 51W difference between one and two displays on a single card. Those 51W mean more heat to dissipate, which means noisier cooling. And all I did was unplug the DVI cable or change where it was plugged in.
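The savings work out as follows (a quick sketch using the wall-power figures measured above):

```python
# Wall-power readings from the post (watts)
one_card_two_displays = 178  # GTX570 driving both displays at full 3D clocks
two_cards = 139              # GTX570 (downclocked) + 8400GS, one display each
one_display = 127            # GTX570 alone, downclocked; 8400GS disconnected

# Overhead of using the 8400GS for the second display
print(two_cards - one_display)              # 12 W
# Saved by splitting the displays across two cards
print(one_card_two_displays - two_cards)    # 39 W
# Total penalty of dual displays on a single card
print(one_card_two_displays - one_display)  # 51 W
```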
I'd forgotten about this. I noticed it in a BIG way when I had a GTX 480. It was far louder and hotter when I connected a second monitor - so much so that I had to remove the second monitor whenever I wasn't using it - and to be honest, I was glad to see the back of that problem. I have an ATI card now. I really don't see why NVIDIA's cards need to raise the clock speed in order to drive a second monitor.
I remember my ATI 5870 went to low 3D clocks (400MHz on the core) when a 2nd monitor was connected, so the nVidia driver readme statement wasn't a lie. Although that was driving 2560x1440 + 1920x1200, a pretty impressive number of pixels. The solution is to run a 2nd card. Any old 2nd dedicated card will be better than a USB adapter. USB adapters are horrible and should be killed off - the guy sitting opposite me at work has one, and it's unbelievably slow.
Low 3D clocks are still low 3D clocks, not full 3D clocks. That is the main issue - my GTX570 pushes the full 750/1500/1950 at 3D core voltage from the moment I connect the secondary display to it. As you can see, even with the 12W overhead of the 8400GS, the two-card setup is still 39W below single-card use.
FWIW, it's not just Nvidia cards - AMD cards are the same. My 6870 still runs at full 3D speeds (900MHz) when two displays are connected, and only clocks down to 300MHz with one display connected.
The 5870 did this with 400/1200 dual-display clocks instead of the 157/300 single-display clocks (which resulted in the green-dots issue on my HD5870, btw). The issue with NVIDIA is that they use full 3D clocks (at least in my case they do). And that makes a huge difference.
Yes, you said that above smc's reply. And smc is saying ATI's new generation also uses full 3D clock speeds, which also confirms NVIDIA's statement was not a lie. Just get a 2nd card and you should be fine.
Would it not be possible to use the IGP for the secondary display, assuming the motherboard chipset supports it? Edit: just realised faugusztin already addressed that on the SPCR thread. As I have said, I'd favour a solution where GPU resources were still common to all displays, though.
Additional info: http://www.legitreviews.com/article/1461/19/ If you have two monitors running at the same resolution, it matters whether or not they are the same model from the same manufacturer. If they are the same model from the same manufacturer (for example, two HP LP2475w units), the card downclocks. If the resolution or display model differs (for example one HP LP2475w and one Benq T2210HD, like in my case), the card runs at 3D clocks.
Hmmm. I wonder if, for example, a ViewSonic VP2365wb would play nicely with my Nec EA231WMi? Same panel...