I decided to update the rig with a 4930K, so I thought a re-review was in order to examine the apparent CPU bottleneck. Unfortunately, I then went off the rails and picked up a new 4K monitor. Since I always like to see exactly how new hardware performs, I thought I'd share the results.

The new rig:

The new mobo requires that I put all three GPUs together, but this isn't too much of an issue since they now run cooler - I guess this is down to reduced framerates. I did have to slightly reduce the overclock on the GPUs (originally they were running at 1150MHz core, 7100MHz memory) as they were unstable (something I found out after running all the benchmarks). I've overclocked the 4930K to 4.2GHz with hyperthreading disabled and only a couple of slight voltage increases (Vcore @ 1.31V) that are well within Intel's recommended specs - pushing any harder required massive increases in voltage that I decided weren't worth it for now.

The monitor

The Asus PQ321QE is on the right and a Dell 3007WFP-HC is on the left. The Dell is now approximately five years old, so has apparently faded. I still love it though! The PQ321QE was comparatively easy to set up, as the drivers picked it up straight away (I am using the included DisplayPort cable). I only had to switch on 60Hz mode, as by default the monitor itself is set to 30Hz. However, the monitor isn't as easy to use, for the following reasons:

1) Currently the monitor cannot display anything until Windows has reached the login screen. In my case, I even had to have a secondary monitor, otherwise I would get no display at all. So a secondary monitor is essential in my opinion.

2) Activating vsync is a necessity to stop glitches. Tomb Raider had weird, juttery performance (something the FPS values don't show) until vsync was activated. This also solved the TressFX problems I was having, but the performance hit brought the FPS down to borderline playable.
3) The monitor behaves like a monitor attached to a laptop: if it is off, the PC doesn't know it exists. This essentially means that it seems to 'reinstall' itself every time Windows starts up, making all the screens blank out for a few seconds before the login screen appears.

4) The drivers actually treat the display as two. This leads to odd little occurrences - e.g. when reaching the login screen for Windows, the image briefly alternates between each half of the monitor before 'stabilising'. Further to this, I actually had half the screen crash when playing Tomb Raider, though this was with the unstable overclock settings.

5) If the PC crashes, the 4K monitor must be disconnected, the PC rebooted to Windows, the computer turned off, the 4K monitor reattached and then the PC started again to 'reset' the drivers. This will be a pain when finding new overclocks. At one point I actually had to reinstall both the GPU and sound card drivers after one crash.

6) Drivers: There is only one set of WHQL drivers from Nvidia that supports 4K, those being 327.23. I had a brief go at using the latest beta drivers (331.40), but they reduced performance and would crash after a few seconds of running the benchmarks.

Ultimately, I think that to run a 4K monitor at the moment, the user must have a relatively good understanding of PCs.

Results

Unreal Engine 3-based games comfortably run at 60Hz. Arkham City looks awesome.

Actual game usage

All the reviews I've seen say that antialiasing becomes unnecessary at such a high resolution, and I'm inclined to agree (I tend to sit about a metre away). Only if you start peering closely can you make out individual pixels. For this reason, it is possible to get better performance out of a 3840x2160 display than a 2560x1600 display, since you can turn antialiasing off at 4K where the lower resolution still benefits from it.

Pictures

Batman: Arkham City
Crysis
Crysis 3
Doom 3
Tomb Raider
Supreme Commander
Heaven 2
Total Annihilation
Minesweeper

Happy?

Yup, but definitely poorer!

So what now?
Naturally I'm going to replay the Crysis series (though I'll probably skip Crysis 2). Also, due to the reduced temperatures, I may re-examine overvolting the GPUs. Heh, and to think this all happened because one of my 660 Tis started showing artefacts.
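On the antialiasing point above, here's a quick angular-size sanity check (a rough sketch in Python - the panel sizes and the ~1 arcminute visual acuity figure are my own assumptions, not from the thread):

```python
import math

def arcmin_per_pixel(diag_in, px_w, px_h, view_m):
    """Angular size of one pixel, in arcminutes, at a given viewing distance.
    Assumes a flat panel and square pixels - fine for a rough check."""
    diag_px = math.hypot(px_w, px_h)
    pitch_mm = diag_in * 25.4 / diag_px          # physical pixel pitch
    return math.degrees(math.atan(pitch_mm / (view_m * 1000))) * 60

# PQ321QE: 31.5", 3840x2160, viewed from ~1 m as described in the post
uhd = arcmin_per_pixel(31.5, 3840, 2160, 1.0)
# Dell 3007WFP-HC: 30", 2560x1600, same distance
wqxga = arcmin_per_pixel(30.0, 2560, 1600, 1.0)

print(f"4K pixel:    {uhd:.2f} arcmin")    # well below the ~1 arcmin acuity figure
print(f"WQXGA pixel: {wqxga:.2f} arcmin")  # also small, but noticeably closer to it
```

Both pixels are small at a metre, but the 4K pixel is roughly 30% smaller, which is consistent with being able to drop antialiasing at 4K while still wanting it at 2560x1600.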
The link below goes to answer 12. There's a link to a BIOS update that should sort out the boot problems on your new monitor: https://forums.geforce.com/default/...king-at-3840x2160-60hz-on-nvidia-c/?offset=15
Thanks dude. Just logging my experiences to add to our collective knowledge! To be honest, it doesn't bother me, as I want to use the 3007 as a secondary monitor anyway. I'm going to adopt the 'if it ain't broke, don't fix it' approach for the moment.

I've got through Crysis and took a few screenshots (see below). I urge those interested to look them over - the alien caverns look like an artist's final rendering of a concept rather than 'just' screenshots of a game. Time to go outside into the real world - I may have been up all night and not eaten much. Then I can crack on with Warhead.
This is normal for most 4K monitors: 4K@30Hz is achievable over a single HDMI 1.4/DP 1.2 connection, but 60Hz requires double the bandwidth, so it needs two streams multiplexed together. I guess this is due to the nature of the display. Currently, as there isn't enough cable bandwidth, it has to be treated as two displays multiplexed together. I will ask our LCD team if there's anything they can do in the firmware before the drivers kick in, but it's unlikely, as each internal display driver drives half the panel. To make anything central would still need both controllers.

Can you use the BIOS with it? What about adaptive vsync?

This is just how Windows handles the display, unfortunately. Nvidia drivers for 4K still need work, it seems. An unfortunate side effect of early adoption. Plus you're running multi-GPU, which - while necessary for FPS - only adds complication to the formula.
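The bandwidth argument works out roughly like this (a back-of-the-envelope sketch in Python; the link figures are approximate effective video bandwidths after 8b/10b encoding, and real signals also carry blanking and protocol overhead, so treat the payload numbers as lower bounds):

```python
def payload_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed video payload in Gbit/s (active pixels only)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

uhd_30  = payload_gbps(3840, 2160, 30)   # ~6.0 Gbit/s
uhd_60  = payload_gbps(3840, 2160, 60)   # ~11.9 Gbit/s
tile_60 = payload_gbps(1920, 2160, 60)   # one half of the panel, ~6.0 Gbit/s

# Rough effective video bandwidth of the links mentioned in the thread
HDMI_1_4 = 8.16   # Gbit/s
DP_1_2   = 17.28  # Gbit/s (HBR2, 4 lanes)

print(f"4K@30Hz needs ~{uhd_30:.1f} Gbit/s  -> fits one HDMI 1.4 link")
print(f"4K@60Hz needs ~{uhd_60:.1f} Gbit/s -> exceeds one HDMI 1.4 link")
print(f"each 1920x2160@60 half needs ~{tile_60:.1f} Gbit/s -> two tiles fit")
```

Which matches the behaviour described: 30Hz fits one stream, while 60Hz is delivered as two half-panel tiles that the OS sees as two displays.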
No, but my secondary screen is there, so I'm not worried.

That's what I'm using.

Not a problem - it's only for a couple of seconds anyway. I expected teething issues, but I haven't encountered any insurmountable problems. The monitor is excellent at any rate, Bindi! Please give my best wishes to the team at Asus!

I should have mentioned that the benchmarks are with everything set to maximum. Tomb Raider can achieve minimums of ~76FPS without antialiasing and TressFX. Also, older games run like a dream. Mirror's Edge currently has problems with SLI, but can run very well off a single 780, for example.
Turning down a few (often pointless) settings can make the vast majority of games playable at 4K-like resolutions (3600x1920 in my case) on a single overclocked high-end GPU (7970 and above). Crysis 3 and no doubt BF4 will require a few settings to be dropped, but they can still be enjoyed. Absolute max settings, of course, take us into 4 x Titan territory.

Nice review Pete. I love the idea of a 4K screen, but I also love having multiple monitors (I work on them all the time). Hmmm, 4K Eyefinity.... I think even four Titans would just curl up into a ball and start rocking.
A number of places are commenting the same about the Nvidia drivers - if you want 4K, then it's AMD for now.
Seems to be working fine!

Absolutely right. I prefer to max out the visuals, but if I drop them only a little, everything starts flying.

In regards to 4K Eyefinity, check this out!

I haven't heard any reason to prefer AMD or Nvidia for 4K.
Just thought I'd post some screenshots from Crysis 3:

I did have a go at overvolting the GPUs, but +38mV only resulted in an additional +50MHz on the core, so I didn't see the point when I already had a decent overclock at stock voltage.
What's wrong with it? If you have a better site, feel free to suggest it and I'll use it.

Nope. I had a trio of 660 Tis.

A few more screenshots, this time from Batman: Arkham City. Got to complete it before Origins comes out!
I gave it a quick try but it resized my images (despite having selected 'do not resize'). Imageshack seems friendlier to use as well. More Arkham City: