In case you haven't seen it yet, the guys over at Anandtech got to do some very light benchmarks on a 4K monitor. And well... the results are rather staggering. In 2 of the 3 games they tested (not a very good sample size, but it's all we've got to work with), it took 4 GTX Titans to barely scrape a 60fps average.

You can see from the screenshot of the monitor as well that scaling is going to be an issue. Those icons are the size of your fingernail, and the text is as small as ants. I'm not sure about the rest of you, but while I welcome the increase in resolution, I'm pretty scared of what it's going to do to all the people who have no clue what they're getting themselves into. I can almost see the complaints already...

"I got a new monitor and now I can't see anything on my screen and I can't play any of my games!"
"Did you get a 4K monitor?"
"Ya, my old one was 5 years old and pooped out on me, and the sales rep said this was the newest thing."
"Well, here's your solution then: go return your monitor and get a 1080p one until Windows decides to support scaling for large resolutions and graphics cards catch up."
I love the graphic showing two 1500 W PSUs used to power the 4 Titans. Here I was last night griping about how small my case was (Antec Three Hundred) while rearranging things to fit a spare GPU for dedicated PhysX. 4K resolution seems a bit overboard, but then again, I'm more than happy running 1920x1080 on a 27" monitor.
Damn! I will get a 4K screen for photo editing... when they get realistically priced, of course. Looks like I won't be gaming at native res then.
Even videos are pretty taxing on hardware. Got a GoPro Hero 3 the other day and my brother's laptop (1st gen i3) can't play 4K at all.
Who cares about gaming - when is 4K pron going to be out? But seriously, I can hear them now: "I got two Titans and it still runs like ****".
Just turn down a few settings (often ones that make no noticeable image quality difference*) and the GPU requirement will drop significantly. My resolution is only slightly lower than this 4K monitor's, and I can game happily on a single overclocked GTX 780. A single overclocked HD 7970 was also enough for the most part, but the 780 offers around 25% more performance, which makes everything better.

I am currently doing a big benchmark/review/writeup comparing a 7970 GHz (OC) and a GTX 780 (OC) in Eyefinity/Surround, and the results so far are very interesting. For example:

Crysis 3 - 3600x1920, High Settings (Very High Textures, Shadows on Medium), SMAA 1x
7970 @ 1200/6400 - 38.9 FPS avg (27 min)
GTX 780 @ 1189/7000 - 49.7 FPS avg (34 min)

Perfectly playable (on the 780) and it looks great.

*Sleeping Dogs is a great example of this. Everything maxed with "Normal" AA leads to a framerate in the mid-to-high 70s; increase this to the "Extreme" AA setting and the framerate tumbles into the low 20s. I can't tell a sodding difference between the two when actually gaming - apart from the fact that the latter is completely unplayable.
Not very surprising, as it has the same number of pixels as 4x 1080p monitors. Edit: As mentioned by xaser, some settings improve the visual quality and don't cost many resources. Others affect performance a lot but don't add much (high levels of AA, for example, or some shadow settings). It all comes down to balancing things to maximise the perf vs. visual quality ratio.
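The "4K = four 1080p panels" pixel maths does check out; a quick sanity check (assuming the 3840x2160 UHD flavour of "4K" that these monitors use):

```python
# Pixel counts: UHD "4K" vs 1080p.
uhd = 3840 * 2160  # 8,294,400 pixels
fhd = 1920 * 1080  # 2,073,600 pixels

print(uhd // fhd)  # 4 - exactly four 1080p screens' worth of pixels
```

So the GPU is shading four times the pixels, which lines up with quad-Titan setups only just managing what a single card does at 1080p.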
People are getting so hung up on fps/performance they are forgetting what's important here: what do games actually look like at 4K? My guess is... pretty average, considering how far you'd be sitting from the screen. How many people sit close enough to a 30" (or bigger) display to notice all the sumptuous extra detail crammed into almost 150 px/in? (That's right, one hundred and fifty pixels per inch, compared to the likes of the U271x series, which have "only" 108.) Even considering these monitors for desktop gaming is silly - if you sit far enough back for comfort, you're effectively back to 2K resolution, LOL.
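Those pixel-density figures are easy to verify; a minimal sketch, assuming a 30" 4K panel and using a 27" 2560x1440 screen to stand in for the U271x-class monitors mentioned:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2160, 30)))  # ~147 - "almost 150" for a 30" 4K panel
print(round(ppi(2560, 1440, 27)))  # ~109 - the "only 108-ish" 27" 1440p class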
One thing I am fascinated by on those graphs is how (relatively) well the single 7950 does. Obviously completely unplayable in most cases, but is this an illustration of 3GB vs 2GB?
Unfortunately they don't - if you read the article, it says how bad the microstutter was with that setup. Pure FPS numbers are worthless, really.
4K gaming on a 30Hz monitor would not be for me. If you don't play anything but MMO and strategy games it might be OK, but play a few FPS games and it's going to be unplayable.
You'll need a Titan foursome to watch pron - if you look closely you can actually see the STDs being transmitted in Ultra HD. TOO MUCH INFORMATION.
I think for once Anandtech didn't put much thought into this. Why exactly would you be gaming at that resolution with AA turned up to 11? I'm at 2560x1440 and use 2x, which is perfectly fine, so at 4K you could use the same or perhaps not need it at all. That change alone will save a whole pile of GPU grunt, as AA is expensive. I reckon once that's done, 4K can be handled comfortably by a pair of GTX 780s - on everything except Metro 2033, which is a terribly coded game (again, why use that as a benchmark?). Sure, it's expensive today, but by the time the screens hit mainstream you'll have that performance in a single card.
I have started turning off AA in quite a few games, simply because most of the time I don't see any difference in the game but do see a difference in performance.
Like I've written in the comments over at Anand... 4K resolution is total bollocks and offers no real improvement over the current 1080p standard. High-density screens are for mobile devices, where you look at them from 30cm away - and even there, they're already hitting the same wall that 1080p hit for TVs.

Usually you sit some 70cm away from your PC screen, or, more ergonomically, the length of your arm. At this distance, though, you won't really see single pixels anymore on a 22-24" 1080p screen, especially when talking about movies or games instead of still images. For a TV in the living room, the rule of thumb is 1m of distance for every 15" of diagonal. So you'll sit at some 3m distance when looking at a 40" TV, and at that distance you won't notice much difference between 1080p and 4K. So the only thing a 4K resolution really does is unnecessarily increase bandwidth and storage space.

For games... yeah... look at the results there. A quad-Titan setup to get 60FPS in Metro 2033 or Sleeping Dogs. This is totally the wrong direction, imho. We want hardware that can drive 4K resolutions, but at the same time we all know that energy isn't getting any cheaper. And to have a PC that sucks 1kW while playing a video game :cough: sorry, but that's just stupid. Leave it at 1080p and get me hardware that allows me to play games like Metro at max settings with only 100-150W for the complete system, i.e. a 35W TDP CPU + 75W TDP GPU (75W is the maximum power draw over a PCIe x16 slot, so no extra PCIe power connector needed).
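The living-room rule of thumb quoted above (1m per 15" of diagonal) is trivial to plug numbers into; a quick check of the 40" example:

```python
# Viewing-distance rule of thumb from the post: ~1 m per 15" of TV diagonal.
def viewing_distance_m(diagonal_in):
    return diagonal_in / 15.0

print(round(viewing_distance_m(40), 1))  # ~2.7 m, i.e. "some 3 m" for a 40" set
```

Whether that distance actually hides the 1080p-vs-4K difference depends on your eyesight, but the arithmetic behind the "some 3m" figure holds up.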
Erm, don't we all like a good brag about what our rigs can do on max settings? AA will always be needed, because the human eye easily sees the issues without it. Anandtech also just used the same settings they normally use in games testing. What's been missed are the other bits of information they gave: microstutter on AMD CrossFire systems, and DiRT 3 being pretty much unplayable due to the 30Hz screen. 30Hz is the big put-off for me - I play a lot of driving and FPS games that would be totally ruined by the image tearing all over the place. Still waiting on a 60Hz 4K screen for reasonable money; I could not care less about 4K at 30Hz. That's for films and normal TV, not for gaming on.
I was referring to the single card solution rather than Crossfire which we all know is borked. I'm sure that architecture and drivers (for multi card in particular) will need to come along a bit more before 4k gaming becomes popular. Still nice to see the pure horsepower of a single, cheap 7950 at silly resolutions though, even if playability is not a factor.