So, I have a relatively strong comprehension of how the technology behind shutter-based stereoscopy works, and I've been waiting for it to hit the consumer level for a while now. But I'm still curious: if the source output is not a steady 120FPS, don't we just return to the flickery madness of past stereoscopic hardware solutions (well, to some extent anyway)? I know, I know: <120FPS on a 120Hz display is still better than the 60Hz/30FPS of most panels on the market.

I recently bought a 25.5" TN for gaming purposes, with a plan to buy a 24" or 30" IPS for photo editing down the road. But I'm now considering returning the display I just bought and ordering a WSXGA+ 120Hz panel and 3DVision. At that resolution, with my current hardware, I could pull around 100FPS in many games with settings bumped down, but that's still not enough to deliver 60FPS per eye. Should I wait, upgrade my hardware, sell my panel later (bleh), and only then go for 3DVision? Or am I missing something here? I can't imagine Nvidia expects everyone to output modern games at a futuristic frame rate to make use of this awesome tech. Or then again, I honestly wouldn't be all that surprised...
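For what it's worth, here's the back-of-envelope arithmetic I'm worried about, sketched in Python. The assumption (mine, not anything official from Nvidia) is that the shutter glasses alternate eyes every refresh, so each eye sees half the panel's refreshes, and the GPU's frames likewise get split across the two eyes:

```python
def per_eye_fps(display_hz: float, source_fps: float) -> float:
    """Rough per-eye frame rate for active-shutter stereo.

    Each eye only sees every other refresh, so the panel caps each eye
    at display_hz / 2. Unique GPU frames are also split between eyes,
    so the source caps each eye at source_fps / 2.
    """
    return min(display_hz / 2, source_fps / 2)

print(per_eye_fps(120, 120))  # 60.0 -> the ideal case
print(per_eye_fps(120, 100))  # 50.0 -> ~100FPS still short of 60 per eye
```

So by that (admittedly simplified) model, my ~100FPS works out to roughly 50FPS per eye, which is exactly the shortfall I'm asking about.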
Read this Anandtech article first. It's not just that the graphics power isn't there for modern games yet; game support is also pretty limited. Wait it out. In two or three years nVidia might have their support down, and LCD quality will have jumped immensely (120Hz and LED backlighting will likely be common and relatively cheap, so you might as well wait on the monitor too). - Diosjenin -