
Hardware Nvidia G-Sync Review

Discussion in 'Article Discussion' started by Meanmotion, 23 Dec 2013.

  1. Meanmotion

    Meanmotion bleh Moderator

    Joined:
    16 Nov 2003
    Posts:
    1,652
    Likes Received:
    19
  2. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    Just think if the big three could actually work together and share some technological advancements. I can't help thinking all this proprietary stuff like Mantle, G-Sync, etc. is holding back widespread adoption of some advancements, to the detriment of us end users.
     
  3. ZeDestructor

    ZeDestructor Minimodder

    Joined:
    24 Feb 2010
    Posts:
    226
    Likes Received:
    4
    AMD claims Mantle is open, but I have yet to see any docs, so as far as I'm concerned it's proprietary. Plus, it will be very heavily optimised for AMD GPUs, in its first incarnation at least, so it will take time for Nvidia to catch up. Add to that that Nvidia is very, very confident in its OpenGL implementations, and you can see why they have been generally apathetic so far.

    G-Sync, while proprietary to Nvidia right now, should be fairly easy for AMD to implement as well should Nvidia open the spec: it uses a standard DisplayPort interface together with an off-the-shelf FPGA (running admittedly very custom logic) in the monitor. I do believe Nvidia will license it fairly soon, much like it eventually licensed SLI to non-Nvidia chipsets back in the day. If that fails, AMD will implement their own version, and then after a generation or two of foot-dragging, they'll make it a damn standard and we'll all move on.

    All in all, there's nothing so proprietary that it can't be opened up right now. Just give them time to make money off exclusivity and then watch the dust settle.
     
  4. AlienwareAndy

    AlienwareAndy What's a Dremel?

    Joined:
    7 Dec 2009
    Posts:
    3,420
    Likes Received:
    70
    Nvidia will never open up anything. They claw onto these little tech tidbits with their corporate grip.

    They could have opened up PhysX ages ago, but no. It's a shame, because PhysX could have been so much more than just a game every year or so. AMD opened up TressFX; heck, it even ran better on my GTX 670s in SLI than on my 7990.

    They're also assholes when it comes to 3D Vision. I bought a TV recently that does active 3D. Wired it up to my 670 SLI rig all excited, then found out they want $30 for the privilege of using it on my bloody telly.

    So I downloaded the demo and it was locked to 26fps (even the full version is). So instead of opening up 3D Vision and letting companies like, for example, Toshiba (who make my set) implement it for the full 60Hz, they instead charge $30 for it and it's derped.

    The only thing Nvidia open up is Jen-Hsun Huang's asshole so he can fart once in a while.

    It's a shame really, because if they were less focused on greed, these silly little tech tidbits they come out with once in a while could actually be something good, instead of something you have to pay for (hint: you don't bother).
     
  5. Deders

    Deders Modder

    Joined:
    14 Nov 2010
    Posts:
    4,053
    Likes Received:
    106
    What I want to know is whether it's going to be any smoother than simply enabling triple buffering (for free via D3DOverrider, which I realise only seems to work on 32-bit exes), and whether the difference would be worth paying the £100, plus buying a new monitor that may have worse attributes than the one I already own.

    I monitor the frametime amongst other things in-game with RivaTuner Statistics Server, and from what I can see D3DOverrider does the same job.

    I can see how this may benefit online gamers with a twitch style of playing when compared to triple buffering, if it is just as responsive as not having V-sync enabled.
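
    Roughly, as I understand it, the difference looks like this (a toy sketch in C; every name here is invented, it's not any real API):

        /* Toy model of double- vs triple-buffering under V-sync.
           Nothing here is a real API; every name is invented. */
        typedef struct { unsigned char pixels[1]; } Buffer;
        static Buffer bufs[3];
        static Buffer *front = &bufs[0], *back = &bufs[1], *spare = &bufs[2];

        static void render_into(Buffer *b) { (void)b; /* draw the frame */ }
        static void wait_for_vblank(void)  { /* block until the fixed 60Hz tick */ }
        static void mark_newest(Buffer *b) { (void)b; /* vblank will display this one */ }

        /* Double buffering + V-sync: a frame that misses the refresh stalls
           the GPU until the next tick, so e.g. 50fps collapses to 30fps. */
        void double_buffered(void) {
            for (;;) {
                render_into(back);
                wait_for_vblank();
                Buffer *t = front; front = back; back = t;  /* swap */
            }
        }

        /* Triple buffering: the GPU always has a spare buffer, so it never
           stalls; each vblank simply shows the newest complete frame. The
           cost is that what you see can be up to one refresh interval old,
           which is the latency G-Sync is supposed to avoid. */
        void triple_buffered(void) {
            for (;;) {
                render_into(spare);
                Buffer *t = spare; spare = back; back = t;
                mark_newest(back);
            }
        }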
     
  6. law99

    law99 Custom User Title

    Joined:
    24 Sep 2009
    Posts:
    2,390
    Likes Received:
    63
    I guess this is something we'll need to use to really appreciate, mostly as I look at the V-sync-on video and think it's good enough. Same as lots of things, I suppose. Thanks for the write-up.

    Really would be a shame if they can't let AMD in on this one. Unless there's going to be a pluggable adapter for relevant cards in an easy-to-access port on supporting monitors. Who knows. Companies want to make money... it is the point. :s
     
  7. AlienwareAndy

    AlienwareAndy What's a Dremel?

    Joined:
    7 Dec 2009
    Posts:
    3,420
    Likes Received:
    70
    FTFY
     
  8. Maki role

    Maki role Dale you're on a roll... Lover of bit-tech

    Joined:
    9 Jan 2012
    Posts:
    1,724
    Likes Received:
    151
    To be honest, at only 1080p this was never going to do much. Whilst it's currently very much the mainstream resolution, it's certainly not the target for buyers at this price point. This is especially the case given how strong current cards are at 1080p; it's fast becoming an old standard at the high end of the market.

    However, this would be so useful for 4K and other very high resolution setups. There, even top-end cards can be crippled by newer games, especially in multi-GPU setups where the frame rate can be very variable. Adding another £100 to a 4K display is neither here nor there if the experience improves as much as described (given that 30-60fps will be far more common, and that random dips can occur with SLI/CrossFire).

    Adding £100 to the monitor is much cheaper than adding a third 780/Titan/780 Ti or 290/290X, plus it'll hopefully remain relevant for quite a while. Not to mention I'd imagine the technology will become much cheaper with greater adoption down the line.
     
  9. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    The hard part about G-Sync isn't the protocol (it fiddles with the VBLANK interval, whoopee), but creating the display panel controller hardware. As long as AMD aren't totally inflexible with their display driver design (and by this I mean 'the chips that drive the physical display outputs', not a software driver), they shouldn't have too much trouble implementing G-Sync in their next cards. Creating monitors that can actually do anything with the signal? That's the hard part, and where Nvidia have put their research.

    Once the controller is mated with one of those bare-bones 2560x1440 IPS panel monitors, I'll be snapping one up. My main beef with it is that the current iteration is designed solely to drive LVDS panels, which rules out its use in the first consumer iteration of the Oculus Rift, which will have to use a MIPI DSI panel because nobody makes small high-DPI LVDS panels.
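
    To expand on the VBLANK bit, the controller-side idea is conceptually something like this (a toy sketch in C; invented names, and certainly not Nvidia's actual logic):

        #include <stdbool.h>

        static void scan_out_active_lines(void)   { /* clock out the 1080 visible lines */ }
        static void emit_blank_lines(int n)       { (void)n; /* fixed-length VBLANK */ }
        static bool wait_frame_or_timeout(int ms) { (void)ms; return true; }

        /* Fixed 60Hz controller: active lines, then a fixed VBLANK, forever. */
        void fixed_refresh(void) {
            for (;;) {
                scan_out_active_lines();
                emit_blank_lines(45);   /* ~45 blank lines per frame at 1080p60 */
            }
        }

        /* Variable refresh: hold the panel in VBLANK until the GPU delivers
           the next frame, with a timeout (a 30Hz floor here) so the previous
           frame gets re-scanned if the game stalls. */
        void variable_refresh(void) {
            for (;;) {
                scan_out_active_lines();
                wait_frame_or_timeout(33);  /* stretch VBLANK for up to ~33ms */
            }
        }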
     
  10. ch424

    ch424 Design Warrior

    Joined:
    26 May 2004
    Posts:
    3,112
    Likes Received:
    41
    I'd have thought it's more interesting to compare G-Sync to a 144Hz standard monitor with V-sync enabled, no? In both cases you have to spend a lot of money, so it would be interesting to see if G-Sync makes gameplay significantly better than a regular 144Hz monitor. Did you guys play with that at all?
     
  11. Meanmotion

    Meanmotion bleh Moderator

    Joined:
    16 Nov 2003
    Posts:
    1,652
    Likes Received:
    19
    We've admittedly done limited running at 144Hz and will be adding a few more thoughts on that scenario shortly. But we chose 60Hz as the main point of comparison because the vast majority of monitors run at that frequency, and particularly because all IPS monitors do. This simply gives a better overall impression of what the technology will do for most users. Spending most of our time looking at the niche of 144Hz monitors would give a very skewed sense of the value of the technology for most people.
     
  12. ch424

    ch424 Design Warrior

    Joined:
    26 May 2004
    Posts:
    3,112
    Likes Received:
    41
    Sorry, I didn't quite make my point clear - if "most people" have a standard 60Hz screen now, should they pay the little extra to upgrade to 144Hz, or should they pay even more extra to get G-Sync? Is it worth the difference?
     
  13. Meanmotion

    Meanmotion bleh Moderator

    Joined:
    16 Nov 2003
    Posts:
    1,652
    Likes Received:
    19
    Ah, I see what you're getting at. The logic there was that 144Hz (or other high frame rate) monitors have been around for some time, so although it's not something bit-tech has covered extensively, the arguments for and against have long since been made. What are those arguments? Well, TN gives you fast framerates, which are beneficial in competitive gaming, but poor image quality compared to IPS panels. We're generally of the opinion that going with quality is better overall.
     
  14. GregTheRotter

    GregTheRotter Minimodder

    Joined:
    9 Aug 2008
    Posts:
    4,271
    Likes Received:
    88
    Anyone know where the kits will be sold and for how much?
     
  15. Yslen

    Yslen Lord of the Twenty-Seventh Circle

    Joined:
    3 Mar 2010
    Posts:
    1,966
    Likes Received:
    48
    I think this review covers most of the points nicely; expensive monitor, potentially an expensive GPU upgrade to see results, and you lose the benefits of whatever monitor you have now for a similar or lower price, be it 2560x1440, IPS or whatever.

    I'm interested in G-sync, but give it a few years, I think.
     
  16. Bede

    Bede Minimodder

    Joined:
    30 Sep 2007
    Posts:
    1,340
    Likes Received:
    40
    Forgive my ignorance on screen tech, but I find it astonishing that VBLANK is still a thing for non-CRT screens. Anyone know why it still exists?
     
  17. Cthippo

    Cthippo Can't mod my way out of a paper bag

    Joined:
    7 Aug 2005
    Posts:
    6,785
    Likes Received:
    103
    Probably because so many monitors still have VGA ports and once something gets into a standard it's impossible to get rid of it.

    In this case, it turned out to be usable for something else, so win!
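
    For anyone curious just how baked-in it is: every standard timing is still specified as active pixels plus blanking. A quick back-of-the-envelope in C for bog-standard 1080p60 (CEA-861 numbers):

        #include <stdio.h>

        int main(void) {
            /* The 1080p60 signal is 2200x1125 "pixels" per frame, of which
               only 1920x1080 are visible; the rest is blanking. */
            long h_total = 2200, v_total = 1125, refresh = 60;
            printf("pixel clock = %ld Hz\n", h_total * v_total * refresh);
            /* 148500000 Hz, i.e. the familiar 148.5MHz */
            printf("vertical blanking = %ld lines per frame\n", v_total - 1080); /* 45 */
            return 0;
        }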
     
  18. iggy

    iggy Minimodder

    Joined:
    24 Jun 2002
    Posts:
    1,029
    Likes Received:
    12
    Not interested in anything less than 120Hz. Just figured out how to get the projector running at full whack; now it's just like the old CRT days: nice lag-free, non-jittery 60fps+ gaming. Let's consign the 60Hz nonsense to the consoles where it belongs, eh?
     
  19. alialias

    alialias What's a Dremel?

    Joined:
    1 Dec 2010
    Posts:
    59
    Likes Received:
    2
    Seems like a logical move to me; the vast majority of things in computers are synchronised by clocks, so why not the monitor too?
     
  20. ZeDestructor

    ZeDestructor Minimodder

    Joined:
    24 Feb 2010
    Posts:
    226
    Likes Received:
    4
    This is actually the opposite: Nvidia is varying the refresh rate (clock) of the monitor panel!
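
    In other words, the direction of the waiting is flipped. Toy sketch (invented names):

        static void render(void)                { /* draw the frame */ }
        static void wait_for_monitor_tick(void) { /* the fixed 60Hz heartbeat */ }
        static void flip(void)                  { /* hand the frame to the display */ }

        /* V-sync: the GPU waits on the monitor's clock. */
        void vsync_present(void) { render(); wait_for_monitor_tick(); flip(); }

        /* G-Sync: the monitor waits on the GPU; flip() itself ends VBLANK. */
        void gsync_present(void) { render(); flip(); }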
     