
News Nvidia announces GeForce RTX 2060

Discussion in 'Article Discussion' started by bit-tech, 7 Jan 2019.

  1. bit-tech

    bit-tech Supreme Overlord Lover of bit-tech Administrator

    Joined:
    12 Mar 2001
    Posts:
    3,676
    Likes Received:
    138
     
    LennyRhys likes this.
  2. Omnislip

    Omnislip Minimodder

    Joined:
    31 May 2011
    Posts:
    629
    Likes Received:
    155
    Presumably at 1024x768?

    Good news on adaptive sync though! I'd be very upset if I'd just bought a G-Sync monitor.
     
  3. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,066
    Likes Received:
    6,610
    Now, this is interesting: Nvidia has a footnote on the "faster than the 1060" bit - the non-ray-tracing performance claim - which cites a resolution of 2,560x1,440, but there's no matching footnote for the "60FPS in Battlefield V with ray tracing" bit. So, yeah, probably.
     
  4. Dogbert666

    Dogbert666 *Fewer Lover of bit-tech Administrator

    Joined:
    17 Jan 2010
    Posts:
    1,678
    Likes Received:
    181
    Founders Edition reviews will go live today but sadly we have yet to receive our sample. Will get it out ASAP.
     
  5. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,058
    Likes Received:
    969
    Please try to verify the FreeSync support claims in the announcement, and if it works, put it left, right and centre - it's the biggest news in PC hardware in a long time.
     
  6. Wakka

    Wakka Yo, eat this, ya?

    Joined:
    23 Feb 2017
    Posts:
    2,117
    Likes Received:
    673
    £330 for a mid-range part... And a higher TDP than a GTX 1070.

    Not impressed. Not impressed at all. Hopefully the rumour of a "GTX 2060" for £250ish is true.
     
  7. LennyRhys

    LennyRhys Fan Fan

    Joined:
    16 May 2011
    Posts:
    6,395
    Likes Received:
    883
    Dunno...I think that £250 for a card that sits somewhere between the 1070 and 1080 is a little unrealistic. It may be a "mid-range" part on paper, but the performance numbers it's churning out are hella impressive when stacked against the price/performance of previous gen hardware. Also, £250 seems even more fanciful when you consider that the 2070 is pitched north of £450.

    I've been contemplating my next GPU upgrade for some time now, and having had a 980Ti before (well, three actually...) I'd rather go for something a little punchier that won't break the bank. Seems that the 2060 fits the bill rather nicely indeed.
     
  8. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    Something I don't understand from watching a video about G-Sync vs FreeSync and their claim that 12 out of 400 tested monitors passed muster: if the monitors they tested were all FreeSync/Adaptive Sync certified and only 12 passed, does that mean there's something wrong with the way Nvidia implemented FreeSync?

    If those monitors work without blurring and/or flickering when using an AMD card but do blur or flicker when using an Nvidia card, that must mean there's something wrong with how Nvidia implemented it, no?
     
  9. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    G-Sync puts all the onus on the panel controller to take an image and display it properly (no ghosting, no juddering, doubling frames when the inter-frame interval exceeds a threshold - sketched at the end of this post - dealing with colourspace for HDR, etc.). Freesync requires the GPU to do all the work. That also means that for Freesync the GPU-side implementation needs to be tuned to the quirks of that particular panel + panel controller combination or you'll get a crap output. For G-Sync, an unbiased output should be displayed the same way (barring panel variance) on any G-Sync monitor, no GPU-side tuning required.

    But mostly, it's Youtube Guy making an unfounded assumption: the 12 monitors are just the 12 validated so far, not some proclamation that these are the only 12 models in existence that pass muster. Given the huge tide of Freesync monitors to test (since everyone just flipped the VRR checkbox in newer panel controller firmware regardless of actual capability, because it gives an extra 'feature' to slap on the box) it's going to take a while to actually test everything, even assuming Nvidia are doing so proactively rather than waiting for monitor makers to submit models for testing.

    There is no certification for Freesync; only Freesync 2 requires monitors to meet any sort of minimum standard.
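
    As a rough illustration of the frame-doubling behaviour mentioned above - a conceptual sketch only, with made-up numbers and function names, not any vendor's actual implementation:

        # Conceptual sketch: when the gap between new frames would exceed the
        # panel's longest allowed refresh interval (1 / min_hz), the previous
        # frame is re-sent so the panel never holds an image longer than it can.
        def present(frame_intervals_s, min_hz=30.0, max_hz=144.0):
            longest = 1.0 / min_hz     # panel can't hold a frame longer than this
            shortest = 1.0 / max_hz    # ...or refresh faster than this
            refreshes = []             # (source_frame, is_duplicate) pairs
            for frame, interval in enumerate(frame_intervals_s):
                interval = max(interval, shortest)
                # Bridge long gaps by re-sending the previous frame.
                while frame > 0 and interval > longest:
                    refreshes.append((frame - 1, True))
                    interval -= longest
                refreshes.append((frame, False))
            return refreshes

        # A 10ms frame followed by a 50ms frame on a 30-144Hz panel:
        # the 50ms gap gets one duplicated refresh of frame 0.
        print(present([0.010, 0.050]))  # [(0, False), (0, True), (1, False)]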
     
  10. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    Well, AMD says "with AMD proudly pointing to its ever-growing FreeSync ecosystem and a total monitor count that now exceeds 550", so you'd assume that if those 550 monitors work with FreeSync, Nvidia's cards would also work with them - unless Nvidia are purposefully not testing those monitors.
     
  11. Shirty

    Shirty W*nker! Super Moderator

    Joined:
    18 Apr 1982
    Posts:
    12,930
    Likes Received:
    2,058
    If they've already tested 400 of the 550 available, and only 12 have passed muster, then I wouldn't be confident that the list is going to get much bigger...
     
  12. Wakka

    Wakka Yo, eat this, ya?

    Joined:
    23 Feb 2017
    Posts:
    2,117
    Likes Received:
    673
    From the Anandtech review:

    So the GTX 1060 had a considerably higher generational performance jump, for a smaller MSRP increase. I guess most of that is AMD's lack of contribution to the market, though...
     
  13. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    Nvidia have stated that 'certified' monitors will default to VRR on connection, while every other monitor will need a manual settings flip (i.e. default-off, user choice).
     
  14. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    Well, if Nvidia are saying that 'certified' monitors will default to VRR, then it should work without problems on the 550 FreeSync monitors, should it not? That is, unless we're saying VRR monitors are more demanding than FreeSync monitors and AMD cards also have problems with VRR.
     
  15. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    Work != work without problems. For many, no amount of magic-wand-waving will solve those problems (if the refresh rate range is 48-60Hz, there's SFA the GPU can do about it). Nvidia established minimum performance requirements for G-Sync, and are unwilling to relax those for Freesync. Any monitors that do not meet those requirements do not get enabled by default, and move into "sure, you can do that if you want..." territory. The 12 certified monitors are those certified so far, but that list will expand.
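
    A back-of-the-envelope sketch of why a narrow range like 48-60Hz can't be rescued from the GPU side - purely illustrative, with a made-up function name, not how any driver actually decides this. Frame duplication only helps if showing a frame twice still lands inside the panel's supported range, which roughly requires the maximum refresh to be at least double the minimum:

        # Illustrative check: low-framerate compensation via frame doubling
        # needs max_hz >= 2 * min_hz, otherwise the doubled rate overshoots
        # the panel's ceiling and the GPU has nowhere to go.
        def can_double_frames(min_hz, max_hz):
            return max_hz >= 2 * min_hz

        print(can_double_frames(48, 60))    # False: 2 * 48 = 96Hz, above the 60Hz ceiling
        print(can_double_frames(30, 144))   # True:  2 * 30 = 60Hz, comfortably in range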
     
  16. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    It's not a matter of whether the list will expand; it's that AMD say there are 550 FreeSync monitors, Nvidia say they will begin supporting FreeSync, and yet apparently their cards have problems with monitors one would assume AMD cards do not have problems with.

    That is unless they've been really dumb and tested 400+ monitors that don't support FreeSync.
     
  17. MLyons

    MLyons 70% Dev, 30% Doge. DevDoge. Software Dev @ Corsair Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    3 Mar 2017
    Posts:
    4,174
    Likes Received:
    2,732
    The specs for G-Sync != the specs for FreeSync. That's where the 12 from 400 comes from: 12 of the 400 meet the FreeSync spec and also happen to meet the G-Sync specs.
     
  18. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    Not wanting to split this across two threads: it seems not. :confused:
     
  19. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    In addition, there is no Freesync spec. You could pop out a 720p monitor with a VRR range of 29Hz to 30Hz, using a TN panel and a backlight that doesn't meet sRGB and with a well-off-D65 whitepoint, but it does indeed support some degree of VRR, so it is still a Freesync monitor. If you think that's hyperbole, go look at some of the 'Freesync capable' portable monitors available via Taobao/Alibaba/etc, and compare their purported specs with what people are actually able to achieve with them in practice.
    On the other hand, the cost of the G-Sync panel controller is high enough that there's no sense pairing it with anything other than a high-end panel (because using a budget panel will not result in a budget monitor), so the minimum performance requirements for G-Sync are not only existent, but pretty high.

    The default whitelist exists for the intersection of the two: when a Freesync monitor happens to meet (or get close enough, barring some G-Sync controller specific things like pixel modulation timings) the G-Sync performance specs, it gets to be enabled automatically. Everything else has VRR off by default (either until that model is tested, or never if it is not up to snuff), with the user able to enable it with the advisory "you can do that, but it won't be as good as one of these 'proper' ones..." because, well, a monitor with a VRR range of 48Hz-60Hz won't be as good as one with a range of 30Hz-144Hz (or 0-144Hz if you count the built-in frame duplicator for long inter-frame intervals).
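
    The default-on logic described above could be pictured something like this - a loose sketch under the assumption of a simple whitelist lookup, with made-up monitor names and helper names, not Nvidia's actual driver code:

        # Loose sketch: VRR is enabled automatically only for whitelisted
        # (tested/certified) monitors; everything else stays off unless the
        # user flips the switch themselves.
        CERTIFIED = {"Example Monitor A", "Example Monitor B"}   # hypothetical whitelist

        def vrr_enabled(model, supports_vrr, user_forced=False):
            if not supports_vrr:
                return False
            if model in CERTIFIED:
                return True       # certified: on by default
            return user_forced    # everything else: off unless the user opts in

        print(vrr_enabled("Example Monitor A", True))            # True
        print(vrr_enabled("Some 48-60Hz panel", True))           # False by default
        print(vrr_enabled("Some 48-60Hz panel", True, True))     # True, but won't be as good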
     
  20. wolfticket

    wolfticket Downwind from the bloodhounds

    Joined:
    19 Apr 2008
    Posts:
    3,555
    Likes Received:
    646
    Is there anything to stop existing manufacturers from dropping the G-Sync branding and hardware (which they presumably pay for) under the assumption that the same models will now be considered G-Sync compatible by Nvidia anyway?
     