Discussion in 'Article Discussion' started by bit-tech, 7 Jan 2019.
Presumably at 1024x768?
Good news on adaptive sync though! I'd be very upset if I'd just bought a G-Sync monitor.
Now, this is interesting: Nvidia has a footnote about the "faster than the 1060" bit, which is talking about non-ray-tracing performance, which cites a resolution of 2,560x1,440 - but it does not have a matching footnote for the "60FPS in Battlefield V with ray tracing" bit. So, yeah, probably.
Founders Edition reviews will go live today but sadly we have yet to receive our sample. Will get it out ASAP.
Please try to verify those FreeSync support claims in the announcement, and if it works, put it left, right and centre — it is the biggest news in PC hw in a long time.
£330 for a mid-range part... And a higher TDP than a GTX 1070.
Not impressed. Not impressed at all. Hopefully the rumour of a "GTX 2060" for £250ish is true.
Dunno...I think that £250 for a card that sits somewhere between the 1070 and 1080 is a little unrealistic. It may be a "mid-range" part on paper, but the performance numbers it's churning out are hella impressive when stacked against the price/performance of previous gen hardware. Also, £250 seems even more fanciful when you consider that the 2070 is pitched north of £450.
I've been contemplating my next GPU upgrade for some time now, and having had a 980Ti before (well, three actually...) I'd rather go for something a little punchier that won't break the bank. Seems that the 2060 fits the bill rather nicely indeed.
Something I don't understand from watching a video about G-Sync vs FreeSync and its claim that 12 out of 400 tested monitors passed muster: if the monitors they tested were all FreeSync/Adaptive Sync certified and only 12 passed, does that mean there's something wrong with the way Nvidia implemented FreeSync?
If those monitors work without blurring and/or flickering when using an AMD card but blur or flicker when using an Nvidia card, that must mean there's something wrong with how Nvidia implemented it, no?
G-sync puts all the onus on the panel controller to take an image and display it properly (no ghosting, no juddering, doubling frames when the inter-frame interval exceeds the threshold, dealing with colourspace for HDR, etc.). Freesync requires the GPU to do all the work. That also means that for Freesync, the GPU-side implementation needs to be tuned to the quirks of that particular panel + panel controller combination or you'll get a crap output. For G-Sync, the same output should be displayed identically (barring panel variance) on any G-Sync monitor, no GPU-side tuning required.
But mostly, it's Youtube Guy making an unfounded assumption: the 12 monitors are just the 12 validated so far, not some proclamation that these are the only 12 models in existence that pass muster. Given the huge tide of Freesync monitors to test (since everyone just flipped the VRR checkbox in newer panel controller firmware regardless of actual capability, because it gives an extra 'feature' to slap on the box), it's going to take a while to actually test everything, even assuming Nvidia are doing so proactively rather than waiting for monitor makers to submit models for testing.
There is no certification for Freesync, only Freesync 2 requires monitors to meet any sort of minimum standard.
Well, AMD says "with AMD proudly pointing to its ever-growing FreeSync ecosystem and a total monitor count that now exceeds 550", so you'd assume that if those 550 monitors work with FreeSync on AMD cards, Nvidia's cards would also work with them — unless they're purposefully not testing those monitors.
If they've already tested 400 of the 550 available, and only 12 have passed muster, then I wouldn't be confident that the list is going to get much bigger...
From the Anandtech review:
So the GTX 1060 had a considerably higher generational performance jump, for a smaller MSRP increase. I guess most of that is AMD's lack of contribution to the market, though...
Nvidia have stated that 'certified' monitors will default to VRR on connection, every other monitor will be a manual settings-flip (i.e. default-off, user choice).
Well if Nvidia are saying that 'certified' monitors will default to VRR then it should work without problems on the 550 FreeSync monitors, should it not? That is unless we're saying VRR monitors are more demanding than FreeSync monitors and AMD cards also have problems with VRR.
Work != work without problems. For many, no amount of magic-wand-waving will solve those problems (if the refresh rate range is 48-60Hz, there's SFA the GPU can do about it). Nvidia established minimum performance requirements for G-Sync, and are unwilling to relax those for Freesync. Any monitors that do not meet those requirements do not get enabled by default, and move into "sure, you can do that if you want..." territory. The 12 certified monitors are those certified so far, but that list will expand.
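A rough illustration of why a 48-60Hz range is such a dead end: low framerate compensation (frame duplication) only works when the panel's maximum refresh is a big enough multiple of its minimum. This is a hypothetical sketch, not any driver's actual code; the 2.5x factor is the commonly quoted figure for AMD's LFC, used here just to show the arithmetic:

```python
def supports_lfc(vrr_min_hz, vrr_max_hz, factor=2.5):
    """Frame duplication needs headroom: a repeated frame must still land
    inside the panel's refresh window. The commonly quoted rule of thumb
    is max refresh >= ~2.5x min refresh."""
    return vrr_max_hz >= factor * vrr_min_hz

print(supports_lfc(48, 60))   # False - no headroom, the GPU can't help
print(supports_lfc(30, 144))  # True - plenty of room to duplicate frames
```

On a 48-60Hz panel, any frame slower than 48Hz simply falls outside the VRR window, which is exactly the "SFA the GPU can do about it" case above.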
It's not a matter of if the list will expand. It's that AMD say there are 550 FreeSync monitors, and Nvidia say they will begin supporting FreeSync, but apparently their cards have problems with monitors one would assume AMD cards do not have problems with.
That is unless they've been really dumb and tested 400+ monitors that don't support FreeSync.
The specs for G-Sync != the specs for FreeSync. That's where the 12 from 400 comes from: 12 of the 400 meet the FreeSync spec and also happen to meet the G-Sync spec.
Not wanting to split this across two threads: it seems not.
In addition, there is no Freesync spec. You could pop out a 720p monitor with a VRR range of 29Hz to 30Hz, using a TN panel and a backlight that doesn't meet sRGB and with a well-off-D65 whitepoint, but it does indeed support some degree of VRR, so it is still a Freesync monitor. If you think that's hyperbole, go look at some of the 'Freesync capable' portable monitors available via Taobao/Alibaba/etc, and compare their purported specs with what people are actually able to achieve with them in practice.
On the other hand, the cost of the G-Sync panel controller is high enough that there's no sense pairing it with anything other than a high-end panel (because using a budget panel will not result in a budget monitor), so the minimum performance requirements for G-Sync are not only existent, but pretty high.
The default whitelist exists for the intersection of the two: when a Freesync monitor happens to meet (or close enough, barring some G-sync controller specific things like pixel modulation timings) the G-sync performance specs, it gets to be enabled automatically. Everything else has VRR off by default (either until that model is tested, or never if it is not up to snuff), with the user able to enable it with the advisory "you can do that, but it won't be as good as one of these 'proper' ones..." because, well, a monitor with a VRR range of 48Hz-60Hz won't be as good as one with a range of 30Hz-144Hz (or 0-144Hz if you count the built-in frame duplicator for long inter-frame intervals).
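The frame-duplicator arithmetic can be sketched like this (a hypothetical helper for illustration, not any controller's actual firmware): when a rendered frame arrives more slowly than the panel's minimum refresh allows, the frame is shown multiple times so that each individual scan-out still lands inside the supported window:

```python
def refresh_for_frame(frame_interval_ms, vrr_min_hz, vrr_max_hz):
    """Pick a scan-out rate inside [vrr_min, vrr_max] for one rendered frame.
    If the frame rate drops below vrr_min, repeat the frame until the
    per-refresh rate climbs back into the panel's range."""
    fps = 1000.0 / frame_interval_ms
    if fps >= vrr_min_hz:
        return min(fps, vrr_max_hz), 1   # frame shown once
    repeats = 1
    while fps * repeats < vrr_min_hz:    # duplicate until back in range
        repeats += 1
    return fps * repeats, repeats

# 25 FPS (40ms frames) on a 30-144Hz panel: each frame shown twice,
# so the panel actually refreshes at 50Hz
print(refresh_for_frame(40.0, 30, 144))  # (50.0, 2)
```

This is why a wide range matters: the 30-144Hz panel can keep duplicating all the way down, while a 48-60Hz panel has no room to fit a repeated frame inside its window.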
Is there anything to stop existing manufacturers from dropping the G-Sync branding and hardware (which they presumably pay for) under the assumption that the same models will now be considered G-Sync compatible by Nvidia anyway?