
News: Nvidia announces GeForce RTX 2060

Discussion in 'Article Discussion' started by bit-tech, 7 Jan 2019.

  1. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    You don't really need specs as....
    In other words, the 'specs' are whatever is read from the monitor; if Nvidia cards are not working correctly with FreeSync/VRR monitors, it's because they are either not reading the EDID correctly or setting the range incorrectly in their drivers.
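    For anyone wondering what 'reading the range from the monitor' actually involves, here is a minimal sketch (an illustration only, not Nvidia's or AMD's driver code) of pulling the advertised vertical refresh limits out of a raw EDID base block, assuming the monitor exposes a standard Display Range Limits descriptor (tag 0xFD). The sysfs path is a typical Linux location and is only an assumption; substitute your own EDID dump.

    # Minimal sketch: read the vertical refresh range a monitor advertises in its
    # EDID "Display Range Limits" descriptor (tag 0xFD). Illustration only, not
    # driver code; ignores the offset flags that can extend the limits above 255Hz.
    def vertical_range_from_edid(edid: bytes):
        """Return (min_hz, max_hz) from the 128-byte base block, or None if absent."""
        if len(edid) < 128:
            raise ValueError("expected at least the 128-byte EDID base block")
        for offset in (54, 72, 90, 108):        # the four 18-byte descriptor slots
            d = edid[offset:offset + 18]
            # Display descriptors (as opposed to detailed timings) start 00 00 00;
            # tag byte 0xFD marks the Display Range Limits descriptor.
            if d[0:3] == b"\x00\x00\x00" and d[3] == 0xFD:
                return d[5], d[6]               # min/max vertical rate in Hz
        return None                             # monitor reports no range limits

    if __name__ == "__main__":
        # Hypothetical path; on Linux the kernel exposes the raw EDID per connector.
        with open("/sys/class/drm/card0-DP-1/edid", "rb") as f:
            print("Reported vertical refresh range:", vertical_range_from_edid(f.read()))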
     
    Last edited: 11 Jan 2019
  2. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    The EDID tells you what the monitor reports it can support. That has nothing to do with whether there are minimum performance requirements a monitor must meet.

    More specifically:
    You keep asserting this with no evidence.
     
  3. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    You said yourself there are no performance requirements for FreeSync; that means what the monitor reports it can support is the performance requirement. How is that so hard to understand?
    Are you being deliberately obtuse? Nvidia have said only 12 out of 400 worked; they even had a stand at CES showing how their cards don't work with some monitors.

    It's simple: if a piece of hardware reports it supports X, Y, and Z, then that is the performance requirement of that hardware. If you try running something on it that it doesn't list in its specs, it's probably not going to work.

    If a graphics card reports it only supports DX9 and you try running a DX12 program on it, it's not going to work, because you ignored the reported requirements. If your monitor or TV reports that it only supports 30Hz and you try sending it a 144Hz signal, it's not going to work.

    If a monitor reports its minimum performance requirement is 48Hz and you try sending it a 24Hz signal, it's probably going to cause problems: the exact problems Nvidia themselves are telling you about.

    The evidence is right in front of your face.
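    As a concrete aside on that 48Hz/24Hz case: the way drivers normally handle content below the monitor's reported minimum is low framerate compensation, i.e. repeating each frame so the refresh rate sent to the panel stays inside the advertised window, which only really works when the reported maximum is at least roughly double the minimum. A rough sketch of the idea (my own illustration, not anyone's actual driver code) follows:

    # Rough sketch of low framerate compensation (LFC): when the content frame rate
    # drops below the monitor's reported VRR minimum, show each frame more than once
    # so the effective refresh rate stays inside the advertised range.
    def refresh_plan(content_fps: float, vrr_min: float, vrr_max: float):
        """Return (repeats_per_frame, effective_refresh_hz), or None if no multiple fits the range."""
        if content_fps >= vrr_min:
            return 1, min(content_fps, vrr_max)       # already inside the window
        repeats = 2
        while content_fps * repeats < vrr_min:        # repeat frames until we're back above the minimum
            repeats += 1
        effective = content_fps * repeats
        return (repeats, effective) if effective <= vrr_max else None

    # In practice drivers only enable LFC when vrr_max >= 2 * vrr_min, because that
    # guarantees every rate below the minimum has some multiple inside the range.
    print(refresh_plan(24, 48, 75))   # (2, 48): a 48-75Hz panel shows each 24fps frame twice
    print(refresh_plan(47, 48, 60))   # None: 47fps doubled is 94Hz, outside a 48-60Hz window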
     
  4. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    G-sync: there are minimum requirements for refresh rate range, pixel response time, colourspace, etc. that a monitor needs to conform to for it to get the G-sync label.
    Freesync: there are zero requirements. Wrong colourspace, ghosting, and a 10Hz VRR range? No problem, still gets the label.

    Not complicated.
    No, they have not. They have certified 12 that meet the minimum specification (which is slightly relaxed from G-sync's specs, e.g. support for refresh rates down to 0 Hz) and have VRR enabled by default; everything else works but must be enabled manually.
    What they have said is that those are not guaranteed to work well, but that's because those monitors just don't work well, period. A monitor with a 48-60Hz VRR range is not going to gain more range by being on an AMD card vs. an Nvidia card; it's just pants right out of the box.
     
  5. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    For you it seems it is. We're not talking about the merits or demerits of either system; we're talking about why Nvidia are having problems with FreeSync monitors when AMD are not.

    Using your example of "Wrong colourspace, ghosting, and a 10Hz VRR range": if that's what the monitor is presenting as its specs, then that's what the graphics card should output. How is that hard to understand?
    Right, so when Nvidia said "Support, however, will be limited to those monitors which have passed Nvidia's internal testing - a mere 12 out of 400 tested so far, the company claims - while power users will be able to force Adaptive Sync on non-certified displays." they were lying then?

    There is no minimum specification, you said so yourself. It doesn't matter if Nvidia have arbitrarily decided that they consider X, Y, or Z should equal A, B, or C, because FreeSync doesn't specify that it should.

    I mean, seriously, how are you not understanding that if something (as you say yourself) doesn't specify what the minimums or maximums should be, then there are no minimums or maximums on what it should be able to support?

    If a monitor only has a 48-60Hz VRR range, that doesn't mean you can decide that's not good enough and feed it 0-120Hz, because, guess what, it won't work.

    What Nvidia are doing is the equivalent of saying we should fit 21″ GeForce tires to all wheels because they think 21″ GeForce tires are the best, and then claiming that because their GeForce tires don't fit on 17″ MINI wheels, 17″ MINI wheels don't work.
     
    Last edited: 12 Jan 2019
  6. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    No, I'm talking precisely about the differences between G-sync as a standard (which has minimum requirements for entry) and Freesync (which does not). This has been known since the genesis of both standards.
    If a monitor states "I support the sRGB colourspace" and does not, there is no magic a GPU can perform to make that so. Not Nvidia, not AMD, not the ghost of 3DFX can make a panel output colours it is not capable of reproducing. Likewise, if a monitor only supports a 48-60Hz VRR interval, there is nothing any GPU can do to expand that range.
    That's Bit-tech editorial, not a statement from Nvidia.

    Once again: all Freesync displays will work (i.e. will output an image the monitor is capable of displaying); not all will be supported as meeting the standards Nvidia have set for VRR to be enabled by default. The claim that non-certified displays will 'not work' is one you have manufactured out of whole cloth.
    Nonsense! Anyone can decide it's not good enough. Nvidia decided it's not good enough for the G-sync standard, AMD decided it's not good enough for Freesync 2 (requiring monitors to have a 2x min-max VRR range, e.g. a 60Hz max monitor must support at least 30Hz min), and consumers can decide it is not good enough and not buy that particular monitor.
    Nobody is blocking anyone from making, selling or using such a monitor, it's just not going to have VRR enabled by default on an Nvidia GPU. That's all.
    No, to use a car analogy: Nvidia are saying "We think anything less than a 21" wheel is not worth bothering with because it won't work well. You can fit a 21" wheel right on, but if you want to fit a smaller wheel you can do so after flipping the switch under the label that says 'this is not going to be great'", at the same time as car reviewers, car buyers, and rival manufacturers are also saying "fitting less than a 21" wheel is not going to be worthwhile".
     
  7. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    Then you're the only one, as that's not what I said and it's not what Nvidia said.
    Then it's the fault of the monitor for saying it supports something it doesn't, exactly as I've been saying. It's not a fault with the sRGB colourspace, and it's not a fault with whatever GPU you connect to it; it's the fault of the monitor for reporting something it doesn't support.
    Then take it up with Bit-tech editorial.
    If Nvidia are setting their own standards then that's their fault; it's not the fault of whoever makes the monitor, and it's not the fault of whoever thought of the idea of FreeSync or VRR. It's the fault of Nvidia for imposing an arbitrary standard. If someone on a forum said I could run my CPU with a vCore of 0.5V, it wouldn't be the fault of that person, Intel, my motherboard manufacturer, or anyone other than myself if it didn't work properly, would it?
    Sure, you can decide that, but you can't say you support something that doesn't have any minimum or maximum requirements and then impose your own minimum or maximum requirements, because if you do that you're not supporting it. You've altered it, like AMD did, and you need to call it something else so as to differentiate it from the standard.

    Just to be clear: is G-Sync better than FreeSync, and FreeSync better than VRR? IMO, yes.

    Are there dodgy displays out there that don't deserve to claim that they support VRR/FreeSync? Again, yes.

    However, this isn't about what's better or worse. This is about a company claiming they're going to support a standard that doesn't have any requirements and then imposing their own requirements. That's not supporting a standard that has no requirements; that's making up your own standard, and then, worst of all, using your made-up standard to besmirch the lower-requirement standard you based your made-up one on.
     
    Last edited: 14 Jan 2019
  8. bawjaws

    bawjaws Multimodder

    Joined:
    5 Dec 2010
    Posts:
    4,361
    Likes Received:
    965
    :D no, no, no :D You cited that article as evidence of what Nvidia said, so you can't just tell the person arguing against you to take it up with B-T!

    Anyway, don't want to interrupt your massive multi-quote argument, so crack on.
     
    Sentinel-R1 likes this.
  9. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,381
    Likes Received:
    7,215
    I mean, speaking as the Bit-Techer who wrote that (which is a news piece, not an editorial, but that's by the by), I'm literally just paraphrasing what Nvidia said:
    Nvidia has also confirmed that forcing Adaptive Sync may not work:
     
    Last edited: 14 Jan 2019
    Corky42 likes this.
  10. bawjaws

    bawjaws Multimodder

    Joined:
    5 Dec 2010
    Posts:
    4,361
    Likes Received:
    965
    Corky42 likes this.
  11. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    The DVI standard does not define minimum or maximum resolutions that a GPU must support (beyond a 22.5MHz minimum pixel clock)*. However, there are plenty of resolutions and timings that a GPU will baulk at outputting even if they remain within the pixel clock capability (e.g. high-res slow-scan TV, or an entire website dedicated to getting a specific non-interlaced resolution, or a 1x1 pixel display at 125kHz, etc). Does that mean AMD and Nvidia are "imposing their own requirements, that's not supporting a standard that doesn't have requirements, that's making up your own standard"? Hell, there was enough trouble with needing to manually force Reduced Blanking timings when the first wave of 1920x1200 LCDs came in with single-link DVI inputs!
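    As a quick back-of-envelope illustration of that last point (my own numbers, using the standard CVT and CVT-RB totals for this mode, not anything from the post above): 1920x1200@60 with conventional CVT blanking needs a pixel clock well beyond single-link DVI's 165MHz ceiling, while Reduced Blanking squeezes it underneath, which is exactly why those early 1920x1200 panels needed RB timings forced.

    # Why 1920x1200@60 needed Reduced Blanking on single-link DVI (165MHz pixel clock cap).
    # Totals are the standard CVT / CVT-RB figures for this mode; the clocks come out a
    # touch high because a flat 60Hz is used instead of the spec's 59.95Hz.
    SINGLE_LINK_DVI_MAX_MHZ = 165.0

    def pixel_clock_mhz(h_total: int, v_total: int, refresh_hz: float) -> float:
        return h_total * v_total * refresh_hz / 1e6

    modes = {
        "1920x1200@60 CVT (normal blanking)": (2592, 1245),
        "1920x1200@60 CVT-RB (reduced blanking)": (2080, 1235),
    }

    for name, (h_total, v_total) in modes.items():
        clk = pixel_clock_mhz(h_total, v_total, 60.0)
        verdict = "fits within" if clk <= SINGLE_LINK_DVI_MAX_MHZ else "exceeds"
        print(f"{name}: ~{clk:.1f} MHz, {verdict} the 165 MHz single-link limit")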

    When a standard leaves factors undefined, each implementer will choose reasonable ranges to avoid nonsensically overbuilt implementations, and each implementer will have a different idea of what is reasonable. This is why standards like HDMI have much more rigid definitions of display timings (and why everyone violates the HDMI standard when they use it for desktop monitors rather than AV interconnects).
    Now I know calendars are complicated, but 2013* did indeed occur before 2015.
    * a nice blast from the past in the opening paragraph there too.
    So "we haven't tried them all, so we can't guarantee it'll work" then. That's a far cry from "if it's not certified it's broken".
     
  12. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    Seriously, you're bringing DVI into this now?

    The DVI standard includes pins for DDC, which reads the blooming monitor's capabilities. You're literally trying to defend something with something that proves the point I'm making: GPUs read the capabilities of the monitor, and if they don't display properly despite reading the DDC, it's the fault of the GPU and not the 'standard', exactly as I've been saying.
    When a standard leaves factors undefined, they're defined by what you're using; how is that so hard to understand? The x86 standard doesn't include any mention of processing speed. Does that mean the 8086 is not an x86 CPU because it's really slow? No, it doesn't, because, guess what, x86 doesn't specify the processing speed, so any speed is supported.

    Why are you finding it so hard to understand, no matter what your personal thoughts on it are, that if something is not defined then anything goes, and that if you're going to claim to support a standard that leaves something undefined, it's your fault if you can't handle that undefined spec?
    Again with the 'better'. First thing: this isn't about what's better or who came first. It's about claiming to support a standard, in this case FreeSync/VRR, and then, because you either can't or won't support it fully, using that fact as a stick to beat the standard you're claiming to support over the head.

    This isn't about G-Sync; it's about Nvidia's claim to be supporting FreeSync/VRR and then claiming there are problems with FreeSync/VRR when the problem is that they either can't or won't support it fully.

    Ask yourself this: do you think AMD cards can support every VRR monitor on the market? If the answer is no, then ask yourself why they came up with FreeSync. If the answer is that they wanted to define what range of VRR their cards could work with, then ask yourself why Nvidia have not done the same.
     
    Last edited: 14 Jan 2019
  13. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    Hence why I included the example of 1920x1200 Reduced Blanking over single-link: a situation where EDID alone was not sufficient to produce a suitable output, because that entered an area of the standard that was insufficiently defined.
     
  14. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    So, just so I'm clear: you're saying that because the Digital Display Working Group did not define those things, they're at fault when something they didn't define doesn't work properly?
     
  15. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    I'm saying that when things are insufficiently defined, "just support absolutely anything potentially possible" is not going to be the route any sane developer goes down (at least, not one who works in the same organisation as at least one accountant). It's not even something AMD are immune to: the Asus MG279Q has issues with Fury cards (supposedly fixed in a hardware revision at some point) but not Polaris or Vega, the ROG XG27VQ has Polaris problems (with one guy on the TechReport forums even reporting that it has issues with his 580 but works fine on a 1070), etc.
     
  16. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    So you're saying the DDWG, Vesa and AMD are insane?

    And what would you define as 'insufficiently defined' exactly?
     
  17. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    There you go trying to put words in my mouth again.
     
  18. Sentinel-R1

    Sentinel-R1 Chaircrew

    Joined:
    13 Oct 2010
    Posts:
    2,425
    Likes Received:
    447
    I have so much love for this thread. I'm going to have to buy more popcorn!
     
    edzieba and Corky42 like this.
  19. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    No, I'm asking you a question; that's what the question marks were for. I'm asking whether you're implying that only insane developers go down the route of 'insufficiently defined' when you say "insufficiently defined is not going to be the route any sane developer goes down", and I'm asking you what you consider insufficiently defined.

    If sane developers do not go down the route of insufficiently defined standards or specs, then you're implying it's something only insane developers would do, and since you're making a distinction between sane and insane developers, I thought it would be a good idea to know what's considered insufficiently defined.
     
    Last edited: 17 Jan 2019
  20. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    Sanity has nothing to do with it (hence putting words into my mouth).

    Think of it as a tradeoff triangle: "flexibility of standard", "guarantee of interoperability" and "cost of implementing standard"; you can only pick two.

    If you pick a loosely defined standard for flexibility and low-cost implementation, it is at the cost of guaranteed interoperability: e.g. DVI.
    If you pick guaranteed interoperability and low-cost implementation, at the cost of flexibility, you get HDMI: guaranteed support of a specific subset of timings, resolutions, colour levels, etc.*
    If you pick a loosely defined standard and guaranteed interoperability, at the cost of a high implementation cost, you get G-sync. You use a monitor-agnostic output (no monitor-specific pixel-level fiddling for overdrive, no driver-side frame duplication, all colourspace and FALD driving done monitor-side, etc.), but the panel controller to do so is a rather expensive FPGA.

    *Incidentally, most of the problems with HDMI as a PC monitor interconnect are because it was never intended for this and it ends up being operated out of spec. 'Random' issues with TV levels being set, '1080p' monitors getting stuck with overscan or 'weird' refresh rates, etc., are down to HDMI's intended purpose as an AV interconnect colliding with its abuse for monitors. Friends don't let friends use HDMI on the desktop.
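    The 'TV levels' problem in that footnote is easy to show with numbers. A small sketch of the general limited-range vs full-range RGB mismatch (an illustration only, not a description of any particular driver's behaviour): video-range RGB puts black at 16 and white at 235, so when one end sends full-range 0-255 and the other interprets it as limited range, or vice versa, blacks and whites get crushed or washed out.

    # The limited-range (16-235, "TV levels") vs full-range (0-255) RGB mismatch behind
    # many HDMI desktop complaints. Just the arithmetic of the two conversions.
    def full_to_limited(v: int) -> int:
        """Map full-range 0-255 into video range 16-235."""
        return round(16 + v * (235 - 16) / 255)

    def limited_to_full(v: int) -> int:
        """Expand video-range 16-235 back to 0-255, clamping out-of-range values."""
        return min(255, max(0, round((v - 16) * 255 / (235 - 16))))

    # Mismatch 1: the GPU outputs full range but the display expands it as if it were
    # limited range, so everything at or below 16 collapses to black (crushed shadows).
    print(limited_to_full(8), limited_to_full(16), limited_to_full(24))   # 0 0 9

    # Mismatch 2: the GPU outputs limited range but the display treats it as full range,
    # so 'black' is displayed at code 16 (washed-out grey) and 'white' only at 235.
    print(full_to_limited(0), full_to_limited(255))                       # 16 235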
     