
News: DisplayPort gets G-Sync-like Adaptive-Sync tech

Discussion in 'Article Discussion' started by Gareth Halfacree, 13 May 2014.

  1. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,129
    Likes Received:
    6,717
  2. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    Forgive the ignorance, but is Adaptive-Sync identical to Nvidia's G-Sync?
    If so, I guess that means both the GPU and monitor need to support DisplayPort 1.2a.
    Reading some more, it seems it will work with AMD FreeSync cards. Do all AMD cards support FreeSync?
     
  3. [USRF]Obiwan

    [USRF]Obiwan What's a Dremel?

    Joined:
    9 Apr 2003
    Posts:
    1,721
    Likes Received:
    5
    To take advantage of the benefits of Project FreeSync, users will require: a monitor compatible with DisplayPort™ Adaptive-Sync, a compatible AMD Radeon™ GPU with a DisplayPort™ connection, and a compatible AMD Catalyst™ graphics driver. AMD plans to release a compatible graphics driver to coincide with the introduction of the first DisplayPort™ Adaptive-Sync monitors.

    The first discrete GPUs compatible with Project FreeSync are the AMD Radeon™ R9 290X, R9 290, R7 260X and R7 260 graphics cards. Project FreeSync is also compatible with AMD APUs codenamed “Kabini,” “Temash,” “Beema,” and “Mullins.” All compatible products must be connected via DisplayPort™ to a display that supports DisplayPort™ Adaptive-Sync.

    AMD has undertaken every necessary effort to enable Project FreeSync in the display ecosystem. Monitor vendors are now integrating the DisplayPort™ Adaptive-Sync specification and productizing compatible displays. AMD is working closely with these vendors to bring products to market, and we expect compatible monitors within 6-12 months.

    Taken from the forbes.com website.
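
    (Purely as an illustrative restatement of those quoted requirements: the function and parameter names below are made up for the sketch and are not an AMD API.)

        # Hypothetical helper, not an AMD API: restates the quoted
        # FreeSync requirements as a simple compatibility check.
        SUPPORTED_GPUS = {"R9 290X", "R9 290", "R7 260X", "R7 260"}
        SUPPORTED_APUS = {"Kabini", "Temash", "Beema", "Mullins"}

        def freesync_ready(gpu, connection, monitor_has_adaptive_sync, driver_supports_freesync):
            compatible_gpu = gpu in SUPPORTED_GPUS or gpu in SUPPORTED_APUS
            return (compatible_gpu
                    and connection == "DisplayPort"
                    and monitor_has_adaptive_sync
                    and driver_supports_freesync)

        print(freesync_ready("R9 290", "DisplayPort", True, True))   # True
        print(freesync_ready("R9 290", "HDMI", True, True))          # False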
     
  4. DbD

    DbD Minimodder

    Joined:
    13 Dec 2007
    Posts:
    519
    Likes Received:
    14
    Remember, this is just to enable the variable frame syncing stuff over DisplayPort; it already works on plenty of laptop displays. If FreeSync is good to go and just requires compatible monitors, then why don't all those laptop users get to use it? It was telling that the only demo we've ever seen of this was a video basically running at a fixed fps (if a non-standard one), not real games.

    The Nvidia solution has an add-in chip with a big memory buffer in the monitor. It may well be that, firstly, enabling this basic variable syncing support doesn't give the sort of control the Nvidia chip gets, since it replaces the controlling chip in the monitor itself, and, secondly, the buffering and processing that the Nvidia chip does can't just be done by any standard GPU - you might need a new range of GPUs with a little sync control chip in them.
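
    (To illustrate the basic idea being discussed - this is only a sketch with invented panel limits, not how Nvidia's or AMD's hardware actually does it - the display holds the current frame by stretching the blanking period until the GPU has the next frame ready, clamped to the panel's supported refresh range.)

        # Illustrative sketch of variable-refresh frame pacing; the panel
        # limits are invented for the example.
        PANEL_MAX_HZ = 144.0
        PANEL_MIN_HZ = 40.0
        MIN_INTERVAL = 1.0 / PANEL_MAX_HZ   # shortest time a frame can be shown
        MAX_INTERVAL = 1.0 / PANEL_MIN_HZ   # longest the panel can hold a frame

        def next_refresh_interval(render_time_s):
            """How long the current frame stays on screen, given how long the
            GPU took to produce the next one. A fixed-refresh panel would
            always use MIN_INTERVAL (and tear or stutter); adaptive sync
            simply tracks the render time within the panel's limits."""
            return min(max(render_time_s, MIN_INTERVAL), MAX_INTERVAL)

        for t in (0.005, 0.020, 0.030):   # 5 ms, 20 ms and 30 ms frames
            print("render %.1f ms -> refresh after %.1f ms"
                  % (t * 1000, next_refresh_interval(t) * 1000))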
     
  5. theshadow2001

    theshadow2001 [DELETE] means [DELETE]

    Joined:
    3 May 2012
    Posts:
    5,284
    Likes Received:
    183
    My question is: will Nvidia adopt this, or will they insist on you using a G-Sync enabled monitor?

    Can this do away with G-Sync entirely, or, as was pointed out above, does G-Sync offer more?
     
  6. ch424

    ch424 Design Warrior

    Joined:
    26 May 2004
    Posts:
    3,112
    Likes Received:
    41
    This is exactly what I thought would happen.

    G-Sync is massively over-engineered. It looks like Nvidia are fully aware of this, because their G-Sync board is based on a (very expensive!) FPGA and they've not moved to a custom ASIC or worked with TI/NXP to produce one for them.

    Building a flexible refresh rate into the DisplayPort spec will hopefully mean we all get it soon enough - on AMD and Intel's integrated graphics too, even if Nvidia snub it!
     
  7. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    It's very unlikely that Nvidia will snub it, seeing as they are one of the members of VESA, who are responsible for the DisplayPort standard, and IIRC Nvidia said that one of the reasons for G-Sync was to push the adoption of a variable refresh rate standard.

    Display vendors need something to kick them into actually making better displays.
    http://techreport.com/news/26451/freesync-approved-adaptive-sync-added-to-displayport-spec
    Also of note in the TechReport article is that they expect it will take 6-12 months before we can get our grubby little mitts on Adaptive-Sync monitors :miffed:
     
    Last edited: 14 May 2014
  8. jb0

    jb0 Minimodder

    Joined:
    8 Apr 2012
    Posts:
    555
    Likes Received:
    93
    Actually, thinking about it, it makes SENSE that nVidia needed a whole bunch of RAM, since they WERE replacing the entire existing control circuit.
    I doubt they really needed as much as they had, but... it was likely easier to get an FPGA with the capabilities they needed and a gig of RAM than one with the capabilities they needed and exactly the right amount of RAM.
    (And why are they still using an FPGA? Haven't they had more than enough time to lay out a masked ASIC for a few of the more common panels by now?)


    But once Adaptive Sync is in the marketplace, you SHOULD see exactly the kind of control nVidia offers with G-Sync, and without sacrificing all the existing functionality of your monitor.
    I don't believe there was ever a good reason to push a proprietary technology to do something there was already a VESA spec for other than maximizing profits, and I'm glad that AMD and VESA are shutting them down on that front, especially with how long GSync has taken to come to market.
     
  9. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    I'm not sure either AMD or Nvidia truly planned to go it alone with their own solutions.

    Reading between the lines, it seems to me that Nvidia used G-Sync to show the companies responsible for developing ASICs how it can, or should, be done on the hardware front, while AMD showed the other board members of VESA how to implement it in a new display standard and made the software freely available for everyone to use.

    Together they have shown the other board members of VESA, and the companies responsible for developing the ASICs, that there is demand for variable frame rates beyond just saving battery life in portable devices.
     
  10. Saivert

    Saivert Minimodder

    Joined:
    26 Mar 2005
    Posts:
    390
    Likes Received:
    1
    G-Sync was basically the proof of concept that the world needed.
     
  11. theshadow2001

    theshadow2001 [DELETE] means [DELETE]

    Joined:
    3 May 2012
    Posts:
    5,284
    Likes Received:
    183
    If you were a business, why would you not want your device in the back of every monitor?

    Nvidia didn't show anyone anything. It's not like they licensed the G-Sync module or the FPGA source code for a third party to build (in an ARM-like fashion).

    Nvidia were the first to create a fuss over variable frame rates, and AMD followed suit by realising they could modify existing tech to do something similar.

    If either company could go it alone, they would. I mean, come on. They're businesses. Their sole purpose is to make money. Any benefits to anyone else are merely coincidental.

    Just like PhysX and Mantle, one side is not interested in the other's technology beyond its competitive threat.

    There's no way this was some sort of team effort to introduce variable frame rate graphics.
     
  12. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    I wouldn't necessarily want my device in the back of every monitor if I could get the companies that already make those devices to adopt a new technology, as I have been asking them to do for the last 5-10 years.

    Nvidia showed that there's demand for variable frame rates beyond just saving battery life. They demonstrated to the current manufacturers of LCD controller boards that they need to actually produce new products, and not just use the same ASICs that they have been using for the last 10-15 years, expecting everyone to make do with what they are given - fixed refresh rates, no true 4K and a reliance on upscaling.

    Both AMD & Nvidia had been working on variable frame rates a long time before we got to hear about it; they had been telling the manufacturers of LCD controller boards that it was something they wanted for years, but the makers of the ASICs wouldn't play ball. Much in the same way that developers had been telling Microsoft for years that they wanted changes in DirectX but were ignored, so AMD developed and released their own API that provided what they had been asking for.

    You are correct that both companies' sole purpose is to make money, but making money is very difficult when other companies you rely on refuse to innovate, instead churning out the same old tech because it costs money to develop new ASICs, or to develop a new DirectX.

    Why would anyone want to buy your latest and greatest graphics card if what it plugs into isn't able to show it off to its full potential? Why would anyone buy your graphics cards if they are being crippled by a bad API?

    EDIT: I found this article from The Tech Report an interesting read on the subject, and if you have the time they also wrote another article from the other side of the fence.
     
    Last edited: 23 Jul 2014
  13. theshadow2001

    theshadow2001 [DELETE] means [DELETE]

    Joined:
    3 May 2012
    Posts:
    5,284
    Likes Received:
    183
    Right, so instead of selling/licensing something useful (and easy to integrate with your own components) to companies who can't, or as you suggest won't, develop it, you would rather they develop something similar themselves? Which would force you to work to their standard rather than the other way around (which is more difficult for you), and you make absolutely nothing from it? Sounds like an excellent business plan :thumb:

    Bottom line: it's going to be less work and more profitable for you to create something for all monitor manufacturers to put in their monitors than it is to have 10 different monitor manufacturers come up with 10 different ways of doing the same thing, all of which you then have to comply with.

    Who's been asking for variable refresh rates for the last 10 years?

    How do you know either side has been asking display manufacturers for variable refresh rates for years? How do you know the monitor companies refused to innovate? Which monitor companies specifically are you talking about?
     
  14. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    Maybe I missed something, but has Nvidia said it's not going to license G-Sync at any time in the future?

    Good job on explaining what G-Sync is: a module that monitor manufacturers put in their monitors and pay Nvidia for the privilege of doing so. :thumb:

    Well, first off, I said 5-10 years, although granted that was a guesstimate on my part.
    I based that guesstimate on the fact that LCDs have been using fixed refresh rates that are a throwback to the days of CRTs while laptops have not, and that it takes time to develop something like a G-Sync module.

    Take my guesstimate or leave it; either way, these things don't happen overnight and probably wouldn't be undertaken unless you had no other choice (i.e. had been asking for a good length of time).

    I don't know for sure, but do you think Nvidia & AMD would invest the money and time in developing something if they didn't have to?

    I'm guessing by your questions that you didn't read the links I provided.
    Monitor companies don't make everything inside the monitor, you know; look towards the companies that make the scaler and ASIC chips.
     
  15. theshadow2001

    theshadow2001 [DELETE] means [DELETE]

    Joined:
    3 May 2012
    Posts:
    5,284
    Likes Received:
    183
    Whoever puts a G-Sync chip in the monitor is going to have to pay Nvidia, either for the chip or some sort of licence fee. There's a reason those monitors are more expensive.

    I was explaining why your idea of getting ASIC or monitor companies to develop a variable refresh rate technology for you, instead of doing it yourself, is not necessarily a good idea. It's less profitable and makes life more complicated for you. Which all goes back to your point that neither company intended to go it alone, when it makes sense to go it alone because it can be more profitable and less complex.

    Grand, I'll leave it.


    No one would invest time and money if they didn't envisage a payback from it.

    I read both of them when they were released; I view The Tech Report daily. Neither article was pertinent to your points. There are now 4K 60Hz panels, so someone somewhere must be innovating.



    No problem, just replace my question with monitor companies and their suppliers.
     
  16. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    Sorry, but wasn't it you that said what kind of business model is that?

    IIRC I didn't say either company intended to go it alone; in fact I think I said "I'm not sure either AMD or Nvidia truly planned to go it alone with their own solutions." :confused:

    Your choice; I was merely putting forward my theory. You are welcome to come up with your own and share it with us.

    But wasn't it you that said "If you were a business, why would you not want your device in the back of every monitor?" Oh yes you did, right here.

    What, you mean the point that I made, that the makers of display scaler and control chips have been rather sluggish in developing ASICs, as I quoted from the article?

    Then maybe you need to e-mail Scott Wasson and ask him which companies he was referring to when he said...

    "The makers of display scaler and control chips still have work to do, and those companies have been rather sluggish in developing ASICs capable of supporting 4K resolutions at 60Hz. (That's why a lot of the early 4K displays use dual tiles, which is way less than optimal.)"
     
  17. theshadow2001

    theshadow2001 [DELETE] means [DELETE]

    Joined:
    3 May 2012
    Posts:
    5,284
    Likes Received:
    183
    You said you wouldn't want your chip in the back of every monitor. Everything I have written is based upon why I think that notion is bad for business. And because having your chip in every machine is good for business, it also makes sense that you would want to go it alone as a company and not be part of some AMD/Nvidia super team to introduce variable refresh rates.

    Yes, that is what you said. The quotation of me above is pointing out the fact that this is what you said. :wallbash:

    Yes I did, because it's a good idea if they can pull it off. They invested that money because they expect to see a payback from it.

    You said they refused to innovate, and in the context of variable refresh rates as well. Nothing we are talking about is relevant to 4K; I don't even know why you are bringing details of 4K development into a discussion about variable frame rates. :confused: Being slow to develop is not refusing to innovate, by the way.
     
  18. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    I believe what I said was "I wouldn't necessarily want my device in the back of every monitor if I could get the companies that already make those devices to adopt a new technology". As in, I wouldn't necessarily want to make them myself; I would want to license the tech to the companies that do make them, once I had shown them there is a market for it, or for them to develop it themselves.

    If it makes sense to go it alone because it can be more profitable and less complex, how do you explain AMD providing FreeSync for free? How do you explain that it would have cost a lot of money to develop both G-Sync and FreeSync? How do you explain that G-Sync is arguably a complex solution?

    But what is better is to either get the companies that make the chips to invest in developing the tech themselves (no risk involved), or to develop it yourself and license it to said companies (moderate risk).
    The highest risk (IMHO) is to not only develop it yourself, but to also make it yourself, when you have little experience and would be competing against companies with years of experience, established contracts, infrastructure, etc., etc.

    So being rather sluggish in developing ASICs for 4K doesn't indicate to you that they would be even slower to develop variable refresh rates? The former has a target audience of billions (every display in the world: monitors, TVs); the latter has a target audience of at most a few million gamers.
     
    Last edited: 26 Jul 2014
  19. theshadow2001

    theshadow2001 [DELETE] means [DELETE]

    Joined:
    3 May 2012
    Posts:
    5,284
    Likes Received:
    183
    The first statement appears to me to say you want someone else to develop everything. The second appears to say that you just don't want to manufacture anything.

    All easily explained. AMD had already developed FreeSync, just not for a gaming application; it was for power saving in laptops. The pay-off for its development was the inclusion of their graphics processors in laptops because of the reduced power consumption it offered. It utilised a component of the embedded DisplayPort (eDP) VESA standard. Nvidia came out with a module that, in their eyes, is the best technical solution to providing variable frame rates, and Nvidia were first out of the gate here with variable frame rates applied to gaming.
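
    (As a rough sketch of the contrast being drawn here, with invented numbers: the power-saving use drops to a lower fixed rate when the image is static, while the gaming use re-times every frame individually within the panel's range.)

        # Invented numbers, purely to contrast the two uses of a variable
        # refresh interval.
        IDLE_HZ = 30.0     # hypothetical reduced rate for a static desktop
        NORMAL_HZ = 60.0   # hypothetical full rate

        def power_saving_refresh(screen_is_static):
            # Coarse switch: static content -> lower fixed rate, saving power.
            return IDLE_HZ if screen_is_static else NORMAL_HZ

        def gaming_refresh(render_time_s, min_hz=40.0, max_hz=144.0):
            # Fine-grained: follow the GPU frame by frame within the panel's range.
            interval = min(max(render_time_s, 1.0 / max_hz), 1.0 / min_hz)
            return 1.0 / interval

        print(power_saving_refresh(True))          # 30.0
        print(round(gaming_refresh(0.013), 1))     # 76.9 (Hz) for a 13 ms frame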

    The quickest way for AMD to react to that was to modify the laptop processors to do the same thing. Given how quickly they had their laptop demo available after the G-Sync unveil, it probably wasn't the most challenging of modifications. They simply asked VESA to add the embedded feature to the next full DisplayPort iteration, which is probably fairly straightforward to do. It's too late for AMD to do anything else; they are too far behind Nvidia to create a similar type of solution, and Nvidia already have large monitor brands on board.

    AMD have taken the less profitable route here, but it was their only choice. They have to react quickly or they will start losing customers, especially on something as important as this.

    I take it you meant to ask how G-Sync is an arguably less complex solution? For any given system it's always easier to build it yourself from the ground up, rather than trying to work with other vendors, agreeing standards, making compromises you may not want to make, etc. Only when you don't have the resources, either technical or physical, to complete an element of a system is it worth involving a third party. Nvidia have the technical and physical resources to complete G-Sync on their own.

    If you have made the full ecosystem from start to finish, in the long run it will make future development easier. You want to do something that won't play well with the current G-Sync version? No problem, just do a firmware update for the G-Sync module and carry on regardless. You want to do the same with a VESA standard? Tough luck; you have to wait for VESA to make a move. Maybe they won't make the move you want. Maybe they will, but it's going to take a year or two for other stuff to be agreed. Maybe they sort of make the move you want, but it requires serious redevelopment or redoing of the work you have completed. It's always better to design, own and control the full ecosystem.

    You discover a bug and it lies with a third party's system component. They check and say it's not a problem on their side, and so begins the "problem tennis match", where the problem is constantly whacked back and forth between two parties and neither one actually gets it solved. If you own everything, it's easier to get the bugs worked out.

    There is risk in getting a third party involved in creating what you want, as highlighted above; it's certainly not "no risk".

    Getting someone else to make something you have designed only removes the manufacturing process. Nvidia and AMD are constantly making electronic devices (granted, it's more than likely contracted out), but there isn't really any benefit in giving the manufacturing out to someone else if you already do it yourself or have trusted partners to do it for you.

    Doing everything yourself from the ground up is only high risk if you don't have the technical and/or physical resources to do it yourself. Nvidia clearly have all of that.
     
  20. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    Sorry for the confusion; I meant both, really.
    To me it seems like a sliding scale of risk vs reward: at one end you have someone else introducing something that is of benefit to you without you having to do anything other than nudge them in the direction you want; at the other end of the scale you have to do all the work because no one else will.

    It's what caused AMD to release Mantle: developers had been asking for an API that allowed them to get closer to PC hardware for years, Microsoft was dragging its feet, so AMD decided to take a risk and release Mantle. Arguably, DX12 may not have seen the light of day, or allowed for reduced draw calls, if it wasn't for Mantle.

    I'm not sure how long either of them had been working on variable frame rates, but eDP (Embedded DisplayPort) has been a VESA standard since 2008, so AMD isn't wholly responsible, as the VESA board is made up of many companies across the industry. That's not to say they didn't propose or develop it; it's just that eDP wasn't introduced just so one company could sell more GPUs.
    As for which is the better technical solution, we will have to wait and see; so far we have only seen FreeSync work on laptops with eDP, and we should get a clearer picture when we get DP 1.2a-capable monitors.

    That is assuming that eDP can be easily modified from what I guess is a simple on-or-off situation to a more linear shift in frame rates. Yes, AMD demonstrated that eDP can do variable frame rates, but (AFAIK) they didn't provide many details of how it was done, what modifications, if any, they had to make to the hardware, or any other details.

    Personally, I think it's less to do with money and more to do with pushing for a new standard; sure, AMD could have licensed FreeSync and made money from it, but it probably wouldn't have received widespread adoption.

    Well, no, I didn't mean to ask how G-Sync is an arguably less complex solution, sorry.
    I mean G-Sync is a complex solution to what should be a simple problem. I mean, come on, an FPGA add-on board that sits between other parts in a display; I think they could only make it more complex if it had upgradable RAM modules ;)

    To me, G-Sync seems more like Nvidia showing the ASIC and scaler manufacturers how it should be done; I doubt there is anything proprietary in the G-Sync module that would prevent them from copying what it does. DP 1.2a monitors should give us that answer.

    Yes, it will make future development easier, but it will also lead to slow or low adoption rates. What is more interesting to AMD & Nvidia is getting widespread adoption of a new standard that benefits their main business; variable frame rates fix so many problems and bring such noticeable advances in graphics technology that they will enable them to make cheaper GPUs and sell more of them.

    And a whole lot more expensive.

    When compared to doing it all yourself, I would say it's no risk, or at least minimal risk out of all your options.

    Unless you want to concentrate on your main business; sure, companies branch out into other fields, but when that takes focus away from your main business it may not be something you want to do.

    Or if, like Microsoft (IMHO), doing so means you have taken your eye off the ball and ended up harming your main business.
     
    Last edited: 28 Jul 2014
