
News AMD recreates G-Sync, Nvidia plays it down

Discussion in 'Article Discussion' started by Meanmotion, 8 Jan 2014.

  1. Meanmotion

    Meanmotion bleh Moderator

    Joined:
    16 Nov 2003
    Posts:
    1,650
    Likes Received:
    12
  2. Sloth

    Sloth #yolo #swag

    Joined:
    29 Nov 2006
    Posts:
    5,634
    Likes Received:
    208
    #shotsfired

    Kudos to AMD if they can get a non-proprietary desktop solution.
     
  3. SAimNE

    SAimNE New Member

    Joined:
    23 Oct 2012
    Posts:
    122
    Likes Received:
    0
    wow... power at low cost, with the only drawback being a thermal design that generates a lot of heat (nothing a water cooling bracket won't cure for your case); a low-level API being brought into the market to boost performance; and now defacing the competition's main advantage for this generation.

    AMD has had one &$&# of a turnaround in their marketing and design proficiency recently 0_0... not to mention the obviously aggressive tactics... tbh I'm loving this. Hopefully HSA and Mantle take off as well and turn AMD into a comeback wonder-company again.
     
  4. CowBlazed

    CowBlazed New Member

    Joined:
    9 Dec 2005
    Posts:
    254
    Likes Received:
    0
    "Furthermore the spokesperson believed that many of the problems solved by G-Sync could simply be resolved with triple-buffering, pointing out that there used to be an option in AMD's drivers to force this on and that it could easily add this back in."

    It never left their drivers as far as I can tell, though isn't it only supposed to work in OpenGL? The option still exists in my current 13.12 drivers.

    That also glosses over the disadvantages of triple buffering (mainly, a severe hit to frame rate). If it was really so great, wouldn't we all be using it?
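    For context, here's a minimal Python sketch of the textbook behaviour of the two buffering schemes - assuming a fixed per-frame render time and a 60Hz panel, with names and numbers of my own choosing, so treat it as a toy model rather than how any driver actually works:

```python
import math

# Toy model of vsync'd frame delivery: fixed per-frame render time,
# 60 Hz panel. Real pipelines are far messier than this.

REFRESH_MS = 1000 / 60  # ~16.7 ms per refresh

def double_buffered_interval(render_ms):
    # With double buffering + vsync, a finished frame must wait for the
    # next refresh, so the frame interval snaps up to a whole multiple
    # of the refresh period (20 ms of work -> 33.3 ms -> 30 fps).
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

def triple_buffered_interval(render_ms):
    # With a third buffer the GPU never stalls waiting for the swap, so
    # frames are produced at the render rate (capped by the refresh).
    return max(render_ms, REFRESH_MS)

for fn in (double_buffered_interval, triple_buffered_interval):
    print(fn.__name__, round(1000 / fn(20), 1), "fps")
```

    In this simplified model the frame-rate cost belongs to double buffering; the usual complaints about triple buffering are the extra latency and the extra buffer's memory.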
     
  5. ch424

    ch424 Design Warrior

    Joined:
    26 May 2004
    Posts:
    3,112
    Likes Received:
    41
  6. schmidtbag

    schmidtbag New Member

    Joined:
    30 Jul 2010
    Posts:
    1,082
    Likes Received:
    10
    Agreed, but what I don't get is why this couldn't work with current G-Sync monitors. If Nvidia's counter-argument is actually true, then they basically paid to do all the hard work for AMD, and all AMD has to do is a few tweaks here and there to support G-Sync.

    Either way, neither company is at the other's throat in this situation - Nvidia has desktops covered, AMD has laptops covered. Nvidia has put much more of a focus on this, but AMD has had the technology longer. Eventually, both companies should be able to cover the bases they're missing.


    As I said in one of the other G-Sync-related articles - if you've got a 120Hz+ refresh rate and a game that can run at 120+ fps, you're probably not going to see a difference with this sync technology anyway. The irony is that something like G-Sync is great for mid-range GPUs, but if you can't pay for a good GPU, how do they expect you to pay for a compatible monitor?
     
  7. .//TuNdRa

    .//TuNdRa Resident Bulldozer Guru

    Joined:
    12 Feb 2011
    Posts:
    4,042
    Likes Received:
    109
    "Something tells us G-Sync won't be along in the market for long..." Second-to-last line - something seems a bit off there.

    On topic: I didn't think it would be too long before AMD addressed G-Sync, although I would like it if the system became open, or at least if someone reverse-engineered it so G-Sync boards can run with AMD cards.
     
  8. iggy

    iggy Active Member

    Joined:
    24 Jun 2002
    Posts:
    1,022
    Likes Received:
    11
    A proprietary interface that can only be bought with a new monitor is an advantage? Nvidia aren't holding many cards this round, are they?
     
  9. SchizoFrog

    SchizoFrog New Member

    Joined:
    5 May 2009
    Posts:
    1,574
    Likes Received:
    8
    What cards are AMD holding right now?

    High-end CPU performance? Nope.
    High-end GPU performance? Nope.
    Power efficiency (laptop and desktop)? Nope.
    Blah blah blah...

    Don't get me wrong, AMD are holding their own, or at least keeping pace as usual, but they are hardly ahead in the game. People keep calling the consoles a 'major win' for AMD, but it was reported many times that nVidia made very little from their involvement last generation, and profits may take up to 5 years to actually filter through for AMD if... IF... they get any when the dust settles.
    AMD has been asset-stripped for years and there is very little left to sell off if they need further cash investment. AMD's APUs are just about catching up to Intel (with regard to CPU performance) after what, nearly 5 years of standard 10-15% increases in performance? I would not be surprised if Intel's next serious CPU line update leaves AMD far behind once again, and next time AMD may not be able to recover.
    What I don't understand - and please correct me if I am wrong - is that people always seem to colour nVidia in a negative light for developing proprietary software, and then the same people praise AMD when they came out with Mantle... Isn't Mantle proprietary software?
     
  10. loftie

    loftie Well-Known Member

    Joined:
    14 Feb 2009
    Posts:
    2,916
    Likes Received:
    149
    After one too many beers and a suspicious curry last night e can't be dragged away from the bog. Luckily g offered to step in and substitute hoping no-one would notice at least until e was feeling better. Unfortunately no-one told g they were facing the wrong way when the picture was taken.

    At least that's what c said....


    Nope, it's not (but who actually knows until it's here and on other cards). But if you're trying to say they're hypocrites, when AMD announced Mantle it was thought that it'd be AMD-only, IIRC, so your point would still be valid.
     
    Last edited: 9 Jan 2014
  11. isaac12345

    isaac12345 New Member

    Joined:
    20 Jul 2008
    Posts:
    427
    Likes Received:
    3
    HAHA! This is brilliant! Why don't AMD introduce a temporary solution (till they can automate it) where the vblank setting of the monitor can be selected by the user? All they would have to do is look it up in their monitor manufacturer's specification sheet. Obviously I am assuming here that vblank is a fixed value.
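    For a fixed-timing monitor that lookup is plausible, because the blanking interval falls straight out of the published timing numbers. A quick sketch using the standard CTA-861 figures for 1920x1080 @ 60Hz (variable names are mine; for another monitor you'd read the values off its spec sheet or EDID):

```python
# Derive the vertical blanking duration from standard display timings.
# These are the CTA-861 numbers for 1080p60.

pixel_clock_hz = 148_500_000
h_total = 2200   # pixels per line, including horizontal blanking
v_total = 1125   # lines per frame, including vertical blanking
v_active = 1080  # visible lines

line_time_s = h_total / pixel_clock_hz
vblank_s = (v_total - v_active) * line_time_s
refresh_hz = pixel_clock_hz / (h_total * v_total)

print(f"refresh: {refresh_hz:.2f} Hz, vblank: {vblank_s * 1000:.3f} ms")
```

    So for this timing the panel spends roughly 0.67ms of every ~16.7ms frame in vertical blanking - the window a variable-refresh scheme would stretch.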
     
  12. SAimNE

    SAimNE New Member

    Joined:
    23 Oct 2012
    Posts:
    122
    Likes Received:
    0

    High-end CPU: the FX series can actually trade blows with the i5s in gaming this generation, now that games make proper use of multiple cores.
    High-end GPU: in terms of single-GPU cards they have just as much of an advantage as anyone... AMD and Nvidia are the only graphics cards even worth considering, and both always have comparable cards at every level.
    Power efficiency: the APUs offer great power consumption relative to performance for low-end gaming and media PCs.
    And now they are beefing up the software side and pulling out to a clear advantage with Mantle. Nvidia advertises stability over frame rate, and AMD just flat-out took that away here.

    So yeah... in comparison to Nvidia, AMD looks like kings... Intel really still has a decent advantage in a few places, but one can hope that next generation they'll come out with the edge on both sides of the company.
     
  13. Meanmotion

    Meanmotion bleh Moderator

    Joined:
    16 Nov 2003
    Posts:
    1,650
    Likes Received:
    12
    Ta, fixed.
     
  14. DbD

    DbD Member

    Joined:
    13 Dec 2007
    Posts:
    494
    Likes Received:
    10
    So Nvidia introduces a new tech with much fanfare and lots of positive reviews from journalists, then AMD says "ah, actually that's easy - we've been able to do this for two generations on standard hardware". Only they never have, they have no plans to productize this, and they only demoed it with what looked like a video, not a game - and at a pretty fixed frame rate (steady at about 49fps, not truly variable).

    Something a bit fishy there - either it is easy and they could (should!) have done it years ago, or what they have doesn't really work outside of some fixed test environment. TBH it sounds like typical AMD "me too" PowerPoint marketing - Nvidia develops and releases some proprietary tech, AMD then promises a free/open version coming soon, only it never really arrives. See Stream, Bullet physics, AMD's 3D, etc.
     
  15. Harlequin

    Harlequin Well-Known Member

    Joined:
    4 Jun 2004
    Posts:
    7,071
    Likes Received:
    179
    Nothing is 'fishy' - they are using a VESA standard, which Nvidia can't charge for, and charging is exactly what they want to do with G-Sync.

    Nvidia PR must be working overtime to knock the fact that this can be FREE. A bit like Mantle as well...
    Stream is here and now as AMD APP (see, it got renamed...) using OpenCL, and it seems to be doing quite well in compute, don't you think?

    As for Bullet physics - GTA V uses it, or is that AAA title not big enough for you?

    AMD HD3D is also out there - just because you haven't used it or had a need for it doesn't mean it's not available.
     
  16. Meanmotion

    Meanmotion bleh Moderator

    Joined:
    16 Nov 2003
    Posts:
    1,650
    Likes Received:
    12
    To be fair, as Nvidia points out, it currently can't be done for free because desktop monitors don't support it. It's quite often the way of these things that an outsider will need to show the way forward as Nvidia has done.

    It's also the case that although some laptops will already be physically capable of doing free-sync, there isn't a standard for communicating back to the graphics card what the vblank should be. AMD presumably had to hardwire this setting into their hacked demo.

    Ultimately, Nvidia wouldn't be stupid enough to make and try to sell something that could be done in the blink of an eye for free. Sure, it may only take one generation of products before some other solution is found, but between now and then Nvidia has a 1-2 year window (these things take time, especially where standards are involved) in which to sell G-Sync and get the credit for pioneering the technology.
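    Whatever channel ends up carrying that vblank setting back to the card, the arithmetic of a variable-refresh scheme is simple: stretch the blanking period so scan-out starts when the next frame is ready, clamped to what the panel can tolerate. A hypothetical sketch - the 30-144Hz limits are illustrative, not taken from any real product:

```python
# Toy model of adaptive refresh: the display holds the current frame
# (by stretching vblank) until the GPU finishes the next one, bounded
# by the panel's supported refresh range.

MIN_PERIOD_MS = 1000 / 144  # fastest the panel can refresh
MAX_PERIOD_MS = 1000 / 30   # longest it can hold a frame

def scanout_period(render_ms):
    # Time between refreshes when the GPU takes render_ms per frame:
    # track the render rate, but never leave the panel's range.
    return min(max(render_ms, MIN_PERIOD_MS), MAX_PERIOD_MS)

for t in (5, 20, 40):
    print(f"{t} ms render -> refresh every {scanout_period(t):.1f} ms")
```

    Inside the range, every frame is shown the moment it's done; outside it, you fall back to the panel's fastest or slowest refresh, which is why the panel's supported range matters as much as the protocol.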
     
  17. Harlequin

    Harlequin Well-Known Member

    Joined:
    4 Jun 2004
    Posts:
    7,071
    Likes Received:
    179
    It's only Nvidia saying it can't be done with desktop screens - and amazingly, since they have a $175 competing product, they have a vested interest in playing down the VESA standards.


    IF AMD can use 'quad buffering' (which takes a lot of RAM - oh, look what's shipping on the latest cards) to offload the RAM requirement from DisplayPort, this is doable.
     
  18. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,538
    Likes Received:
    363
    If nothing is fishy, then why wasn't this introduced long ago?
     
  19. Harlequin

    Harlequin Well-Known Member

    Joined:
    4 Jun 2004
    Posts:
    7,071
    Likes Received:
    179
    Not so long ago frame rate was king - max frame rate no matter what. Look at the 'fishy' things both teams have done - Quack3 from ATI, and Nvidia not rendering off-rails during benchmarks.

    Now though, since graphics are looking far, far better, smoothness is what's needed. You want the quality but the graphics card can't keep up, so something has to give - frame stuttering, the ambiguous 'lag' complained about by teen twitch-kill FPS players. So how do you solve that without faster and faster (= more expensive) hardware? Slow down the display rate.

    It wasn't 'needed' 2 years ago like it is now - we get movie-quality games that are 'demanded' to be displayed at full HD @ 60fps, and even the likes of Titan et al struggle at times.
     
  20. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,538
    Likes Received:
    363
    But it was needed 2 years ago; it's not just about perceived lag, it's about tearing when VSYNC is disabled.
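    For anyone new to the thread, the tearing problem is easy to picture: with VSYNC off, a buffer swap can land partway through scan-out, so the top of the screen shows the old frame and the bottom shows the new one. A toy illustration, assuming a 60Hz panel with 1080 visible lines (my simplification, ignoring blanking):

```python
# Where a tear appears: the visible line being scanned out at the
# moment an unsynchronised buffer swap happens.

REFRESH_MS = 1000 / 60
V_ACTIVE = 1080  # visible lines per frame

def tear_line(swap_time_ms):
    # Fraction of the way through the current refresh, mapped to a line.
    phase = (swap_time_ms % REFRESH_MS) / REFRESH_MS
    return int(phase * V_ACTIVE)

# Swaps at arbitrary times land at arbitrary lines -> visible tears.
for t in (4.0, 10.0, 12.5):
    print(f"swap at {t} ms -> tear at line {tear_line(t)}")
```

    VSYNC avoids the tear by deferring the swap to vblank, at the cost of the stuttering discussed above; adaptive refresh instead moves vblank to the swap.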
     