Discussion in 'Article Discussion' started by bit-tech, 10 Sep 2018.
nVidia sells more graphics cards than AMD? It sure does take a highly-paid analyst to figure that one out!
In seriousness, I strongly suspect the sag in market share is due to people shutting down their mining rigs until the next surge. That hits AMD disproportionately, because Radeons get used disproportionately in buttcoin rigs. It isn't a REAL loss of market share, because buttdiggers are the enemy, in every sense. They're bad for the market, bad for PC gaming (because they price graphics cards out of reach, and people just give up and go console), and bad for the environment (with the amount of electricity wasted on buttdigging, I am PRETTY SURE Satoshi is a Captain Planet villain). They just hoard entire truckloads of graphics cards, burn jiggawatts of power to keep them running full-tilt, and don't even USE them for anything. </tangential_rant>
All that said... I really hope AMD gets their act together. Now that the "Bulldozer fiasco" isn't weighing them down, maybe they can allocate the resources they need to make Radeon graphics great again. nVidia likes their market lead too much, and they need to be taken down a notch.
According to the Steam hardware survey the GTX 1060 reigns supreme. AMD actually already has a card with equal performance (the RX 480 and its renamed version, the RX 580); it's just that no one buys those for gaming. AMD being several years behind at the high end doesn't matter all that much, as the market share of high-end cards like the 1080 Ti is pretty small as well.
AMD claim Navi will be out this year, so maybe they can turn things around, although I'm skeptical of the performance gains to be had from 7nm.
Enterprise only, consumer version to follow six months after the UK government understands encryption
Depends how you want to read this statement from AMD...
My emphasis, as IIRC AMD have said in the past that 7nm Instinct cards will retain the Vega architecture.
High end cards encourage low end sales
Nvidia is the known market leader and has performance crown
People know this, so the Nvidia cards all sell well as a result; it's a see-saw effect
Vega was basically late to its own funeral and not price competitive enough. Once mining hit that was the end of that.
AMD can only focus its money on one product area; we just need to realise it's the CPUs they are doing, not GPUs
Leaves Nvidia with free rein. Much fun
I actually run a 580 in my system.
That said.... daaaaamn. Dug the Steam survey up, and Intel HD 4000 shows up before anything from AMD.
Six months after never? That's a harsh launch window.
Back at Computex AMD said that their first 7nm GPU will be Vega 20 (and also claimed it won't be a mere shrink, while confirming it won't come to consumer cards). So the only thing we have to go on for consumer cards is "next milestone" and "Navi", which uhhh isn't exactly much, hence my lack of confidence that AMD will produce a competitive high-end consumer card any time soon (well, that and their track record with Vega).
Also while nvidia has had better luck getting cards released and sold.. it ain't all smooth sailing for them either:
Raytracing in Battlefield V is getting downgraded due to performance issues:
Before mining took off, AMD were down to about 18% of discrete share in 2015, and their cards were more competitive then than now. They were saved by mining, and as that passes I'd expect market share to nosedive.
That just sounds like regular optimisation to me. Things can go the other way too, e.g. they were working with LOD1 models for the raytraced reflections, but found they could switch to LOD0 without any performance impact.
Devs have been working on RTX development using Volta (i.e. Titan V, with no access to RT cores) up until a handful of weeks ago when they got actual Turing hardware, so there's plenty of knobs that need retweaking even on existing implementations, let alone all the knowledge that will be gained in the coming years of dev time on real-time raytracing.
They also weren't doing the ray tracing simultaneously with the compute shaders, which RTX allows, or using the tensor cores to do the de-noising (they used their own compute shaders instead). Then obviously they can optimise how they ray trace: use a cube map after X bounces, only ray trace certain things, ray trace at lower res, etc. It's early days yet; give the devs some time to work with it.
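To make a couple of those knobs concrete, here's a toy sketch in plain Python. No real engine API is used; every name (`trace`, `sample_cube_map`, the reflectance numbers) is made up for illustration. It shows a bounce budget with a cube-map fallback, and tracing reflections at reduced resolution.

```python
# Toy illustration of two ray-tracing cost knobs: a bounce budget that
# falls back to a cheap cube-map lookup, and tracing at reduced resolution.
# All names and values here are hypothetical, not any engine's real API.

MAX_BOUNCES = 2  # trace real reflection rays for at most this many bounces


def sample_cube_map(direction):
    """Stand-in environment lookup: a constant 'sky' value."""
    return 0.5


def trace(depth):
    """Stand-in intersection test: pretend rays keep hitting an 80%-
    reflective surface (returns its reflectance), missing at depth 3."""
    return 0.8 if depth < 3 else None


def shade_reflection(depth=0):
    if depth >= MAX_BOUNCES:
        return sample_cube_map(None)      # bounce budget spent: cheap fallback
    hit = trace(depth)
    if hit is None:
        return sample_cube_map(None)      # ray missed: environment lookup
    return hit * shade_reflection(depth + 1)  # recurse one bounce deeper


def reflection_buffer(width, height, scale=2):
    """Trace one reflection ray per `scale` x `scale` pixel block, then
    upsample by replication (a real renderer would filter)."""
    low_w, low_h = width // scale, height // scale
    low = [[shade_reflection() for _ in range(low_w)] for _ in range(low_h)]
    return [[low[y // scale][x // scale] for x in range(width)]
            for y in range(height)]
```

With the toy numbers above, a primary reflection resolves to 0.8 × 0.8 × 0.5 = 0.32: two real bounces, then the cube map takes over.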
Bingo. There are a lot of interesting presentations from Nvidia on hybrid rendering (both from the most recent GDC, and some SIGGRAPH ones on Volta where, in retrospect, everyone is trying to keep a straight face and toe the 'yes, this real-time game-like demo is totally just meant to be rendered on a DGX-1' line), doing some neat tricks like performing an SSAO pass, then shooting ray-traced AO rays at geometry that SSAO is unable to deal with (e.g. where the occluding face is culled), then blending the two buffers to get the final AO buffer. Or using the Tensor cores for edge detection and then tightly clustering dense rays at the edges for AA. And that's just things they have worked out internally; I'm sure plenty more will be found now that hardware is in the wild (e.g. using a similar mechanism to DLSS to 'filter' textures per-frame in order to increase resolution and add variance for multiple deterministically unique instances).
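The SSAO-plus-ray-traced-AO trick can be sketched in a few lines of Python (hypothetical names, one value per pixel; a real implementation works on GPU buffers): compute cheap screen-space AO everywhere, trace rays only where SSAO is flagged unreliable, and combine the two occlusion terms.

```python
def blend_ao(ssao, rt_ao, reliable):
    """Per-pixel AO blend: keep the SSAO term where it is trustworthy,
    otherwise combine it with the ray-traced term (occlusion factors
    multiply, since 1.0 means fully unoccluded)."""
    return [s if ok else s * r for s, r, ok in zip(ssao, rt_ao, reliable)]


ssao     = [0.9, 0.9, 0.4]      # cheap screen-space AO, computed everywhere
rt_ao    = [1.0, 0.5, 1.0]      # ray-traced AO, only meaningful where needed
reliable = [True, False, True]  # False e.g. where the occluding face was culled

print(blend_ao(ssao, rt_ao, reliable))  # [0.9, 0.45, 0.4]
```

The point is that the expensive rays only fire for the minority of pixels where screen-space information is missing, which is what makes the hybrid approach affordable.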
Visual downgrades being necessary for performance is indeed normal for any game.
However, Nvidia having to use such an early (and overly optimistic) version of the game shows just how hard it was for Nvidia to find even the paltry 10 or 11 games they claimed would get raytracing support.
Or in other words, the feature and performance lead Nvidia has is actually slightly smaller than previously assumed.
Basically all of the new sales (and tons from the used market) for AMD's mid-high end for the past year went into mining rigs, so it's no surprise that they are doing poorly in the steam survey. If crypto currencies continue their slide, we may well see all those cards back on the used market at pretty attractive prices.
They were "saved" in the same way you're saved if you sell a kidney to pay the bills.
The buttdiggers drove a good chunk of people that would have used a Radeon as an actual graphics adapter to buy nVidia (because they could actually find a GeForce) or leave PC gaming entirely. And buttdiggers aren't reliable customers, so it isn't like they entered a market with any long-term prospects.
They lost a lot for a short-term cash boost, unless the flood of used buttdigger cards gets them a boost of actual users that stick around once they aren't getting abused cards dirt-cheap.