Discussion in 'Article Discussion' started by bit-tech, 29 Jan 2019.
'sales of certain high-end GPUs using Nvidia's new Turing architecture were lower than expected.'
No s**t, Sherlock. That's what happens when the price goes up by nearly £500 for the equivalent next-gen model.
It's almost like Nvidia thought the rest of the world was doing as well as they were right up until the RTX launch - kinda glad they've had a shot of reality... Hopefully the hangover clears Jen's head.
I understand that the R&D costs for the RTX series must have been significantly higher than previous generations', but there's a point where you look at the cost of the card and think, "I can buy all the other components of a monstrous system for the price of that card alone. That makes no sense; I'm out."
I was almost at that cash/value balancing point with the 1080Ti when I bought that. Seems a bargain now!
Man, the OG Titan really was the death-knell of Nvidia. Or maybe that was the 8800 Ultra? Dang, the GeForce 256* really killed off Nvidia's future prospects. It was all downhill from the Riva TNT2 Ultra!
*for a bit of fun:
^ Hahaha. Very well done and totally agree; it's not like we haven't seen this before, clearly!
I was more amazed at their own shock that particular high-end SKUs didn't sell so well. Think about all the R&D, rare metals and components that go into, let's saaaaaay... a top-end motherboard, Intel's latest top-end desktop i7/i9, the RAM to stick in it, and a high-end PSU too. You get all of those major components for a similar price to - often less than - a single 2080Ti. It starts to look incredibly bad value, regardless of whether it actually is or not, and I think that's the point consumers have reached.
When you have to contemplate dropping £1100-1400 on a 2080Ti, which last gen was £650-800 (not including mining price fluctuations), it's a hell of a markup to swallow.
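A quick sketch of that generational markup, using the GBP brackets quoted above (which are this thread's estimates, not official pricing):

```python
# Generation-on-generation price uplift, using the ranges quoted above:
# 1080Ti-era top card at £650-800, 2080Ti at £1100-1400 (mining spikes excluded).
old_low, old_high = 650, 800
new_low, new_high = 1100, 1400

uplift_low = (new_low - old_low) / old_low * 100
uplift_high = (new_high - old_high) / old_high * 100
print(f"markup: roughly {uplift_low:.0f}% to {uplift_high:.0f}%")  # roughly 69% to 75%
```

So even at the charitable end of both ranges, the asking price rose by about 70% in one generation.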
To be fair, they are still killing it - 2.5 billion is hardly small change, and a 55% margin? Nice.
Absolutely agree, and yet 500 million isn't an insignificant sum, even to them - so let's see how they respond!
GPU prices are crazy; £600 is the limit I will pay. £1100 to £1400 for a top-end card is mental, in all honesty. This has been coming for a while: the lack of a competitive marketplace is really hurting the GPU sector, and technology in general is hitting its breaking point on price across the industry.
I usually upgrade my mid-range GPU every two generations, but I'm not interested at these prices in the slightest.
Greed is what it is without a doubt.
They took the monopoly pricing a bit too far for three years, then introduced a card architecture that's ahead of its time (with underwhelming performance compared to the previous gen) at an even higher price than people have the appetite for. For us mere mortals it's no big surprise.
That, plus the issue with longevity...
As we all know AMD recently announced 7nm GPUs and both AMD as well as Nvidia GPUs are made at TSMC.
So if we assume that TSMC told AMD that their 7nm process is ready for chips as big and complex as a GPU (and AMD ain't just talking out of their a**) then inevitably TSMC will have told Nvidia the same thing.
In other words:
The inevitable RTX 2180Ti (or whatever they end up naming it) die shrink of the 2080Ti might be pretty close.
Every 7nm chip released thus far has been teeny weeny. The scaling problems faced by Intel, GlobalFoundries, and Samsung are not ones that TSMC is magically immune from. If you're expecting large monolithic GPUs with improved performance and reduced cost to come out of 7nm, you're going to be immensely disappointed for at least the next year. The 2080Ti is Bloody Huge, and is only as 'affordable' (per die area, compared to every past IC of that size) as it is due to years of process refinement achieving very high yields. I would be willing to bet a decent amount that the high end of the Turing+1 generation will either still be on 12nm (or whatever TSMC call their latest minor reshuffle of 16nm), or won't release until ~2020 when the second generation of 7nm is available - if that; I'd hew to the former.
This year, expect at best small 7nm GPUs aimed at mobile or otherwise low-power devices (where performance can stay the same and price can go up a bit in exchange for a drop in power consumption), or maybe some large 7nm dies at eye-watering yeah-I-totally-bought-a-V100-at-launch costs aimed strictly at HPC.
Except that changed with the recent announcement of the 7nm AMD GPUs (unless AMD was telling porkies and it was a paper launch of a far-out product).
Estimates for the die size are 330mm²; for comparison, TU102 is 775mm², and even GP102 is 471mm². It's smaller than the R9 280! And it still manages to be the largest 7nm die yet announced. On top of that, it needs to sit on an interposer with a pair of HBM stacks, which will further drop yields (packaging onto an interposer is not a reversible step: if a defect is introduced, that whole package is kaput unless you can fuse off the affected die).
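A toy yield model shows why die area hurts so disproportionately. This is a back-of-envelope Poisson sketch with a made-up defect density - real numbers for any given node are closely guarded by the foundries:

```python
import math

# Simple Poisson yield model: fraction of defect-free dies ~ exp(-area * D0).
# D0 is an assumed, illustrative defect density, NOT a real TSMC figure.
D0 = 0.002                               # assumed defects per mm^2
WAFER_AREA = math.pi * (300 / 2) ** 2    # 300mm wafer, ignoring edge loss

def good_dies(die_area_mm2):
    """Rough good dies per wafer: gross die count times Poisson yield."""
    candidates = WAFER_AREA / die_area_mm2   # crude gross-die estimate
    return candidates * math.exp(-die_area_mm2 * D0)

# Die sizes as quoted in the post above
for name, area in [("Vega 20", 330), ("GP102", 471), ("TU102", 775)]:
    print(f"{name} ({area}mm^2): ~{good_dies(area):.0f} good dies per wafer")
```

The point isn't the exact numbers - it's that a die twice the area yields far less than half as many good chips, because yield falls off exponentially with area on top of the linear drop in dies per wafer.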
If you thought Vega 10 cards were in short supply at launch, Vega 20 is going to be even worse.
I suspect Nvidia will have a monster compute chip @ 7nm around the end of the year. They can charge so much for those things that the 7nm costs don't matter as much. That said, if you had made all the Turing chips @ 7nm, the die sizes would have been similar to previous generations, so it does look a bit like they originally designed them with 7nm in mind but then switched to 12nm.
As for greed: Nvidia being short of cash is going to make them want to charge more, not less - or at least give us less for our money... some of these comments make little sense. The only way to get them to charge less is competition, which is what AMD needs to start providing.
Lisa confirms Navi launches this year. That's launch-es.
I'd be surprised to see them with HBM.
My expectation for Navi (rather than Radeon VII) is that we will see cards priced similarly to the existing Polaris-based ones, with similar performance or a small uplift - from Vega-ported features like packed math, and possibly from GDDR6, though bandwidth starvation has never been GCN's Achilles' heel - and lower power consumption, except for the top-range cards, which will likely follow AMD's historical behaviour of being clocked as high as possible at the expense of efficiency. This would follow the pattern of mainstream AMD GPU releases set since the introduction of GCN (2048:128:32 continues to be the workhorse GCN layout).
I agree, in that I imagine this year's 'launches' will be maybe a bump for a 580/590 replacement, and maybe a mobile release?
I guess, as with most AMD GPU releases, we will just have to wait and see.