Discussion in 'Article Discussion' started by bit-tech, 2 Jul 2019.
Yep. 1 Video for both articles.
Damn I wish I hadn't bought my 2070 now.
Damn my Gigabyte 980Ti burnt out 3 months out of warranty and Gigabyte told me to do one.
Forced to buy an FE 2060 (about the same performance) and feel like Nvidia have kicked me in the goolies with this hasty "Super" upgrade to 8GB of memory, a 256-bit bus, 64 ROPs, 2176 shaders and faster clocks.
Just proves how over-inflated prices have been in this market segment.
Going by the leaked RX 5700 XT benches, and the announcement that AMD is already cutting prices for their new cards, there sadly won't be any real competition any time soon...
The Intel GPU can't come soon enough
Um, surely the fact that "Super" GPUs even exist, and AMD have already pre-empted them with price cuts, is EXACTLY the effect of competition in the market??
AMD: "We finally have something that can match the mediocre and grossly overpriced 2070 in some games"
Nvidia: "Unleash 2070 Super"
AMD: "Nothing we can do to compete, so let's cut prices"
Basically AMD is now surrendering in the midrange market as well (in the high end market they already surrendered years ago when they stopped making real GPUs and instead started using two rejects per card).
Not exactly surprising either considering the billions AMD lost due to Bulldozer and with all the talent they had to fire to stay in business at all.
I dunno, the 5700 dies are only Polaris-sized, and AIBs have already registered 5800 and 5900 product names, so I suspect Navi has a lot more scalability than people are giving AMD credit for.
I think they realised they had something that was actually able to compete with Nvidia's offerings, so purposely priced them where they did, knowing they could easily take a $50 MSRP cut (because of that cheap/small die) when Nvidia countered.
But whadda I know, eh??
AMD - "let's put out our mid range Navi at a Nvidia like price before we've even launched it"
Nvidia - "quick, someone is stealing our thunder, release moar lolprice Super cards"
AMD - "OK, now let's adjust our price before it's even been released, getting the same price for this mid range card that we were getting for our very expensive to make Vega cards"
IMO AMD will still be quids in. There were leaks naming these cards as a 600 series, with prices £100 less than those AMD have settled on. And Nvidia made sure of that.
Vega 56 was around £340 recently, Navi has to be cheaper to make.
Let's also not forget that these prices are still an utter joke, and not get Intel Syndrome and think they're doing us a favour, like the time Intel released that £185 2c/4t i3.
It's been several years now, I think people just need to accept that the super-cheap 9xx series (and AMD card price drops to match them) were an anomaly rather than the new rule.
2080TI = $999
1080TI = $699
980TI = $649
780TI = $699
680 = $499
580 = $499
(and going back further in history there is a lot more in the $400 - 600 range, not counting outliers like Titans or dual GPU stuff)
So the price hike from one gen to the next has clearly been extraordinary for the current cards and that is before we get to the added problems:
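A quick way to see the point is to run the gen-over-gen percentage change on the MSRPs quoted above (the numbers are only the ones listed in this thread, Titans and dual-GPU cards excluded):

```python
# Gen-over-gen MSRP jumps for the flagship (non-Titan) cards quoted above.
msrps = [
    ("580", 499), ("680", 499), ("780 Ti", 699),
    ("980 Ti", 649), ("1080 Ti", 699), ("2080 Ti", 999),
]

for (prev_name, prev), (name, cur) in zip(msrps, msrps[1:]):
    change = 100 * (cur - prev) / prev
    print(f"{prev_name} -> {name}: ${prev} -> ${cur} ({change:+.0f}%)")
```

The 1080 Ti to 2080 Ti step works out at roughly +43%, by far the largest jump in that list outside the 780 Ti's own bump.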
The premium for custom designs from AIBs has also gone up over the years.
And on top of both of those problems the quid has gone down the drain vs the US Dollar.
ATI 4850 was around £125. ATI 4870 was around £189.
Competition kept prices down, but I won't sit here and make excuses for current prices. They're absurd, given that the new consoles will be as powerful as a £1200 PC and will cost about half that.
It's a rip, plain and simple, and there's no excuse but greed.
If you'd told me before Zen came out that I could buy a 12c/24t CPU for £499 that would absolutely demolish an £1800 (at the time) Intel chip, I'd have laughed at you very loudly.
GPU market is a joke. No lies, no bs, no excuses.
It's a joke and all the while it remains that way I'll buy consoles.
It's taken a lot of years for AMD to disrupt Intel's CPU monopoly; Navi could well be its first step at Nvidia's.
If the rumours of 5800 and 5900 series cards are true, and equally it scales down well to a 5600 and 5500 line-up (RX590 performance at £120 for under 100W?), then they could conceivably beat Nvidia across the entire market spectrum for the first time in YEARS. It won't happen overnight, but with Intel also entering the game soon, we could be in for some relief.
Navi is deffo a step in the right direction.
Once it goes into consoles it should become much cheaper. Otherwise there would be no point making such expensive consoles if they are going to make your PC parts look stupid. Which, tbh, at 4K the X1X does. You can't buy a PC for anywhere near that price that can run 4K gaming in any shape or form.
Only in the same way the current consoles were as powerful as a high-end PC when they came out: not even vaguely close, but the optimisations a closed, static platform allows make consoles punch above their weight.
Take a look at PCWatch's GPU die size chart. Remember that price per unit die area hit its minimum at 28nm and has been increasing ever since. It's no coincidence that this also marked an inflection point in pricing as die sizes continue to grow. 'Free' cost and performance scaling from process shrinks died half a decade ago, and the result has been seen since: more performance needs more die area, and more die area costs more.
Also, don't the PS4P and X1X make more use of upscaling, rather than genuine 4K/UHD rendering?
Dynamic resolution and render scale are the magic words.
With the former the resolution is dynamically adjusted depending on how much it needs to drop to hit a target framerate (if present or not depends entirely on game, it also exists on PC although it is rarely used, for example Prey has it as an option).
With the latter the game is constantly rendered at a lower resolution (exists on PC as well and is pretty common as an option).
But yeah, since both are commonly used in current gen console games (and can't be switched off) resolution as we know it here on planet real numbers doesn't exist.
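The dynamic-resolution idea above can be sketched as a simple feedback controller; the numbers, names and the square-root correction here are illustrative assumptions, not taken from any particular engine:

```python
# Hypothetical sketch of a dynamic-resolution controller: scale the internal
# render resolution each frame so the measured frame time converges on a
# target budget. All constants are illustrative.

TARGET_MS = 16.7            # 60 fps frame budget
NATIVE = (3840, 2160)       # nominal "4K" output
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def next_scale(scale, last_frame_ms):
    # Assume GPU cost is roughly proportional to pixel count (scale^2),
    # so correct the axis scale by the square root of the time ratio.
    corrected = scale * (TARGET_MS / last_frame_ms) ** 0.5
    return max(MIN_SCALE, min(MAX_SCALE, corrected))

# e.g. a heavy frame took 25 ms at full res -> drop the internal resolution
scale = next_scale(1.0, 25.0)
w, h = int(NATIVE[0] * scale), int(NATIVE[1] * scale)
print(f"render at {w}x{h} ({scale:.2f}x), upscale to {NATIVE[0]}x{NATIVE[1]}")
```

A fixed render scale is just the degenerate case: the same internal-resolution-then-upscale path, but with the scale pinned instead of adjusted per frame.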
There's also subsampling rendering techniques that use less than one sample per pixel, which range from the obvious (e.g. Chequerboarding is clearly rendering 1/2 the number of pixels in a consistent homogeneous pattern) to more abstract (e.g. temporal sample sharing, ray subsampling, texture-space shading, etc).
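As a toy illustration of why chequerboarding counts as rendering half the pixels: each frame shades only one half of an alternating chequerboard, and the other half is carried over from the previous frame. Real implementations reproject and blend rather than copying verbatim; this sketch only shows the sampling pattern:

```python
# Toy chequerboard rendering: shade pixels where (x + y) % 2 == parity,
# fill the other half from the previous frame. 'shade' stands in for the
# actual pixel shader; everything here is illustrative.

def chequerboard_frame(width, height, parity, shade, previous=None):
    frame = [[None] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            if (x + y) % 2 == parity:
                frame[y][x] = shade(x, y)      # freshly shaded this frame
            elif previous is not None:
                frame[y][x] = previous[y][x]   # reused from the last frame
    return frame

f0 = chequerboard_frame(4, 4, 0, lambda x, y: 0)
f1 = chequerboard_frame(4, 4, 1, lambda x, y: 1, previous=f0)
# After two frames every pixel holds a value, yet each frame only shaded half.
```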
When you're also mixing in supersampling and multisampling AA techniques (which can range from rendering additional full samples, additional coverage samples, grabbing samples from other frames, local blurring and reconstruction, NN-based reconstruction, etc), and then throw in VR rendering (where there is no such thing as 1:1 pixel mapping and pixel density varies across the view), the buffer size picked by modern game rendering engines as "this number and no other is the resolution!" can be somewhat arbitrary.