Yield has nowt to do with density. Well, it does, but not in the way this comment suggests.

1. I design a chip. I make 30,000 of the chip. I get 50% yield. I have 15,000 chips; the rest went into the bin.
2. I make 30,000 more of the chip. I get 100% yield. I have 30,000 chips; nothing went into the bin.

The chips from 1 & 2 are identical in every respect. All that happened is my fab got better at making 'em. What doesn't happen is that I redesign the chip between 1 and 2, because chip design is long and complex and expensive.

Now, I said yield has nowt to do with density, which for the above is true. That said, if you want to get technical - and we always do - then the density of your design can affect yield, but it's not a simplistic trade-off.

Let's say I make a chip with 50 transistors on it. Hey, I never said it was a complicated chip. At low density, I can make 100 of them per wafer. At 50% yield on a new process node, that leaves me with 50 working chips per wafer. At a higher density, I can fit 1,000 of them per wafer. At the same 50% yield, I'm left with 500 working chips per wafer. Hooray! Except that the increased density might mean a defect which would have only affected one chip now affects ten chips. Or that a defect that would have been marginal at low density is a fail. Like I said, it's not that easy.

As for the 5700 XT, the fact it's using less than the maximum density available on the process node is entirely unrelated to yield - it could mean the process has a terrible yield, or it could mean the process has a brilliant yield. There's literally no way to know from those numbers.

To put it another way, you could fit 5,000 footballers on a pitch, but you wouldn't have a very good game of football.
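To put rough numbers on the density/yield interplay above, here's a quick back-of-the-envelope sketch in Python. It uses the simple Poisson yield model (per-die yield = exp(-defect density × die area)); the wafer area, die sizes and defect density are made-up illustrative figures, not real process data. It also shows why "the same 50% yield" at both densities is itself an oversimplification - shrink the die and the per-die yield improves:

```python
import math

def die_yield(defect_density, die_area):
    """Poisson yield model: probability a given die has zero defects."""
    return math.exp(-defect_density * die_area)

WAFER_AREA = 70000.0  # mm^2, roughly a 300mm wafer (illustrative)
D0 = 0.001            # defects per mm^2 (made-up number)

for name, die_area in [("low density", 700.0), ("high density", 70.0)]:
    dies_per_wafer = int(WAFER_AREA / die_area)     # ignores edge loss
    good = dies_per_wafer * die_yield(D0, die_area)
    print(f"{name}: {dies_per_wafer} dies/wafer, ~{good:.0f} good")
```

With these made-up numbers the big die yields about 50% (≈50 good chips from 100 candidates), while the small die yields about 93% (≈932 good from 1,000) - same process, same defect density, very different yield. Which is exactly why yield numbers alone tell you nothing about the design.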
Honestly, the huge variation just goes to show how hard it is to assess the manufacturing of chips made at the same company, especially when metrics like yields and price are strictly top secret. Never mind comparing it between different companies, as a certain someone has been doing for months, backed up by nothing.
It's an architectural change, with one of the biggest changes within the CUDA core since Fermi.

Fermi CUDA core, notice 1 INT and 1 FPU: https://www.anandtech.com/show/2849/3
Turing: https://images.anandtech.com/doci/1...8_Updated090318_1536034900-compressed-010.png
Ampere: https://www.techpowerup.com/review/...ture-board-design-gaming-tech-software/3.html
(Where is AnandTech's Ampere deep dive article?)

Comparing CUDA core counts is only valid within a single architecture. That kind of calculation is only really useful for 2080 Ti owners.
You've still got this a bit confused, I think. Having 22 men on a pitch that can fit 5,000 men is "pretty poor," but it's also how many you need for a game of football. Cramming more transistors into a small area also isn't a guaranteed efficiency win - in fact, quite the opposite. Take two 7nm chips of equal size, one with 500 million transistors and one with 100 million: one of those two is likely to draw a lot less power, and it's not the one with the higher density.
*looks around and says* FFS, instead of you I could have had a world-record system... to play, yeah, jack **** WW2 shooter at 2K (aka 4K) at a whole 20 fps, with some crappy water effects that could have been faked - and rather well - with a mirror 20 years ago.
Exactly, so in some workloads the 3070 only has 2,944 cores, but in other workloads it can have up to 5,888 parallel cores. In gaming, it will likely lean towards the higher number, because gaming is reportedly FPU-heavy. Point is, Ampere CUDA core numbers are far less comparable to Turing than, say, Turing is to Fermi.

At the end of the day, I only rate generational leaps by gains in performance per watt. The 3070 at 220W versus the 2080 Ti at 250W for "similar" performance is rather underwhelming - actually similar gains to the Turing refresh. The 1080 Ti is 250W TDP; the 2070 Super is 215W TDP. The 980 Ti is 250W TDP; the 1070 is 150W TDP while performing 12% faster.
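For reference, the perf-per-watt arithmetic behind those comparisons is simple enough to sketch in Python. The "similar performance" assumptions and the 12% figure come from this post, not from measurements; the TDPs are the published ones:

```python
def perf_per_watt_gain(new_perf, new_tdp, old_perf, old_tdp):
    """Relative perf/W improvement of the new card over the old, as a fraction."""
    return (new_perf / new_tdp) / (old_perf / old_tdp) - 1.0

# 3070 (220W) vs 2080 Ti (250W), assuming "similar" performance
print(f"{perf_per_watt_gain(1.00, 220, 1.0, 250):.0%}")  # ~14%

# 2070 Super (215W) vs 1080 Ti (250W), assuming similar performance
print(f"{perf_per_watt_gain(1.00, 215, 1.0, 250):.0%}")  # ~16%

# 1070 (150W) vs 980 Ti (250W), ~12% faster
print(f"{perf_per_watt_gain(1.12, 150, 1.0, 250):.0%}")  # ~87%
```

Which is why, on these assumptions, the 3070-vs-2080 Ti jump looks more like the Turing refresh than like Pascal.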
What I'm trying to say is that the number of CUDA cores isn't comparable across generations, especially not Ampere versus previous ones. Agree with everything else, especially this:
Oh, he's back! First it was "moan moan moan such a poor generational performance improvement moan moan moan it's a bag of crap just like Fermi moan moan moan" until..... at which point things got EXTREMELY quiet (lol). So now he's back with "moan moan moan performance per watt improvement moan moan moan", but @Anfield has already pointed out that at 1440p, and especially at 4K, the performance per watt of the new cards is quite a jump beyond the 2080 Ti. So, @true_gamer, self-appointed exalted one, please "educate us with the experience you have from previous insane builds, and what you have seen over the years with the way things are marketed, along with sources that have links with internal sources" as to how these improvements in perf:watt are not as great as they should be. Perhaps with all your mighty knowledge and wisdom you could assist Nvidia and Samsung?
Is it though? The 4K numbers show: the 3080 at 93%, an 18% improvement over... the 2080 Ti at 79%, itself a 21% improvement over... the 1080 Ti at 65%.

Of course, the numbers tell a different story comparing the 2080 to the 3080. But one has to remember nVidia broke the mould by releasing the 2080 Ti together with the 2080, whereas with the 3080 - and previously the 1000 and 900 series - the xx80 Ti was released a few months after the initial release. This release is also different because the 3080 is a high-end chip, previously reserved for the xx80 Ti, whereas the 2080, 1080 and 980 are all mid-range chips.
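Those percentage improvements fall straight out of the 4K index numbers. A quick Python check (the 93/79/65 figures are the relative-performance indices from this post):

```python
index = {"3080": 93, "2080 Ti": 79, "1080 Ti": 65}

gain_3080 = index["3080"] / index["2080 Ti"] - 1       # ≈0.177, the 18% above
gain_2080ti = index["2080 Ti"] / index["1080 Ti"] - 1  # ≈0.215, the ~21% above

print(f"3080 over 2080 Ti: {gain_3080:.1%}")
print(f"2080 Ti over 1080 Ti: {gain_2080ti:.1%}")
```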
Thanks for the education, but I think you have educated the wrong person, seeing as I haven't contributed to this thread.
The 3080 is cracking value. Out of interest, has anyone in this thread bought one - or, more importantly, got one and used it, particularly at 4K@120 via HDMI 2.1? I am looking to break the 4K/60fps barrier on my LG TV, which the 1080 Ti does not, so this round of cards is right up my street, but I don't know anyone using them yet. Also, has anyone seen the cards available yet? Feels like there's more chance of seeing Bigfoot.
Well, I suggest you take a step back and decide if this is a hill you really wish to fight over, because at the end of the day it's a GPU that you either buy or don't buy. There have been some informative graphs so that people can form a reasonable opinion, but taking swings at each other doesn't add any value. Personally, I'm interested in seeing where the 3070/3080 vs Big Navi fight lands, as then we'll have a fuller picture. I doubt there will be a clear leader, so then it comes down to fps per £ with a smattering of heat/power usage thrown into the mix.
I think this thread should be closed, you know, until you can actually buy a 3xxx series card. Generational leap in performance - doesn't matter, you can't buy it. Bad power increases - doesn't matter, you can't buy it. Fictional even newer cards - doesn't matter, you can't buy the ones "released" already. So, summing up, may as well forget about any 3xxx hype 'cos it's pretty much a fictional product.
Thread tidied. AGAIN. Play nice, or play elsewhere - and @true_gamer, tone down the ad-homs and sexism, will you? I wouldn't mind a 3080, but my 2080's about five times as much GPU as I really need. More VRAM would be nice, so I can super-resolution bigger images, but that's about it.
If they're as good as they've been touted to be, we'll probably settle back into a neat red-green-red-green... ladder of price and performance.