I take issue: I am both uneducated and I don't understand. Put that in your pipe and chew on it, grandma.
So it's taken them 10 years to match Fermi's pulling-electric-from-the-plug-socket power? What have they been doing LOL
It's not about the power consumption. TBH? No one really cares about that. They never have and they never will, as long as the performance leap is there. It's the performance leap that isn't there. That is why it is being compared to Fermi: Fermi was also disappointing given their track record before it. That said, I think many would love a bit of a Fermi bump right now; they just haven't gotten one since Pascal.
From what I have seen, no one has really complained about having to buy a new PSU, new case, etc. That said, these are the die-hards I suppose, because everyone else is content to wait. But yeah, I would bet PSU sales are up.
According to the chart that's being referred to, Turing (2080 Ti) is the worst generational leap, followed by Pascal (980 Ti). A chart which I still say is broken, but you all seem to love it, even though it disproves your point, so.....
As with almost every modern GPU, average power draw isn't a good indicator when picking a PSU. Today's GPUs are highly transient (they are way better at managing power) and can peak way higher for very short moments, which can trip OCP/OPP (see Igor's Lab). I've seen people getting a new PSU after initial runs with a 3080 on the old one, but they're few and far between, part of the reason probably being what supply of the 30xx cards looks like right now.
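A rough sketch of the point above, with hypothetical numbers: sizing a PSU against the average draw can look fine while a millisecond-scale transient still trips OCP/OPP. The 2x spike factor is an assumption, not a measured figure.

```python
def psu_ok(psu_watts, avg_system_watts, gpu_avg_watts, spike_factor=1.0):
    """Check PSU headroom against a transient GPU spike, not just the average.

    spike_factor is an assumption: reviews have measured short GPU power
    excursions well above the rated board power, so 2.0 is a plausible
    worst case for illustration only.
    """
    # Peak = rest of the system at its average + GPU at its spike level.
    peak = (avg_system_watts - gpu_avg_watts) + gpu_avg_watts * spike_factor
    return peak <= psu_watts

# A 650 W unit looks fine for a ~500 W average system with a 320 W GPU:
print(psu_ok(650, 500, 320))        # True on averages alone
# ...but a 2x transient on the GPU pushes the momentary peak to 820 W:
print(psu_ok(650, 500, 320, 2.0))   # False: this is what trips OCP/OPP
```

This is why "my old PSU ran it fine for an hour, then shut off in one game" reports show up: the average was within budget, the spikes weren't.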
Should and will are different things. Listen, I have owned MANY ATI/AMD cards at this point, more than Nvidia ones, and there are a few common threads over the decades: ATI/AMD will make a good card, ATI/AMD will trade punches with the competition, and ATI/AMD will end up as the better bang for buck but with a power consumption and heat penalty. Happens every friggin time: you can get an almost-as-good video card for a decent chunk less (a gap that shrinks every gen), but it runs hot as hell. It's been this way for a minimum of a decade! AMD's acquisition of ATI turned the king of gaming into Bulldozer and second-rate GPUs; it was such a resource sink that it's taken over a decade for them to even become an option again... and to date the only reason they're back on the CPU map is that Intel has had its thumb up its bum since 2013. Nvidia has not.
The key for me is how, on paper, the 3080 draws 28% more power for 30-40% more performance compared to the 2080 Ti. This is not the generational leap we had been waiting for. I was going to get the 3070 with its 256-bit bus and 8GB drawing 220W, but it looks to have very similar performance-per-watt to the 2080 Ti with its fatter bus and 11GB drawing 250W. Nvidia said double the performance per watt on 1st September..........
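A quick check of that claim, using the post's own rough figures (+28% power, +30-40% performance) rather than any measured data:

```python
# Perf-per-watt ratio of the 3080 vs the 2080 Ti, per the numbers above.
# +28% power for +30..40% performance works out to roughly flat efficiency,
# nowhere near the "double the performance per watt" headline figure.
power_ratio = 1.28
for perf_ratio in (1.30, 1.40):
    ppw = perf_ratio / power_ratio
    print(f"+{perf_ratio - 1:.0%} perf -> {ppw:.2f}x perf-per-watt")
```

So even at the optimistic end of the range, that's about a 9% efficiency gain gen-on-gen by these figures.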
SLI has slowly been stripped from lower-end cards over the last few gens because they finally caught on to how people would rather buy a couple of cheaper GPUs than the top-end one. That's just good business. GPU power has begun to outstrip the demands and complexity of game engines; Nvidia had to introduce ray tracing to make anything more powerful necessary, so SLI just isn't needed. Plus I bet it ate a massive chunk of driver dev time for a tiny portion of the market. The only thing that should be mourned with the passing of SLI is the cool-looking multi-card rigs.

290X? You mean the card that was power hungry, hot, and deafening?

You can't take one company's TFLOP figure and compare it like-for-like with another's; no one knows what they've done to calculate it or how, just like how TDP isn't measured the same across companies. Particularly when talking about a console, which has fewer overheads and go-betweens from game to chip. I hope it's at least partly accurate, because we need some competition in the GPU market, the distinct lack of which has led to Nvidia charging whatever they want because there's no alternative.
I had 3x 290s overvolted under water with an OC'd CPU; running full load it was pulling 1.1-1.2 kW from the wall and I didn't give it a second thought, electricity is cheap. That was before I understood there were efficiencies to be gained from a better PSU (not that mine was bad, just old, and the tech improved over time). It was about 65% efficient compared to my upgraded 90+%, which dropped power draw somewhat. I also had some 480s I think, might have been 470s, can't remember now.

The current state of multi-GPU is disappointing, as I would pick up 2x 3080 in a heartbeat if it was supported. This is why Nvidia has pulled SLI for this tier of card: it would be better value than a 3090. No one should be overjoyed at its death; it was always a great way to get future performance today. The latest cards are only just doing 4K at OK-ish frame rates, and those of us with multiple or big screens want to push more pixels.
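To put numbers on the efficiency point above, a back-of-envelope sketch using the post's rough figures (~1.15 kW at the wall on a ~65% efficient unit). The components draw the same DC power either way; efficiency only changes what the wall sees.

```python
# Wall draw = DC load / efficiency. Same rig, two PSUs.
dc_load = 1150 * 0.65            # actual component draw implied by the post
wall_old = dc_load / 0.65        # back to ~1150 W on the 65% unit
wall_new = dc_load / 0.90        # same load on a 90% efficient PSU

print(f"DC load:   ~{dc_load:.0f} W")
print(f"Wall draw: ~{wall_old:.0f} W at 65% -> ~{wall_new:.0f} W at 90%")
```

So swapping PSUs alone plausibly shaved ~300 W off the wall reading without the cards drawing a watt less, which matches the "dropped power draw somewhat" observation. (Caveat: 65% is unusually low even for an old unit, so treat these as the poster's recollection, not a datasheet figure.)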
They effectively killed SLI completely, as they won't create any SLI profiles next year (and AMD will kill CrossFire completely soon enough). Both DX12 and Vulkan contain toys for game developers to implement multi-GPU support (without either Nvidia or AMD having to do any work). So really, you should be pointing fingers at the game developers for the state of multi-GPU, not Nvidia or AMD.
Unfortunately, like any feature that only matters to 1% of users, game devs will usually skip it and save the cash. Back in the day AMD (well, ATI) and Nvidia used to court devs to get it put in. Sadly that all stopped.
Strictly speaking, whilst Nvidia has pulled the SLI connector from these cards, the bridge was only there for bandwidth and reliability reasons; bridgeless multi-GPU over PCIe gen4 should be no problem.
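A back-of-envelope calculation supporting that: PCIe 4.0 runs at 16 GT/s per lane with 128b/130b line encoding, so an x16 slot has plenty of inter-GPU bandwidth on its own (the exact headroom versus a dedicated bridge depends on which bridge generation you compare against).

```python
# Usable bandwidth of a PCIe 4.0 x16 link, per direction.
gt_per_s = 16e9        # 16 GT/s per lane (PCIe 4.0 signalling rate)
encoding = 128 / 130   # 128b/130b line encoding overhead
lanes = 16

bytes_per_s = gt_per_s * encoding * lanes / 8
print(f"PCIe 4.0 x16: ~{bytes_per_s / 1e9:.1f} GB/s per direction")
```

That ~31.5 GB/s each way is raw link bandwidth before protocol overhead, but it shows why a bridgeless implementation over the slot is viable in principle; whether anyone ships one is a software question, per the posts above.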
If the game developer did the work then the work would only need to be done once. If Nvidia and AMD do the work then the work needs to be done twice (or three times if the Intel GPU ever becomes real). So clearly getting the game developers to do it should be the easier (because cheaper) way.
Blimey, the tri-SLI 560s were a long time ago. I also had SLI'd 260s, which worked out extremely well for price/performance. I game less nowadays and it'd be pointless for me to upgrade at this point in time, but I'd definitely be interested in two 20GB 3080s in SLI if it were possible, as I think I could still get away with a 1000W PSU, as well as it being cheaper. Though, hopefully, when the 4000 series comes along, I'll be looking to upgrade then. As it is, my SLI'd Titan Xps are still dominating over four years on and don't show any sign of needing replacement soon, as RTX is still not prevalent.
Would you pay for SLI as a subscription-based service to ensure ongoing support though? That's what I see these 'price hikes' as: the SLI surcharge to continue ongoing support (not that they are). And that's the long and short of it: if a company can't afford to put devs on it because those devs are better served elsewhere, then of course they'll kill support for that feature.