Nvidia reportedly want to change the power connector again! https://www.techpowerup.com/319242/...ut-16-pin-pcie-gen-6-power-connector-standard
MLID making **** up? Shocked. Shocked I say. Place your bets on when he quietly deletes the video as he is known to do when his 'leaks' turn out to be bollocks.
Nvidia finally stop making GTX16x0 cards https://www.techpowerup.com/319940/nvidia-geforce-gtx-16-series-finally-discontinued
Wow, I had no idea that GTX16x0 cards, launched back in 2019, were still on the go! I don't feel so bad about my ITX-sized GTX1650 then.
News to me too. I thought they were toast around the 30 series launch and it was all just excess stock selling through.
Do we think a 5800X3D will struggle with, say, a 5080 when it comes out? The 4090 seems to pair nicely with it, so I can't see there being 'that' much of a bottleneck, if there is one at all.
Newer high-end CPUs will still provide an improvement but, at the resolutions and settings you're likely to run that kind of card at, I don't see CPU bottlenecks as that much of an issue. I've long said it'll be NV 5000 series or after before I even consider upgrading my 3090, and I'll still probably keep my 5800X3D.
I reckon that'll end up being the case; there isn't 'that much' between a 5800X3D and a 7800X3D from what I've seen, nor between DDR4 and DDR5 in reality.
Rumour has it the RTX5090 FE will either be a 4-slot, 3-PCB monster based on the 4090 Ti "Cinderblock" prototype, or a much smaller 2-slot design. 4-slot - https://videocardz.com/newz/nvidia-...ture-16-gddr7-memory-modules-in-denser-design 2-slot - https://videocardz.com/newz/nvidia-...ored-to-feature-dual-slot-and-dual-fan-cooler I know which I'd rather see!
Kimi has a really good track record on this stuff. A dual slot flagship card would be awesome. Of course, the AIBs will likely carry on with laughably oversized cooler designs.
That, and they'll probably want to *just* scrape under whatever the bar is for allowing export to China.
So you think the state of AMD's GPU business will prompt Nvidia to pull the 90/80 SKU switcheroo, and then release the actual 90 SKU as a Titan or some other such mythical beast? I really can't see them getting away with that, unless the price point drops like a stone.
A 5080-as-5090 at 1400 currencies so NV can say 'look, you're getting more for less... aren't we amazeballs!' The 'real' 5090 gets kept for 'AI', where they can add a 0 to the end of the price knowing people will still rip their arm off for it... I'd love to be proven wrong, though. EDIT: Chart and 'nvidea' typos not mine, but it demonstrates adequately, I think, where we as consumers/gamers fit into Nvidia's thinking... i.e. not very much...
Frankly, I'm more interested in what Intel are doing with Battlemage. AMD do compete on raster performance, but they still lag in the fancy stuff like RT, their market share is absolutely tiny in comparison, and by their own admission their graphics division is seeing massive drops in revenue.

Nvidia don't give a toss about PC gaming; it hasn't been their core market for a long time, arguably since the days of the various crypto booms. They'll release overpriced, under-specced parts that probably won't deliver much of an improvement over the 4000 series. The cards will sell by the bucketload, while Nvidia continues to make bank in the latest fad market of "AI".

Intel Arc, on the other hand, has had a very promising trajectory over the last year, and they continue to deliver performance improvements through drivers. It was, however, a very rocky start, their GPUs still have extremely high power budgets compared to their performance, and there are still compatibility issues. What I said a long time ago still holds: it is getting better, but it was a bit of a botched start. Like them or loathe them, you need the likes of LTT, GamersNexus, HardwareUnboxed, JayzTwoCents, etc., coming out and saying "yes, buy this if you want a well-performing card at a good price". What they more or less said was "ehhhhhh, it's kinda not great right now but might get better; right now this is really only for early adopters". They need to do better with the Battlemage launch; the market desperately needs competition for AMD and Nvidia.

And if you needed an illustration of just how little Nvidia gives a f--k about PC gaming any more, there it is. Their gaming revenue just about covered the tax on their profits.
The irony is that their B200 data centre chip can consume up to 1.2 kW per GPU. Edit: moved rant to another post.