*pretends to be shocked* nvidia will claim it's contingent on a hardware component of Lovelace... it'll probably leak [or someone will bodge the drivers to demonstrate] that that's not strictly true... nvidia will either front it out or begrudgingly backport it
Gone are the days when the only difference between versions of a GPU was RAM quantity and maybe clock speed. I remember my first GPU purchase, a 256MB 8800GT. Its clocks were slower than the original 512MB model's as standard, but my card was factory overclocked to match the clock speeds of the vanilla 512MB card. Otherwise it was the same GPU. With the BS we've got now, with the 4080 effectively being two different models, it's going to be a pain.

That's before we get to price. I spent around £290 inc. VAT on my GTX 970. UK pricing for the xx70 models is insane, and that's particularly true on the Nvidia side when a 3060 isn't enough for stable performance at 1440p. I'm OK with the 3060 for the time being, but it was only really a stopgap purchase to get something that wasn't the iGPU in my new rig. It's been more than enough to get better frame rates than I expected, so I'll either stick with it for longer than planned... or I'll be taking a serious look at an AMD GPU for the first time since I bought a 5770 in 2010.
Yeah, pretty much: a huge increase in tensor core count (over 4x the previous gen), a big focus on DLSS 3, and their new 'Optical Flow Accelerator', which is likely why DLSS 3 isn't backwards compatible. Zero reference was given for pure rasterisation performance; expect about 1.3-1.5x faster, depending on API, engine, and game. Nvidia know that their strongest suit is the software and technologies that support the card, and that neither AMD nor Intel have a hope in hell of replicating them for 2-3 generations.

Nvidia have been shuffling cards about the stack ever since someone remembered they used to do Ti and Titan cards, going back to the 600 series when the 670 was hastily bumped up to a 680. It's a trick that's been used in the golf club world for about as long: call a 6 iron a 7 iron and act like you've performed miracles to increase the distance of a 7 iron. Calling a 4070 a 4080, especially when there's such a big difference between the two 4080s, is just a blatant money grab. Everyone was wondering how they were going to solve the excess stock problem. Well, now we know.
Well, I suppose at least they haven't rebranded the 3xxx as lower-tier 4xxx parts and charged more for it yet; that was my expectation.
Right? Like, do I buy a £1,000 graphics card or an Xbox Series S and... 68 months of Game Pass Ultimate?
I mean... of course I have 2 PCs: a basement setup with a 6800 etc. and a lounge PC with an RX 6600... Maybe it's my age and lack of enthusiasm, but I play some Rocket League on my little Series S and I'm a happy boy. Game Pass is just great to boot: £20 a month from Smyths, interest free, 48 months of Game Pass. £1000+ for a midrange... GPU?... Jesus. Jensen can go kiss my arse.
Pfft, my 3070 monsters everything I play at the moment. No plans to upgrade this gen (but I might be tempted by Ryzen 7000 if it's really, really good).
Ryzen 7xxx is kinda DOA... due to V-Cache versions of it coming early next year. So might as well sit out the GPU launches too and wait for early next year, then do a proper upgrade.
Interesting. As I'm on the ITX bus, I'd probably look at 7600X performance to replace my 3600X; that one didn't look to have a 3D V-Cache equivalent in that article. I don't need 8 cores, so I like to avoid the extra heat where possible.
I can get behind chunky GPU heatsinks, it should keep the noise nice and low. Not really sure anyone needs to move on up though, given the stagnation in graphics quality generally (consoles and PC!). I'd much rather put the money into a nicer OLED screen than a new GPU.
lOoK aT tHe RayZ tRaCinG! TBH it's feeling more and more like the 2000 series:

- a unique new feature (or a supercharged feature that isn't being used in most games)
- a new DLSS that's exclusive to the new series
- large parts of the GPU dedicated to non-gaming (lots of tensor cores)

The new desktop Ryzens are the 7000 series too, so it's probably possible to get a double 7900 setup, for example, for ultimate bragging rights.
I watched the keynote thinking 'this is all cool, but it screams VFX more than games', even their 'game' tech demo. My overriding impression was: gamers are not the target audience of Ada, but nvidia will sell it to gamers if gamers want to stump up the cash. DLSS 3 seems great if the motion is easily predictable [as shown by at least one of the demos being of a car driving in a perfectly straight line]. If the motion is erratic and hard to predict, I can see it falling over and possibly causing visual ****ery as a result of guessing wrong.
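A back-of-the-envelope way to see the worry. This toy sketch assumes, purely for illustration, that frame generation extrapolates an object's position linearly from the previous two frames (a gross simplification of what an optical-flow-based interpolator actually does): for a car in a straight line at constant speed the guess is perfect, but for erratic motion the guess can be badly wrong.

```python
def extrapolate(prev, curr):
    """Predict the next position by carrying the last velocity forward."""
    return curr + (curr - prev)

def prediction_errors(positions):
    """Error between the linear guess and the actual next frame position."""
    return [abs(extrapolate(positions[i - 1], positions[i]) - positions[i + 1])
            for i in range(1, len(positions) - 1)]

# Constant-speed straight line: every predicted frame is spot on.
straight = [0.0, 10.0, 20.0, 30.0]
errors_straight = prediction_errors(straight)  # [0.0, 0.0]

# Erratic motion (sudden reversal then a jump): the guesses miss badly.
erratic = [0.0, 10.0, 5.0, 25.0]
errors_erratic = prediction_errors(erratic)  # [15.0, 25.0]
```

The hypothetical `extrapolate` function stands in for whatever the Optical Flow Accelerator and network actually do; the point is only that any predictor leaning on recent motion will be exact on the straight-line demo and off by a lot when motion reverses, which is where the visual artefacts would come from.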