Half-Life: Alyx and Asgard's Wrath both used as much VRAM as they could get, over 10GB constantly. Alyx was playable on a 970 with 3.5GB, but Asgard's Wrath was totally unplayable with 3.5GB. With a 1070 Ti 8GB, there were reprojection and frame timeouts. Zero problems maxing out with an 11GB GPU. Point is, 4K isn't the only reason for more VRAM. When I saw the 3070 had 8GB, I said to myself I won't be making the 970 memory mistake again. This upcoming generation really should have 16GB as standard.
Ah see, it's quite complex though, fella. Some games will eat that much VRAM but don't actually need it, as in need it to stop them using your PC in other ways. I demonstrated that recently with COD MW on my 2080 Ti: it was using 10.5GB @ 1440p. However, on some cards it uses less and still doesn't stream. So there's kinda a difference between "Here, you have more VRAM than you need, use it" and "Oh no, there isn't enough, let's start streaming from physical RAM and ruin the game".

Overall though? I totally agree, it should have had more than that. And so should the 3080. It's almost like Nvidia want to give away as little as possible... for a reason. Yeah, that reason is that when your card gets utterly slammed by enormous next-gen console "ports", ahem, your card will start crying. If you're OK with turning down settings to fit within the parameters of a card, that's fine, and it's entirely possible too. However, at top-end prices you don't expect to have to make sacrifices.

In reality the 3070 really needed to be able to take on the next-gen consoles. It's bad enough that on top of that £500 you need to find another £500 for the rest of the PC, but if it can't even do what the next-gen consoles are promising (the high-end ones like the XBSX @ £450) then to me that's an issue, because the 1440p Xbox costs waaaaay less than the 3070. So to me? That meant 4K. But if there isn't enough VRAM there, then it is not a 4K card.

There are a few titles my 2070 Super could easily run at 4K. There are some it could easily run at 4K if it had more VRAM. However, it doesn't. That means 4K Ultra Nightmare on Doom Eternal (which the card could do with ease IMO) is hampered by the fact it would fall to its knees because it doesn't have enough VRAM. And thus a spade is a spade. So the 3070 is a 1440p card, meaning already it is not as "good" as a 2080 Ti. Remember there I said good, not fast. Fast only gets you so far, then the latency starts coming in...
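If anyone wants to eyeball the allocated-vs-needed distinction on their own card, the usage numbers are easy to pull out of nvidia-smi. A minimal sketch (assumes an NVIDIA card with the stock nvidia-smi tool; the sample output string below is made up for illustration, the 10.5GB figure echoing the COD MW case above):

```python
def vram_usage_mib(csv_text: str):
    """Parse 'memory.used, memory.total' rows from nvidia-smi CSV output."""
    rows = []
    for line in csv_text.strip().splitlines()[1:]:  # skip the header row
        used, total = (int(field.split()[0]) for field in line.split(","))
        rows.append((used, total))
    return rows

# On a real machine you'd feed this the output of:
#   nvidia-smi --query-gpu=memory.used,memory.total --format=csv
# Sample output, invented for illustration (one 11GB card, ~10.5GB in use):
sample = "memory.used [MiB], memory.total [MiB]\n10752 MiB, 11264 MiB\n"
print(vram_usage_mib(sample))  # [(10752, 11264)]
```

Log that once a second while playing and you can see whether a game merely fills whatever it's given or actually climbs to the ceiling and starts streaming.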
AMD know this all too well.
I'll order whatever card is faster than a 3090, be it red or green. But since I don't expect Big Navi to beat the 3090 that means very little until the next generation of cards.
Thread tidied, two members given the rest of the week on the naughty step. C'mon, people, don't make me be the voice of reason here, I don't like it. I still want more VRAM, if anyone cares. Not for games, tho', I've only got a 1200p monitor - for CUDA stuff.
I'll buy the one that gets me the most performance for the dough I have at the time. I've run both companies' cards without driver issues, whether single or multi-GPU, so I'm happy either way. When I previously ran AMD they had much better HDR ability through Windows on a TV; I didn't have to mess about with games vs desktop settings when on HDMI as much as with my 1080. I'd like to know how the landscape is this time; I imagine both must work quite well these days. Plenty of time though, I'm not allowing myself a hardware treat until I hit my weight-loss target. Let's hope I hit it this gen and not next.
3070 FE reviews are in: https://www.guru3d.com/articles-pages/geforce-rtx-3070-founder-review,1.html https://www.kitguru.net/components/...oass/nvidia-rtx-3070-founders-edition-review/ https://www.techpowerup.com/review/nvidia-geforce-rtx-3070-founders-edition/
Rumours do say shops are getting more of the 3070 than they did for the 3080 / 3090... but then again that doesn't necessarily mean much, since cheaper cards tend to sell in far higher quantities, so it may still end up in shortages.
I still don't understand the hate on AMD's drivers, they've always worked perfectly fine for me. Then again I don't really play games as they come out or really do massive AAA titles so maybe I'm just wandering around in blissful ignorance
From launch they're horrific. Takes me 2-3x as long to benchmark due to drivers that don't work, last-minute BIOS swaps, and non-functioning overclocking.
This is great to see: https://www.techpowerup.com/review/nvidia-geforce-rtx-3070-founders-edition/36.html Although I don't see any 3070 for less than £510. (sort by price, click into it to show asking price) https://www.overclockers.co.uk/pc-components/graphics-cards/nvidia/geforce-rtx-3070?sPage=1&sSort=3
Yeah, I remember LTT talking about one launch where they dropped a driver revision 24 or 48 hours before the launch, so all the reviewers were frantically updating. Sounds like total BS. Then there was the fun they played with pricing (mainly in the US, I think), where they gave retailers a discount at launch, so it didn't appear in the retail pricing but made the price/performance look way better. Then two weeks later they quietly removed the discount... I heard the opposite, TBH; most of the speculation is that Nvidia was pumping as much fab time as possible into the higher-end cards. I think this launch will be even worse, if anything...
£510 for 2080 Ti performance: this is the bargain of the decade. If I were a buyer, it's the card I would be buying.
Agreed, great value. Supply will be interesting to watch. The 3080 is also great value for money, if you can get one.
Don't forget it has less VRAM though, which will impact higher resolutions and any machine-learning sort of stuff you may do.
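For the machine-learning angle, a back-of-the-envelope sketch shows how fast 8GB disappears in training. (The 16-bytes-per-parameter rule of thumb below is a common rough estimate for fp32 weights with Adam, not a figure from any specific framework, and it ignores activations, which cost extra.)

```python
def training_vram_gb(n_params: float, bytes_per_param: int = 4,
                     extra_copies: int = 3) -> float:
    """Rough VRAM estimate for training, before activations.
    With fp32 weights and Adam: 1 weight + 1 gradient + 2 optimizer
    moments = 4 copies of each parameter at 4 bytes each (~16 B/param)."""
    total_bytes = n_params * bytes_per_param * (1 + extra_copies)
    return total_bytes / 1024**3

# A 500M-parameter model trained in fp32 with Adam:
print(round(training_vram_gb(500e6), 1))  # 7.5
```

So a 500M-parameter model already brushes an 8GB card's ceiling before a single activation is stored, while it sits comfortably on 11GB or 16GB.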