Discussion in 'Hardware' started by The_Crapman, 6 Jun 2020.
There should be an AMD option around that time too.
Unknown, unknown, no
According to the table one shop put out...
Don't expect much availability around 3070 launch.
Rumours now circulating that Nvidia have killed off the "double VRAM" Ampere cards
REPEAT: This is a RUMOUR right now!
Finally, something we agree on. While there are still cores to unlock and memory to double, we'll see a new Titan. Or a 3090 Ti, maybe?
Well that sucks. Have to wonder if this is a Covid issue or an incompetence issue. Any recommendations on what to get as a temporary replacement in that case? Main aim is going to be playing Cyberpunk at 1440p and 144Hz on decent settings. Planning a whole new build, so ignore the specs below.
I did fancy Nvidia for ray tracing, but I guess that might be where I end up aiming instead. Already planning an AMD CPU.
Funnily enough, according to the original rumour the 3070 16GB / 3080 20GB was never an official Nvidia product to begin with, just AIBs soldering on double the RAM... that rumour then morphed into it supposedly being an official Nvidia thing (which is now supposedly cancelled).
Wait for the end of the month and see what AMD brings to the table (Raytracing has already been confirmed by AMD).
Will Bit-tech have a 3070 review up before the launch date on Thursday? I'm torn between getting a 3070 (assuming I win the checkout raffle), whatever AMD announces on Thursday, or just waiting for the price of a 2070 Super or better to drop!
I rather suspect NDAs have been signed which prohibit publishing reviews before the launch event.
All signs point to the 3070 performing around the same as a 2080 Ti.
That's a reviewer's guide, so best case scenario. It's also 1440p, which indicates to me at least that the 8GB it has is not enough for any sort of 4K. It's not a good indicator of rasterisation performance either.
TBH, in this round I would expect a 4K card for £500. Not want, or like, but expect. For the actual going rate of these sodding cards (about £550 I reckon) you can buy a sexbox and a storage upgrade.
That's the actual proper one there, without being chopped down and muddy.
All the affordable 4K monitors are 60Hz only anyway (and cheap 4K TVs have a wide variety of problems as well), so many people are sticking with 1440p.
Anyway, the AMD tease showed performance close to the 3080, which logically means Big Navi will be miles ahead of the 3070.
Videocardz reckons the embargo on FE cards lifts tomorrow:
GeForce RTX 3070 Founders Edition, Embargo Tuesday, Oct. 27 @ 6AM PST
Guess the embargo for AIB cards will be on launch day again which is frankly sucky given the likely supply shortage.
Ironically it already is. I say there's irony there because it was one of the tests Nvidia used to make the 3080 look even better. When their contrived, paid-for benchmarks came out they deliberately pitted the 3080 against the 2080. They then deliberately used 4K so the 2080 would overrun its VRAM and look terrible.
Sadly Imgur appears to be down at the moment, but somewhere I have a graph showing the 2080 getting its ass handed to it by a 1080 Ti in BL3, whereas in everything else the 2080 was faster. It was because the 8GB of VRAM ran out.
I would also surmise that is why the canned benchmarks in the reviewer's guide are at 1440p. In other words, if I were very careful about the games I picked, and the resolution and settings I chose to run them at, I could show up all of the shortcomings of the 3070 and its 8GB of VRAM.
Edit. It was Doom Eternal, not BL3.
And that will be why Nvidia have avoided 4k benchmarks for this launch and stuck to 1440p.
Yup, and that is the 6800 XT with 72 CUs apparently, not the 6900 XT with 80, or the possible XTX with 80 and redonkulus clock speeds.
I'd say AMD have a good few cats for Nvidia's pigeons coming. Especially as they are 16GB and thus much more future proof.
It is. 8GB is not enough in that title. I'll find the video I took the screenshot from.
It's not just that game either. FS takes more than 8GB at 4K too. The 2080 trumps the 1080 Ti at every resolution in that game until you reach 4K, where it runs out of VRAM against the 11GB of the 1080 Ti and starts texture streaming.
Look man, not going to argue with you over this. If you need more information then go and talk to Kaapstad on OCUK. He buys every high end GPU that comes out, runs all of his games at 4K, and I am telling you there is quite a big handful of them that run out of VRAM. Why do you think I was so pissed at the 3080 having less than the 2080 Ti? Because quite simply I reckon these next gen games (when they actually see the light of day) are going to use all of that 10GB and more at 4K on a PC, where you have to load in assets and other garbage the consoles don't even need because they are a set standard. It's part of the reason why COD MW (the new one) is over 250GB now. It's all assets for all types of systems that not everyone needs, hence why they are now allowing you to remove chunks of it at install.
Shadow of Mordor is how old now? That overruns 8GB at 4K, and did when it launched.
Just think about this logically. The 3070 should be at its best at 4K, no? Mostly down to its herculean shader count, and its lack of quickness when all of those shader units are not kept busy. That is why, when you step away from 4K on the 3080, its lead over the 2080 Ti starts to shrink quite dramatically, most notably at 1440p.
So why would Nvidia not have the card reviewed at 4K, painting it in the best possible light? "Oh wowzors, look, this card can do 4K and it's only £500!" Because, put quite simply, it can't. Like I said, there are many games out there that will overrun that buffer and bring it to its knees. Just go and talk to Kaapstad on OCUK if you need more information on the specific titles.
Same or similar engine, old game. Look what happens to the frame times when it's texture streaming over 8GB.
I'm not sure if you know what it means (texture streaming). It started back on the 400 series IIRC. I had a 470 1.3GB at the time. I was trying to play through BF3, but a certain level would stutter and run like garbage. It was in a shopping mall, and you had a sniper rifle and had to shoot terrorists as they came in, protecting the guy you captured. Anyway, that level was a stuttering mess and because of that I literally could not complete it. Nvidia did an article about texture streaming, and basically what it used to do was offload those textures to your paging file on the hard drive. Meaning your FPS was all over the place, and at times down by about 70%. The "fix"? More VRAM. That was when I upgraded to a 6970 2GB. Aced the level in less than two minutes.
Now it has changed. It has moved from your paging file (imagine that on a rusty spinner) to your RAM, providing you have it. However, physical RAM is nowhere even near as fast as VRAM, hence it still tanks your performance and your percentiles.
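To put some rough numbers on why the spill hurts so much, here's a toy Python sketch. The bandwidth figures are illustrative only (448 GB/s is a 2080-class card's memory bandwidth; 16 GB/s is roughly PCIe 3.0 x16, the path the spilled textures stream over), and real streaming is cleverer than this since drivers prioritise which assets spill.

```python
# Toy model: once the texture working set exceeds VRAM, the overflow
# streams from system RAM over PCIe, dragging effective bandwidth down.
# All numbers are illustrative assumptions, not measurements.

VRAM_GB = 8
VRAM_BW = 448.0  # GB/s - local VRAM bandwidth, 2080-class card
SPILL_BW = 16.0  # GB/s - roughly PCIe 3.0 x16, path for spilled textures

def effective_bandwidth(working_set_gb: float) -> float:
    """Blended bandwidth when part of the working set spills past VRAM."""
    if working_set_gb <= VRAM_GB:
        return VRAM_BW
    vram_frac = VRAM_GB / working_set_gb
    spill_frac = 1.0 - vram_frac
    # Time to move one GB is the weighted sum of times at each tier.
    time_per_gb = vram_frac / VRAM_BW + spill_frac / SPILL_BW
    return 1.0 / time_per_gb

for ws in (6, 8, 10, 12):
    print(f"{ws} GB working set -> {effective_bandwidth(ws):.0f} GB/s effective")
```

Even with only a fifth of the working set spilling over (10GB on an 8GB card), effective bandwidth collapses to around a sixth of what the VRAM alone can do, which is why the percentiles fall off a cliff rather than degrading gently.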
There is no benchmark in Wolf 2. I don't recall there being one in Doom either.
The reason those games slowed so dramatically was because they were texture streaming. Like I said, the VRAM runs out, the card puts textures elsewhere and then streams them in and out of the VRAM, tanking your performance completely. Partly it sounds like you just want to argue about being right (not sure why, when the facts totally disagree), and partly you say you accept it because it happened to you in the past. Remember, like I said, these are games being run, not benchmarks. Even if they were, I am pretty darn sure a built-in benchmark in a game would be suitably useless, as they usually are, because they do not depict gameplay. They depict one scene, running at a given level, the same each time. Maybe a good way to see which card is faster, but not to highlight every performance issue you are going to encounter throughout the entire game.
And no, I am not just "basing my facts". Those are facts. I've had several cards that ran out of VRAM (the Fury X was another "apparently there's enough VRAM" card) and it literally hung your PC. Until AMD "fixed it" by getting game devs to hide settings in games, and eventually it basically started texture streaming. Where a good game of COD BLOPS III would go from 60 FPS down to about 4 very quickly as you turned a corner in certain levels. And of course latency fell off a cliff (hence my problem with shooting terrorists through a scope: by the time the bullet came out they had made it up the stairs).
It's quite easy to test it yourself. Simply enable DSR, then run FCAT or something similar. Here you can see what happens to the 2080 8GB at 5K running an old game.
Those spikes are caused by textures being streamed. Which is not my opinion, or some guy's benchmark or anything else.
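If anyone wants to see what those spikes do to the numbers, this is roughly the maths frame-time tools like FCAT do on a capture. The frame times below are made up purely to mimic a run with streaming stalls, not taken from any real card:

```python
# Minimal sketch of frame-time analysis: average FPS vs 1% low.
# Synthetic data for illustration: 95 smooth ~60 FPS frames plus
# 5 big stalls of the kind texture streaming causes.

frametimes_ms = [16.7] * 95 + [80.0] * 5

def summarise(frames):
    total_s = sum(frames) / 1000.0
    avg_fps = len(frames) / total_s
    # 1% low: FPS implied by the slowest 1% of frames.
    worst = sorted(frames, reverse=True)
    n = max(1, len(frames) // 100)
    one_pct_low = 1000.0 / (sum(worst[:n]) / n)
    # Crude spike count: frames over 3x the fastest frame time.
    spikes = sum(1 for f in frames if f > 3 * min(frames))
    return avg_fps, one_pct_low, spikes

avg, low, spikes = summarise(frametimes_ms)
print(f"avg {avg:.0f} FPS, 1% low {low:.1f} FPS, {spikes} spike frames")
```

Note the average still reads around 50 FPS while the 1% low collapses to 12.5: an average-FPS bar chart hides exactly the stutter those frame-time graphs expose.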
Like I said, riddle me this: why is the 3070 being reviewed at 1440p and not 4K? The 2080 Ti is a 4K card. So why not compare apples to apples?
Because they know damn well it would be a horror show. A horror show that will absolutely no doubt be coming to your screen Soon™, as there is no way Steve @ GN is going to follow those guidelines, and nor will Hardware Unboxed. Linus, Jay etc.? Oh yeah, they'll be pulling themselves to pieces no doubt.
The reality is, those with 4K monitors don't normally use any type of AA. So the chance of maxing out 8GB of VRAM that is actually needed, and not just cached, at 4K res is a little while off yet.
With the likes of DLSS and AMD's equivalent technology, VRAM usage will drop. So your argument is not really valid, especially on badly optimised console ports.
The fact that you use the term "console port" doesn't exactly indicate to me that you know what you are talking about. They are not "ports". You don't write a game and then "port" it over to a PC. You write code, and then you write the libraries needed to make it work on a PC. That isn't "porting" something, it's writing the additional code required to run the game on a PC. It's just code... text... and textures and sounds. A PC game requires an awful lot of that, mostly because it doesn't have just one setting. There are tons and tons of different hardware configurations you need to create these assets for, configurations that don't even exist on a console.
Maybe you should take the time to understand it better.
There will be some scenarios where you can get away with less VRAM use. It's called dropping settings. Anyone can do it. However, when you spend as much as Nvidia are asking for these gaming GPUs you don't expect to drop settings. In fact, on the contrary, you usually have a very high end GPU, right? Why is that, dude? Performance, perhaps? OK, so yes, it must be. That must be why you sold your 2080 Ti off for much less than you probably paid for it (or maybe around the same, depending when you bought it) and got in line for a 3080. Why did you do that? Let me tell you: performance. So imagine how pissed you would be if you got that card in your rig on day one, only to find you needed to drop settings because the 1GB deficit in VRAM was causing you issues. Please don't even try and tell me you would be happy about that.
Now thankfully 10GB is enough for now. Like, now, as I type this. What is going to happen once devs start pumping out next gen games to be run on hardware twice as fast as what they already had in a console? God only knows. However, if 8GB of VRAM is not enough for 4K now (again, as I type this) then it certainly isn't going to be any time soon.
OK, fine, so it's a 1440p card, right? I mean, if you are (for example) jumping from a 1080 Ti to a 3070 (which would make perfect logical sense) then you are already giving up options you had. Personally speaking, I find £500 a lot of money for a 1440p card. Especially when the 2070 Super and 2080 Super already do a fine job.
On the 3070 however we are not dropping settings here. We are literally dropping a resolution. And, at £500 I don't think that is good enough. And I say £500 when I know for a fact that finding one at £500 will be almost impossible, so let's be more realistic and say £550.
AMD can and probably will destroy it before it even gets into the hands of users.