Discussion in 'Article Discussion' started by bit-tech, 28 Sep 2020.
To be better than anything, it first needs to exist. A look around all the shops, and the complete lack of anyone outside the press posting benchmarks, would suggest it's a marketing myth and doesn't exist; therefore it's no better than the card you already own.
A bit late to the party, aren't you? It was a well-known fact well before release, due to the known core/clock differences between the two chips. The only people expecting more than that in performance gain were dreamers and the people stoking the hype fire to gain from it.
Sounds like it's limited by the fact it only has marginally more power allowance than a 3080; cards that are opened up, like the Strix OC with its 390W limit, do perform quite well. Still, that price though. I'll treat myself if I hit my weight-loss targets. Hoping AMD come in with something at 3090 performance but priced like a 3080; yup, I want the moon on a stick.
Here you go:
Actually, it's not.
The 3090 has "20% more CUDA cores" than the 3080. Some of those additional CUDA cores are not pure FP32 units (true CUDA cores in Nvidia lingo) but combined FP32/INT32 units. These units act as CUDA cores in FP32 workloads but can be used for integer work as well. This means the absolute maximum the 3090 can be faster is roughly 20% in fully FP32 workloads, plus whatever small percentage the higher-clocked memory offers, minus the possibly lower clock on the chip itself. 10-20% is where it should land, and 10-20% is where it lands. The 20%, as I said, applies to pure FP32 workloads, which rarely if ever occur in games.
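The ceiling argument above can be sketched as a quick back-of-envelope calculation. The shader counts and boost clocks below are Nvidia's published reference-card figures, not numbers from this thread:

```python
# Rough sketch of the theoretical FP32 ceiling argument above.
# Figures are Nvidia's reference specs (assumed, not from this thread).
cores_3080, boost_mhz_3080 = 8704, 1710
cores_3090, boost_mhz_3090 = 10496, 1695  # slightly lower boost clock

def peak_fp32_tflops(cores, boost_mhz):
    # Peak FP32 throughput: shader count * 2 FLOPs per cycle * clock.
    return cores * 2 * boost_mhz * 1e6 / 1e12

uplift = (peak_fp32_tflops(cores_3090, boost_mhz_3090)
          / peak_fp32_tflops(cores_3080, boost_mhz_3080)) - 1
print(f"Theoretical FP32 ceiling: +{uplift:.1%}")
```

With these reference clocks the ceiling works out to just under +20%, which is why real games, with their mixed FP32/INT32 workloads, land in that 10-20% window.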
There are some interesting revelations on the current state of play (no pun intended) with these cards on JayzTwoCents' YouTube channel. It is worth a look.
They may yet have to put out a patch to lower clocks on some models.
No it is not worth a look. He's an arsehole who has no idea what he is talking about.
Are you basing this opinion on previous experience of his content, the fact you don't like his personality/style of presenting, or are you all up in Nvidia's lovecheeks?
Did you take the time to see what he is saying, or are you just being instantly dismissive of him?
May I add that I am genuinely interested in how you arrived at your statement, and I am in no way being divisive. Thanks.
His "revelations" were nothing but reiterated information from Igor's Lab. Which would have been OK, I guess (even though he tends to borrow ideas, data and all, from everyone else for views). However, the information was wrong.
So immediately people started panicking and cancelling their orders because of his video, and it turned out that Nvidia simply allowed the VBIOS to clock the cards too high (AIBs went further), and thus they were unstable. It has very little to do with the filter capacitors used, and more to do with the cards clocking themselves into instability. Steve @ GN talks about the actual cause of the issue in his video on the whole debacle: basically, partners failed to test the clocks properly (they were using Furmark, which is ironic, and it wasn't boosting the cards to their gaming clocks).
Now, as for why Jay is an arsehole: because he is an influencer and a dictator. If you tackle what he says on Twitter etc. he will just block you, meaning he likes to be heard but he doesn't like to listen. I don't really like that, especially when he seems to know so little. His recent job of soldering a shunt resistor back onto a card, for example, was laughable.
I've been into computers a very long time. 40 years or so. I don't like it when some guy sits and basically preaches to me yet continually gets things wrong and does stupid things. Mostly for money.
Before he hit a million subs I was actually subscribed to him. However, I soon realised that if I wanted proper, unpaid-for information, I wasn't going to get it from him.
I've also heard he's a bit of a bitch, and deliberately went out of his way to upset someone. When they confronted him he blocked them, and then when people tried to get him to apologise he just blocked them too. Not my kinda guy, TBH.
Thank you for the prompt and polite reply.
I knew about the item originating from Igor, and probably should have tagged the "original" work.
My question was one that you appeared to answer in your explanation, with regards to the build quality of these cards. If it's a software-related tweak, then that is one thing, but I was more intrigued by the potential durability impact of the OEMs swapping out components.
The only bench I'm interested in is 2x 3080s vs a 3090. If the performance increase of a 3090 over a single 3080 is as low as 20%, as has been said, then surely 2x 3080s will be legendary?
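For what it's worth, the arithmetic behind that hunch is easy to sketch. The scaling efficiencies below are hypothetical round numbers, since SLI rarely scaled perfectly:

```python
# Back-of-envelope: 2x 3080 vs one 3090, under hypothetical SLI scaling.
one_3080 = 1.00   # baseline
one_3090 = 1.15   # ~10-20% faster per the discussion above

for efficiency in (0.5, 0.7, 0.9):  # assumed second-card scaling factors
    dual_3080 = one_3080 * (1 + efficiency)
    print(f"{efficiency:.0%} scaling: 2x 3080 = {dual_3080:.2f}x "
          f"vs 3090 = {one_3090:.2f}x")
```

Even at a mediocre 50% scaling, a second 3080 would beat the 3090's uplift on paper, which is exactly why the SLI news below stings.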
Well my post lasted 30 seconds before I read that Nvidia has killed SLI. Awesome. Thanks Nvidia.
I had 2x 980s and they were superb. Shame!
Damn, I didn't finish reading before I started writing.... again.
2x 3080s would be great. Multi-GPU was supposed to become native to the OS with DX12, so you could almost understand no longer supporting SLI, but in reality few DX12 games have been able to use it, which is a shame.