
News Nvidia explains how much better the RTX 3090 is than other cards

Discussion in 'Article Discussion' started by bit-tech, 28 Sep 2020.

  1. bit-tech

    bit-tech Supreme Overlord Lover of bit-tech Administrator

    Joined:
    12 Mar 2001
    Posts:
    3,676
    Likes Received:
    138
    Read more
     
  2. sandys

    sandys Multimodder

    Joined:
    26 Mar 2006
    Posts:
    4,932
    Likes Received:
    727
    To be better than anything, it first needs to exist. A look around all the shops, and the complete lack of anyone outside the press posting benches etc., would suggest it is a marketing myth and doesn't exist; therefore it is no better than the card you already own :D
     
    Last edited: 28 Sep 2020
  3. perplekks45

    perplekks45 LIKE AN ANIMAL!

    Joined:
    9 May 2004
    Posts:
    7,552
    Likes Received:
    1,791
    A bit late to the party, aren't you? It was a well-known fact well before release, given the known core/clock differences between the two chips. The only people expecting a bigger performance gain were dreamers and the people stoking the hype fire to profit from it.
     
  4. sandys

    sandys Multimodder

    Joined:
    26 Mar 2006
    Posts:
    4,932
    Likes Received:
    727
    Sounds like it's limited by the fact that it only has a marginally higher power allowance than a 3080; cards that are opened up, like the Strix OC with its 390 W limit, do perform quite well. Still, that price, though. I'll treat myself if I hit my weight-loss targets. Hoping AMD come in with something at 3090 performance but priced like a 3080. Yup, I want the moon on a stick :D
     
  5. Mr_Mistoffelees

    Mr_Mistoffelees The Bit-Tech Cat. New Improved Version.

    Joined:
    26 Aug 2014
    Posts:
    5,254
    Likes Received:
    2,493
    Here you go:
    [IMG]
     
    jb0 and sandys like this.
  6. perplekks45

    perplekks45 LIKE AN ANIMAL!

    Joined:
    9 May 2004
    Posts:
    7,552
    Likes Received:
    1,791
    Actually, it's not.

    The 3090 has "20% more CUDA cores" than the 3080. Some of those additional CUDA cores are not pure FP32 units (true CUDA cores in Nvidia lingo) but combined FP32/INT32 units: they act as CUDA cores in FP32 workloads, but can be used for integer work as well. This means the absolute maximum the 3090 can be faster is 20% in fully FP32 workloads, plus whatever small percentage the higher-clocked memory offers, minus the slightly lower clock on the chip itself. 10-20% is where it should land, and 10-20% is where it lands; 20%, as I said, only in pure FP32 workloads, which rarely if ever occur in games.
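    For anyone who wants to sanity-check that ceiling, here's a quick back-of-the-envelope sketch. It assumes the launch specs Nvidia published (8704 vs 10496 shaders, 1710 vs 1695 MHz boost) and simply computes peak FP32 throughput as shaders × 2 FLOPs per clock (one FMA) × boost clock; actual gaming clocks and workloads will obviously differ.

    ```cpp
    // Back-of-the-envelope peak FP32 comparison, RTX 3080 vs RTX 3090.
    // Assumes published launch specs; real boost behaviour varies per card.
    #include <cstdio>

    struct Gpu {
        const char* name;
        int shaders;      // FP32-capable ALUs ("CUDA cores")
        double boost_ghz; // rated boost clock in GHz
    };

    // Peak FP32 = shaders * 2 FLOPs per clock (one FMA) * boost clock.
    static double peak_tflops(const Gpu& g) {
        return g.shaders * 2.0 * g.boost_ghz / 1000.0; // GFLOPS -> TFLOPS
    }

    int main() {
        const Gpu rtx3080{"RTX 3080", 8704, 1.710};
        const Gpu rtx3090{"RTX 3090", 10496, 1.695};

        const double t80 = peak_tflops(rtx3080);
        const double t90 = peak_tflops(rtx3090);

        std::printf("%s: %.1f TFLOPS peak FP32\n", rtx3080.name, t80);
        std::printf("%s: %.1f TFLOPS peak FP32\n", rtx3090.name, t90);
        std::printf("Theoretical uplift: %.1f%%\n", (t90 / t80 - 1.0) * 100.0);
        return 0;
    }
    ```

    That works out to roughly 29.8 vs 35.6 TFLOPS, i.e. about a 19-20% theoretical ceiling, which is exactly why the real-world result sits at 10-20%.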
     
  7. Locknload

    Locknload Jolly Good Egg

    Joined:
    28 Jun 2009
    Posts:
    241
    Likes Received:
    24
    There are some interesting revelations about the current state of play (no pun intended) with these cards on the JayzTwoCents YouTube channel. It is worth a look.

    They may yet have to push out a patch to lower clocks :( on some models.
     
  8. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    14,964
    Likes Received:
    3,733
    No, it is not worth a look. He's an arsehole who has no idea what he is talking about.
     
  9. Locknload

    Locknload Jolly Good Egg

    Joined:
    28 Jun 2009
    Posts:
    241
    Likes Received:
    24
    Interesting!
    Are you basing this opinion on previous experience of his content, the fact that you don't like his personality/style of presenting, or are you all up in Nvidia's lovecheeks?

    Did you take the time to see what he is saying, or are you just being instantly dismissive of him?
    May I add that I am genuinely interested in how you arrived at your statement, and I am in no way being divisive. Thanks.
     
  10. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    14,964
    Likes Received:
    3,733
    His "revelations" were nothing but reiterated information from Igor's Lab, which would have been OK, I guess (even though he tends to borrow ideas from everyone else for views, including their data etc.). However, the information was wrong.

    So immediately people started panicking and cancelling their orders because of his video, and it turns out that Nvidia simply allowed the VBIOS to clock the cards too high (AIBs went further) and thus they were unstable. It has very little to do with the filters used, and more to do with the cards clocking themselves into instability. Steve @ GN talks about the actual cause of the issue in his downhill video: basically it came down to failing to test the clocks properly (they were using Furmark, which is ironic, and it wasn't boosting the cards to their gaming clocks).

    Now, as for why Jay is an arsehole: because he is an influencer and a dictator. If you tackle what he says on Twitter etc. he will just block you, meaning he likes to be heard but he doesn't like to listen. I don't really like that, especially when he seems to know so little. His recent job of soldering a shunt resistor back onto a card, for example, was laughable.

    I've been into computers a very long time. 40 years or so. I don't like it when some guy sits and basically preaches to me yet continually gets things wrong and does stupid things. Mostly for money.

    Before he hit a million subs I actually subbed to him. However, I soon realised that if I wanted proper unpaid information I wasn't going to get it from him.

    I've also heard he's a bit of a bitch, and deliberately went out of his way to upset someone. When they confronted him he blocked them, and then when people tried to get him to apologise he just blocked them too. Not my kinda guy, TBH.
     
  11. Locknload

    Locknload Jolly Good Egg

    Joined:
    28 Jun 2009
    Posts:
    241
    Likes Received:
    24
    Thank you for the prompt and polite reply.
    I knew about the item originating from Igor, and probably should have tagged the "original" work.

    My question was one that you appeared to answer in your explanation, with regard to the build quality of these cards.
    If it is a software-related tweak, then that is one thing, but I was more intrigued by the potential durability impact of the OEMs swapping out components.

    Cheers.
     
  12. monty-pup

    monty-pup Minimodder

    Joined:
    8 Apr 2018
    Posts:
    206
    Likes Received:
    45
    The only bench I'm interested in is 2x 3080s vs a 3090.

    If the 20% performance increase of a 3090 over a single 3080 is as low as has been said, then surely 2x 3080s would be legendary?

    ————

    Well, my post lasted 30 seconds before I read that Nvidia has killed SLI. Awesome. Thanks, Nvidia.

    ————

    I had 2x 980s and they were superb. Shame!
     
  13. perplekks45

    perplekks45 LIKE AN ANIMAL!

    Joined:
    9 May 2004
    Posts:
    7,552
    Likes Received:
    1,791
    Damn, I didn't finish reading before I started writing.... again. :D
     
    monty-pup likes this.
  14. sandys

    sandys Multimodder

    Joined:
    26 Mar 2006
    Posts:
    4,932
    Likes Received:
    727
    Two 3080s would be great. Multi-GPU was supposed to become native to the OS with DX12, so you could almost understand SLI no longer being supported, but in reality not many DX12 games have been able to use it, which is a shame.
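    For what it's worth, a rough sketch of why "native to the OS" never really materialised: under DX12 there is no driver-managed SLI, so each game engine has to enumerate the adapters itself and decide how to split work between them. The snippet below (Windows-only, assumes the DXGI headers and dxgi.lib) only does the easy part, listing the GPUs; the hard part, actually distributing rendering across them, is what almost no DX12 title implements.

    ```cpp
    // Minimal DXGI adapter enumeration sketch (MSVC, Windows).
    // Illustrative only: explicit multi-GPU in DX12 starts with the app
    // discovering adapters like this, then managing them itself.
    #include <dxgi1_4.h>
    #include <wrl/client.h>
    #include <cwchar>

    #pragma comment(lib, "dxgi.lib")

    using Microsoft::WRL::ComPtr;

    int main() {
        ComPtr<IDXGIFactory1> factory;
        if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

        ComPtr<IDXGIAdapter1> adapter;
        for (UINT i = 0;
             factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
            DXGI_ADAPTER_DESC1 desc;
            adapter->GetDesc1(&desc);
            // Skip the software rasteriser; every remaining adapter is a GPU
            // the engine would have to manage explicitly for multi-GPU work.
            if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) continue;
            std::wprintf(L"Adapter %u: %ls (%zu MB dedicated VRAM)\n",
                         i, desc.Description, desc.DedicatedVideoMemory >> 20);
        }
        return 0;
    }
    ```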
     
    monty-pup likes this.