
Graphics Nvidia 3xxx series thread

Discussion in 'Hardware' started by The_Crapman, 6 Jun 2020.

  1. adidan

    adidan Guesswork is still work

    Joined:
    25 Mar 2009
    Posts:
    19,794
    Likes Received:
    5,588
    I take issue - I am both uneducated and I don't understand.

    Put that in your pipe and chew on it, grandma.
     
    enbydee and Guest-44432 like this.
  2. spolsh

    spolsh Multimodder

    Joined:
    4 Feb 2012
    Posts:
    1,559
    Likes Received:
    821
    So it's taken them 10 years to match Fermi's pulling-electricity-straight-from-the-plug-socket power draw? What have they been doing? LOL
     
  3. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    14,935
    Likes Received:
    3,713
    It's not about the power consumption. TBH, no one really cares about that. They never have and they never will, as long as the performance leap is there.

    It's the performance leap that isn't there. That is why it is being compared to Fermi: Fermi was also disappointing given their track record before it. That said, I think many would love a bit of a Fermi-style bump right now; they just haven't had one since Pascal.
     
  4. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    14,935
    Likes Received:
    3,713
    From what I have seen, no one has really complained about having to buy a new PSU, new case, etc.

    That said, these are the die-hards I suppose, because everyone else is content to wait.

    But yeah, I would bet PSU sales are up.
     
  5. The_Crapman

    The_Crapman World's worst stuntman. Lover of bit-tech

    Joined:
    5 Dec 2011
    Posts:
    7,669
    Likes Received:
    3,926
    According to the chart that's being referred to, Turing (2080 Ti) is the worst generational leap, followed by Pascal (980 Ti).
    [chart: generational performance uplift]

    A chart which I still say is broken, but you all seem to love it, even though it disproves your point, so..... :worried:
     
  6. Aytos

    Aytos Minimodder

    Joined:
    9 Aug 2020
    Posts:
    41
    Likes Received:
    31
    As with almost every modern GPU, average power draw isn't a good indicator for choosing a PSU. Today's GPUs are highly transient (they are much better at managing power) and can peak far higher for very short moments, which can trip OCP/OPP.

    [chart: transient power spikes, via Igor's Lab]

    I've seen people get a new PSU after initial runs of a 3080 on the old one, but they're few and far between - part of the reason probably being what 30xx supply looks like right now.
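
    A minimal back-of-envelope sketch (Python) of that point - the spike factor, rest-of-system draw, and the simplified OCP/OPP behaviour are illustrative assumptions, not measured values:

    Code:
    # Why average draw alone can undersize a PSU: short transient spikes.
    # All numbers below are illustrative assumptions, not measurements.

    avg_gpu_w = 320          # RTX 3080 rated board power (spec sheet)
    spike_factor = 1.7       # assumed millisecond-scale transient multiplier
    rest_of_system_w = 180   # assumed CPU + motherboard + drives under load

    peak_w = avg_gpu_w * spike_factor + rest_of_system_w  # ~724 W momentary

    for psu_w in (650, 750, 850):
        # Simplification: treat OCP/OPP as tripping near the rated wattage.
        verdict = "OK" if peak_w <= psu_w else "may trip OCP/OPP"
        print(f"{psu_w} W PSU vs ~{peak_w:.0f} W peak: {verdict}")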
     
    Last edited: 6 Oct 2020
    Paradigm Shifter and Guest-44432 like this.
  7. Osgeld

    Osgeld Minimodder

    Joined:
    9 Jul 2019
    Posts:
    319
    Likes Received:
    100
    Should and will are different things. Listen, I have owned MANY ATI/AMD cards - at this point more than Nvidia ones - and there are a few common things over the decades:

    ATI / AMD will make a good card
    ATI / AMD will trade punches with the competition
    ATI / AMD will end up as the better bang for buck, but with a power consumption and heat penalty

    It happens every friggin' time: you can get an almost-as-good video card for a decent chunk less (a discount that shrinks every gen), but it runs hot as hell, and it's been this way for a minimum of a decade! AMD's acquisition of ATI turned the king of gaming into Bulldozer and second-rate GPUs; ATI was such a resource sink that it's taken over a decade for AMD to even become an option again ... and to date, the only reason they're back on the CPU map is that Intel has had their thumb up their bum since 2013 ... Nvidia has not.
     
  8. Osgeld

    Osgeld Minimodder

    Joined:
    9 Jul 2019
    Posts:
    319
    Likes Received:
    100
    Dunno about over there, but I paid less for an Asus X470 board than for the 650 W PSU that powers it.
     
  9. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,994
    Likes Received:
    713
    The key for me is how, on paper, the 3080 draws 28% more power for 30-40% more performance compared to the 2080 Ti. This is not the generational leap we had been waiting for. I was going to get the 3070 - 256-bit bus, 8 GB, drawing 220 W - but it looks to have very similar performance-per-watt to the 2080 Ti with its fatter bus and 11 GB, drawing 250 W.

    Nvidia said double the performance per watt on 1st September..........
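
    Running the quoted numbers through quick arithmetic (Python sketch; 320 W and 250 W are the two cards' spec board powers, the uplift range is from above):

    Code:
    # Perf-per-watt implied by the figures in this post.
    power_3080, power_2080ti = 320, 250       # W, spec board power
    power_ratio = power_3080 / power_2080ti   # ~1.28, i.e. "28% more power"

    for uplift in (1.30, 1.40):               # 30-40% more performance
        ppw = uplift / power_ratio
        print(f"{uplift - 1:.0%} faster -> {ppw - 1:+.0%} perf-per-watt")

    # Result: roughly +2% to +9% perf-per-watt at these operating points -
    # nowhere near double. (Nvidia's "1.9x" figure was reportedly measured
    # at a fixed performance level, lower down the power/frequency curve.)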
     
    Guest-44432 likes this.
  10. Osgeld

    Osgeld Minimodder

    Joined:
    9 Jul 2019
    Posts:
    319
    Likes Received:
    100
    Hearsay and speculation
     
  11. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    14,935
    Likes Received:
    3,713
    MSI caught scalping their own customers.
     
  12. The_Crapman

    The_Crapman World's worst stuntman. Lover of bit-tech

    Joined:
    5 Dec 2011
    Posts:
    7,669
    Likes Received:
    3,926
    SLI has slowly been stripped from lower-end cards over the last few gens because they finally caught on to how people would rather buy a couple of cheaper lower-end GPUs than the top-end one. That's just good business. GPU power has begun to outstrip the demands and complexity of game engines - Nvidia had to introduce ray tracing to make anything more powerful necessary - so it just isn't needed. Plus I bet it ate a massive chunk of driver dev time for a tiny portion of the market. The only thing that should be mourned with the passing of SLI is the cool-looking multi-card rigs.

    290X? You mean the card that was power hungry
    [chart: power draw]

    Hot
    [chart: temperatures]
    And deafening
    [chart: noise levels]

    You can't take one company's TFLOP figure and use it for a direct like-for-like comparison with another's; no one knows what they've done to calculate it or how it's calculated, just like how TDP isn't measured the same across all companies. Particularly when talking about a console, which has fewer overheads and go-betweens from game to chip. I hope it's at least partly accurate, because we need some competition in the GPU market, the distinct lack of which has led to Nvidia charging whatever they want because there's no alternative.
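
    For what it's worth, here's how a headline FP32 TFLOPS figure is usually derived (Python sketch; shader counts and boost clocks are the public spec-sheet values, and the caveat in the comments is the standard explanation rather than anything from this thread):

    Code:
    # Theoretical FP32 TFLOPS = shaders x boost clock x FLOPs per clock.
    def tflops(shaders, boost_ghz, flops_per_clock=2):  # 2 = one FMA/cycle
        return shaders * boost_ghz * flops_per_clock / 1000

    print(tflops(8704, 1.710))   # RTX 3080    -> ~29.8 TFLOPS
    print(tflops(4352, 1.545))   # RTX 2080 Ti -> ~13.4 TFLOPS

    # On paper that's ~2.2x, but real-game uplift is far smaller: Ampere
    # counts dual-issue FP32 units that games can't always keep fed, so
    # peak TFLOPS overstates delivered performance - one reason the
    # figure isn't like-for-like across architectures or vendors.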
     
  13. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    14,935
    Likes Received:
    3,713
    :hehe:

     
  14. sandys

    sandys Multimodder

    Joined:
    26 Mar 2006
    Posts:
    4,929
    Likes Received:
    726
    I had 3x 290s overvolted under water with an OC'd CPU; when running full load it was pulling 1.1-1.2 kW from the wall :eek: Didn't give it a second thought - electricity is cheap :p :D That was before I understood there were efficiencies to be gained from a better PSU. Not that mine was bad, just old, and the tech improved over time: it was about 65% efficient compared to my upgraded 90+%, so the swap dropped power draw somewhat. I also had some 480s I think, might have been 470s, can't remember now.
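
    To put rough numbers on that efficiency point (a quick Python sketch using the wall figure and efficiencies above; the assumption is that the DC load itself stayed the same):

    Code:
    # PSU efficiency: same DC load, different wall draw.
    wall_old_w = 1150                 # ~1.1-1.2 kW measured at the wall
    eff_old, eff_new = 0.65, 0.90     # old vs upgraded PSU efficiency

    dc_load_w = wall_old_w * eff_old  # ~748 W actually delivered to parts
    wall_new_w = dc_load_w / eff_new  # ~831 W at the wall after upgrade

    print(f"DC load ~{dc_load_w:.0f} W; wall draw {wall_old_w} W -> "
          f"~{wall_new_w:.0f} W (~{wall_old_w - wall_new_w:.0f} W saved)")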

    The current state of multi-GPU is disappointing, as I would pick up 2x 3080 in a heartbeat if it was supported. This is why Nvidia has pulled SLI for this level of card: it would be better value than a 3090. No one should be overjoyed at its death; it was always a great way to get future performance today. The latest cards are only just doing 4K at OK-ish frame rates, and those of us with multiple or big screens want to push more pixels.
     
  15. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,061
    Likes Received:
    970
    They effectively killed SLI completely, as they won't create any SLI profiles next year (and AMD will kill CrossFire completely soon enough).

    Both DX12 and Vulkan contain tools for game developers to implement multi-GPU support (without either Nvidia or AMD having to do any work).
    So really, you should be pointing fingers at the game developers for the state of multi-GPU, not Nvidia or AMD.
     
  16. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    14,935
    Likes Received:
    3,713
    Unfortunately, like any feature that only applies to 1% of users, game devs will usually skip it and save the cash.

    Back in the day AMD (well, ATI) and Nvidia used to court devs to get it put in. Sadly that all stopped.
     
  17. sandys

    sandys Multimodder

    Joined:
    26 Mar 2006
    Posts:
    4,929
    Likes Received:
    726
    Strictly speaking, whilst Nvidia has pulled the SLI connector from these cards, the bridge was only there for bandwidth/reliability reasons; bridgeless multi-GPU over PCIe Gen 4 should be no problem.
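
    A quick sanity check on the bandwidth side (Python; the formula is the standard lanes x transfer rate x encoding efficiency, and the bridge comparison in the comment is an approximation, not a measured figure):

    Code:
    # Per-direction PCIe link bandwidth in GB/s.
    def pcie_gb_s(lanes, gt_per_s, enc=128 / 130):  # 128b/130b line encoding
        return lanes * gt_per_s * enc / 8

    print(pcie_gb_s(16, 8))    # PCIe 3.0 x16 -> ~15.8 GB/s
    print(pcie_gb_s(16, 16))   # PCIe 4.0 x16 -> ~31.5 GB/s

    # The old SLI/SLI-HB bridges carried on the order of single-digit
    # GB/s, so a Gen 4 x16 link has ample headroom for peer-to-peer
    # traffic - consistent with the bridgeless argument above.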
     
  18. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,061
    Likes Received:
    970
    If the game developer did the work then the work would only need to be done once.
    If Nvidia and AMD do the work then the work needs to be done twice (or three times if the Intel GPU ever becomes real).

    So clearly getting the game developers to do it should be the easier (because cheaper) way.
     
  19. Pete J

    Pete J Employed scum

    Joined:
    28 Sep 2009
    Posts:
    7,247
    Likes Received:
    1,805
    Blimey, the tri-SLI 560s were a long time ago. I also had SLI'd 260s, which worked out extremely well for price/performance.

    I game less nowadays and it'd be pointless for me to upgrade at this point in time, but I'd definitely be interested in two 20GB 3080s for SLI if it were possible, as I think I could still get away with a 1000W PSU, as well as it being cheaper.

    Though, hopefully, when the 4000 series comes along, I'll be looking to upgrade. As it is, my SLI'd Titan Xps are still dominating over four years on and don't show any sign of needing replacement soon, as RTX is still not prevalent.
     
  20. Bloody_Pete

    Bloody_Pete Technophile

    Joined:
    11 Aug 2008
    Posts:
    8,435
    Likes Received:
    1,109
    Would you pay for SLI as a subscription-based service to ensure ongoing support, though? That's what I see these 'price hikes' as: an SLI surcharge to continue ongoing support (not that they are). And that's the long and short of it - if a company can't afford to put devs on something because those devs are better used elsewhere, then of course they'll kill support for that feature.
     
