
Graphics Nvidia 3xxx series thread

Discussion in 'Hardware' started by The_Crapman, 6 Jun 2020.

  1. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    14,886
    Likes Received:
    3,667
    It's not clickbait. Going Samsung was Nvidia's dumbest decision. It's worse than Fermi. Nuff said.

    TSMC could have handled Ampere easily enough. They've had no real issues supplying plenty of 7nm wafers to AMD. Nvidia used Samsung to try and get TSMC to cut them a cheap deal, but TSMC had plenty of "work" so refused, which backfired on Nvidia, who were then stuck with Samsung.

    It's just a simple case of Nvidia once again pissing off the people they deal with. Like that time they pissed off Intel, so Intel refused to license them any more sockets, which killed nForce stone dead.
     
  2. The_Crapman

    The_Crapman World's worst stuntman. Lover of bit-tech

    Joined:
    5 Dec 2011
    Posts:
    7,653
    Likes Received:
    3,909
    Seeing as how there's a lack of Titans it must be "best card except Titans", which would be the 680, which was also the flagship when released and the best card that generation that wasn't a Titan. Then they used the 780 later as well as the 780 Ti, so it should be 580 vs 680, then 680 vs 780 and 680 vs 780 Ti.

    So you have performance figures of an Ampere card made on the TSMC process to confirm this then?
     
    Sentinel-R1 likes this.
  3. GeorgeStorm

    GeorgeStorm Aggressive PC Builder

    Joined:
    16 Dec 2008
    Posts:
    7,000
    Likes Received:
    548
    The 680 was the top single card for the 6 series I thought? Even if it wasn't, why wasn't there a different card from that range then?
     
  4. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    14,886
    Likes Received:
    3,667

    680 was the mid-range Kepler die. 780 was large Kepler. They started segmenting their releases then to pull in the fanboys. They did it with the 9 and 10 series too (980 and 1080, when the real big cards were the 980 Ti and 1080 Ti).
     
  5. Sentinel-R1

    Sentinel-R1 Chaircrew

    Joined:
    13 Oct 2010
    Posts:
    2,379
    Likes Received:
    404
    680 was the flagship, correct. Ti's only came to be with the 7 series.
     
  6. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    14,886
    Likes Received:
    3,667
    It wasn't the flagship Kepler. It was the flagship 600 card if you want to call it that but it was still Kepler and it was tiny compared to the actual flagship, the 780 (and then the Ti as well as the Titans etc).

    GTX 680: Kepler, 294 mm², 28nm.

    GTX 780: Kepler, 561 mm², 28nm.

    Seems you are getting confused by Nvidia's branding and numbering. They basically got two numbers out of one technology. Had the 7970 been better than it was, the 780 would likely have happened first. But they realised the 680 could keep pace, so they launched the mid-range silicon first.
     
  7. Sentinel-R1

    Sentinel-R1 Chaircrew

    Joined:
    13 Oct 2010
    Posts:
    2,379
    Likes Received:
    404
    At least read George's post properly before jumping down people's throats Andy mate... He asked if the 680 was the top single card for the 6 series, which indeed it was.

    Whether they held big Kepler back from the 6 series to allow another generational performance leap for the 7 series with minimal R&D is another question - but it still stands that the 680 was the flagship card of that series.
     
    The_Crapman likes this.
  8. GeorgeStorm

    GeorgeStorm Aggressive PC Builder

    Joined:
    16 Dec 2008
    Posts:
    7,000
    Likes Received:
    548
    I mean I wasn't saying the flagship Kepler card; I personally think people tend to think in terms of the 6xx or 7xx series rather than Pascal vs Maxwell etc, but happy to be wrong on that.

    In this instance I was meaning the 6xx series: they've got a top 4 series card (although only compared to a 2 series, not the 5 series), then a 5 series, then no 6 series, then multiple 7 series, 9, 10, 20 and 30 series cards, so it seems weird to leave the 6 series out, was my point. There was about a year between the 680 and 780 I think, so I would have thought the 680 would have been the flagship card for a while.
     
    Sentinel-R1 likes this.
  9. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    14,886
    Likes Received:
    3,667
  10. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,059
    Likes Received:
    970
    A price that may be well worth paying in the long run...
    Because having TSMC with an absolute monopoly on the production of top end GPUs (as has been the case for far too long) is extremely dangerous as they get to dictate everything from what can be manufactured to how much it costs.
     
  11. rollo

    rollo Modder

    Joined:
    16 May 2008
    Posts:
    7,887
    Likes Received:
    131
    Covid is also a major factor in supply at the moment, unless you're paying for air freight, and Nvidia is not doing this by all accounts. Sony and Microsoft have had to pay big money for air freight for their latest consoles.
     
  12. Guest-44432

    Guest-44432 Guest

    All I see is what was mentioned in that video I posted on the previous page. Samsung was a third cheaper than TSMC, but with Nvidia thinking they could dictate a better price to TSMC, they only cut off their nose to spite their face, turning to Samsung to take full production in the end.
    We as consumers are left with Fermi-type GPUs all over again...

    I just hope AMD pull it out of the bag this time around, and bring to the table a GPU as fast as a 3080, more efficient, and at an undercutting price.
     
    Last edited by a moderator: 6 Oct 2020
  13. Osgeld

    Osgeld Minimodder

    Joined:
    9 Jul 2019
    Posts:
    319
    Likes Received:
    100
    [image attachment]
     
  14. Guest-44432

    Guest-44432 Guest

    Serious in the fact that I'm fed up of clowns like Nvidia and Intel taking the piss with hiked-up prices, and marketing that convinces you you're getting the best deal...

    There must be a reason why AMD have been chosen again to run in the next generation of consoles.
    Also, with the current spec leaks for Navi 21, it seems possible. Time will tell.

    P.S. I'm no fanboy of either company, I just want fair competition to bring prices down.
     
  15. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,059
    Likes Received:
    970
    So why don't you apply that same principle to TSMC and Samsung? Because that is the angle I'm coming from... competition to hopefully improve prices in the future.
     
  16. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    14,886
    Likes Received:
    3,667
    If AMD have done what they should have done a few gens ago, this round will be good for gaming.

    When they switched to GCN, Nvidia laid off the big fat GPUs and started making gaming GPUs like Kepler and Maxwell, which were cut back yet could clock balls. This was great for gaming (Pascal was even better). However, they wanted to return to the big heifers, and this is the end result (similar to Fermi).

    AMD, on the other hand, have gone back in the other direction. RDNA 2 should be much more power frugal with higher clocks. It should also have something left in the tank for overclocking, which is all better for gaming.

    Just like their last awesome launch, the 5000 series. The original 5000 series, that is, not Navi 1.
     
  17. Guest-44432

    Guest-44432 Guest

    Because with Nvidia it is just a bigger profit margin using Samsung. Regardless, if they had used TSMC for the 3xxx series we would still be paying $699 for a 3080 and $1399 for a 3090, though the coolers would not have needed to be so big and expensive to manufacture to compensate for the heat output that the Samsung node produces.
     
    Last edited by a moderator: 6 Oct 2020
  18. bawjaws

    bawjaws Multimodder

    Joined:
    5 Dec 2010
    Posts:
    4,268
    Likes Received:
    875
    We should ban the word "fanboy".
     
    Sentinel-R1 likes this.
  19. The_Crapman

    The_Crapman World's worst stuntman. Lover of bit-tech

    Joined:
    5 Dec 2011
    Posts:
    7,653
    Likes Received:
    3,909
    People keep referring to Fermi like it was some god-awful thing. The only thing awful about it was the reference cooler and the massive slab of steel it had on top that acted like a storage heater. On a decent cooler it was really tame and could be clocked really well. My Gigabyte 480 Super Overclock did not have a monstrous cooler; it was a slim card, barely two slots and standard PCI card height, and was damn near silent. And I had a case with no roof or side panels at the time, so I mean silent.

    It may have been power hungry in comparison to previous gens, but sometimes that's needed to overcome barriers to the next level of performance. Graphics cards haven't always had additional power connectors, but that's slowly crept up to two 8-pins as the norm. The more complex they become and the more transistors they have to cram on, that's only going to increase.

    And everyone's talking like the 6000 series from AMD is going to sip power like the Queen sips tea, like their track record of power consumption has been so frugal.... :hehe:
     
    Last edited: 6 Oct 2020
    Ficky Pucker, adidan and Sentinel-R1 like this.
  20. The_Crapman

    The_Crapman World's worst stuntman. Lover of bit-tech

    Joined:
    5 Dec 2011
    Posts:
    7,653
    Likes Received:
    3,909
    Didn't say why people are likening Ampere to Fermi though, did I :rollingeyes:
     
