
Graphics Nvidia 11-series GPUs allegedly confirmed

Discussion in 'Hardware' started by IanW, 27 Mar 2018.

  1. IanW

    IanW Grumpy Old Git

    Joined:
    2 Aug 2003
    Posts:
    9,151
    Likes Received:
    2,656
  2. yuusou

    yuusou Multimodder

    Joined:
    5 Nov 2006
    Posts:
    2,852
    Likes Received:
    916
    Bar an hour or two every few weeks, the GPUs for sale on Nvidia's website are always out of stock, simply because it's the only place selling them at RRP (hence the limit per customer).
    I look forward to the 11 series, however. Glad they're calling it the 11 series instead of 20. 20 just seemed silly.
     
    Arboreal likes this.
  3. IanW

    IanW Grumpy Old Git

    Joined:
    2 Aug 2003
    Posts:
    9,151
    Likes Received:
    2,656
    So, as per Pascal, no gaming news whatsoever. Come back for Computex I guess.
     
  4. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    14,886
    Likes Received:
    3,667
    Not unless you have $399k burning a hole in your back pocket.
     
  5. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    Ah, good old WCCFTech. As long as you keep announcing "X will be revealed tomorrow", then eventually X will be announced and you can point to your article from the previous day and yell "Hah-HAH! We were right!".
     
  6. Mister_Tad

    Mister_Tad Will work for nuts Super Moderator

    Joined:
    27 Dec 2002
    Posts:
    14,070
    Likes Received:
    2,428
    "Allegedly confirmed" is making me twitch... bit of an oxymoron?
     
  7. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    To be fair it was TweakTown who allegedly confirmed it, although I can neither confirm nor allege that. ;)
     
  8. Guest-56605

    Guest-56605 Guest

    I'm beginning to think consumer-grade GeForce products are way down Nvidia's to-do list just now... :(

    Secondly, if they're released in small quantities then the miners will just send their prices straight into the stratosphere all over again :mad: :mad: :mad:
     
  9. Vault-Tec

    Vault-Tec Green Plastic Watering Can

    Joined:
    30 Aug 2015
    Posts:
    14,886
    Likes Received:
    3,667
    Yup, you can forget it for the foreseeable. I would imagine they would want to release their next gen on GDDR6, which Samsung have only just started producing, according to something I read today.

    I would say summer now. Which, hey, I am not complaining too much about, because this is the first time *EVER* I've bought a GPU and had it still be worth the same as or more than I paid for it a year later.
     
  10. Guest-16

    Guest-16 Guest

    Yes. They own the market, AMD has already publicly said next-gen Vega isn't coming before Q4, and everyone is selling everything they can make, so why bother? Both of them are tiptoeing on GPU orders in case the mining market suddenly bottoms out and they're left with a shitton of unsold stock competing with all the second-hand, ragged GPUs flooding the market.

    In the commercial and HPC markets they make an absolute ****ton of profit, so that's where they're focusing their talent now. Like Intel, they treat the PC market as something to just harvest.
     
  11. Gareth Halfacree

    Gareth Halfacree WIIGII! Lover of bit-tech Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    17,085
    Likes Received:
    6,635
    Err... Intel made $34 billion revenue from its Client Computing Group compared to $19.1 billion from its Data Centre Group in FY2017, which if you scroll down becomes $12.919 billion CCG and $8.395 billion DCG operating income. CCG is Intel's bread and butter; it makes significantly less from DCG. It's the same for Nvidia, too: sure, the margins in data centre are higher, but the volume's in consumer. It's even worse for AMD, which barely has any skin in the data centre game anyway relative to Intel and Nvidia - consumer's the lifeblood for AMD.
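    A quick sanity check on those margins, using only the figures quoted above (the percentages below are my own rounding, not from the post):

    ```python
    # Intel FY2017 figures quoted in the post above, in billions of USD.
    ccg_revenue, ccg_operating_income = 34.0, 12.919   # Client Computing Group
    dcg_revenue, dcg_operating_income = 19.1, 8.395    # Data Centre Group

    # Operating margin = operating income / revenue.
    ccg_margin = ccg_operating_income / ccg_revenue
    dcg_margin = dcg_operating_income / dcg_revenue

    print(f"CCG: {ccg_margin:.0%} margin on ${ccg_revenue:.1f}bn revenue")  # ~38% on $34.0bn
    print(f"DCG: {dcg_margin:.0%} margin on ${dcg_revenue:.1f}bn revenue")  # ~44% on $19.1bn
    ```

    In other words, data centre does run at the higher margin, but client still brings in both more revenue and more absolute operating income, which is the point being made.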
     
  12. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    If I were Nvidia, I'd be holding off on releasing any cards until it was actually needed, and instead spend the time and money stockpiling as much GDDR6 as the supply chain can pump out. That's the critical-path limiting factor for the moment - even over fabbing the GPU dies themselves - so having more on hand means a larger launch volume can be produced even if they hold off on actually taping out any GPU dies for a few more months.
    Why not demand more GDDR5/5X instead? Because that would end up locking you into using it for future cards, or you'd end up sitting on a stockpile of RAM nobody wants. It would also mean fabs keep lines open making RAM you don't want to use in the future (delaying the shift of production lines to GDDR6), whereas if you can get them to move to GDDR6 early you can to some extent amortise the line start-up cost over a larger batch of orders, rather than having a high initial price that drops off over time.
     
  13. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,059
    Likes Received:
    970
    Even worse, next-gen Vega in Q4 will not be a consumer product, that is just when the enterprise version will start sampling to select customers in small numbers.

    Miners will probably not slobber too much over anything new from either Nvidia or AMD:

    https://www.cnbc.com/2018/03/26/ana...ew-cryptocurrency-mining-chip-from-china.html

    While there are plenty of non-Eth coins to mine on GPUs, the GPU mining refugees from Eth will drive difficulty into the sky and render them unprofitable to mine.
     
  14. Guest-16

    Guest-16 Guest

    It's a harvest industry that's stable-to-declining. PC is not growth, it's not fashionable, so don't invest in it - that's why, since Sandy Bridge, the platform has only had iterative updates. Revenue is not ASP, and investors will only ask about margin because it yields dividends. All big volume does is fill your fab utilization, which in turn helps fund the growth projects. Nvidia is doing the same with its consumer cards while it focuses on HPC and newer markets like automotive. MediaTek had its best ever year by revenue in 2016 and the markets hated it because its ASP dropped from 40+% in 2014 to the low 30s.

    AMD literally cannot make enough product; that's the biggest issue. It has second-tier status by volume at TSMC and should really move its GPU manufacturing over to GloFo, but GloFo isn't cutting it at the same pace as TSMC, and AMD can't afford to be behind (well, they already are...)

    This is why Lisa Su is playing down AMD's exposure to miners by saying they're a single-digit percentage of buyers; however, most people expect it to be much higher.
     
    Last edited by a moderator: 29 Mar 2018
  15. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    It's just the continued thick-client/thin-client cycle that's been going on since the inception of computing. We're in a 'thin client' swing right now (OMG CLOUD! PUT ALL THE THINGS IN THE CLOUD! GAMING IN THE CLOUD! RENDER IN THE CLOUD!), but I expect we're nearing the end of it, from a combination of bandwidth (and latency) starting to hit the limits of existing installed capacity, societal pushback of "we put all our data in the cloud, but who is this cloud anyway?", and a new spread of high-workload, latency-sensitive applications perfect for thick clients (VR and AR).
     
  16. Guest-16

    Guest-16 Guest

    Not this time, because 5G will specifically see to that. Plus the investments in AI are being made at the edge (IoT, smartphone, automotive and hybrid edge/cloud), not the PC. The PC will remain the creative area for professionals, business and prosumers. Unfortunately gaming is still a small % of total sales.

    Game streaming and the next gen of SaaS will begin to take off with 5G as e2e latencies come down and the cost barrier to entry gets lower, especially if this trend of high component costs doesn't break.
    I co-authored a whitepaper on 5G use cases. There's a lot of positive expectation cross-industry. In some cases where there's line-of-sight and poor broadband you might even get 5G mmWave replace your landline entirely.
     
  17. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,059
    Likes Received:
    970
    Only if mobile providers massively up data allowances.
     
  18. Guest-16

    Guest-16 Guest

    Yeah, ALL of this is dependent on data caps. Actual session bandwidth is no more or less than Netflix etc., because it's basically a compressed video stream down, and up is just ~USB 2-era controller input and a mono microphone. It's e2e latency that's the issue, so it certainly won't replace the global console roll-out quickly. 5G is long-term: 10 years. 4G started ~8 years ago.
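    As a rough illustration of that downstream/upstream asymmetry (all of the bitrates and polling figures below are my own assumed ballpark numbers, not anything from the post):

    ```python
    # Assumed, illustrative figures for a 1080p60 game stream vs. an HD video stream.
    game_stream_down = 15e6      # ~15 Mbit/s encoded game video (assumed typical setting)
    video_stream_down = 5e6      # ~5 Mbit/s HD video stream (assumed)

    # Upstream: controller state polled many times a second plus a mono voice codec.
    input_report_bytes = 64      # assumed size of one controller/keyboard report
    reports_per_second = 120     # assumed polling rate
    mono_voice = 64e3            # ~64 kbit/s assumed voice codec

    game_stream_up = input_report_bytes * 8 * reports_per_second + mono_voice

    print(f"down: {game_stream_down / 1e6:.0f} Mbit/s (vs ~{video_stream_down / 1e6:.0f} Mbit/s video)")
    print(f"up:   {game_stream_up / 1e3:.0f} kbit/s")  # ~125 kbit/s, negligible next to the downstream
    ```

    Against a monthly cap, though, that downstream adds up quickly: an assumed 15 Mbit/s works out to roughly 6.75 GB per hour of play, which is why data allowances matter more than raw 5G speed here.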
     
  19. Anfield

    Anfield Multimodder

    Joined:
    15 Jan 2010
    Posts:
    7,059
    Likes Received:
    970
    The big problem is that many mobile providers are also home broadband providers and they will not be willing to cut into their own flesh, so I fear we will see a continuation of the trend of mobile broadband being artificially crippled.
     
  20. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    5G is not even CLOSE to sufficient for upcoming applications. To put things in context, current wired dedicated networks are not sufficient for adequate streaming for VR; it's just too latency-sensitive a task. The latency issue already precludes any sort of even vaguely efficient video encoding (you're stuck with schemes that compress line-by-line rather than by frame).
    And that ignores bandwidth. For context, current HMDs could saturate a 10GbE link even if latency were acceptable. For upcoming HMDs using higher refresh rates and higher resolutions, you're looking at multiple tens of gigabits flying about.

    And if you look ~5-10 years out where AR starts to become viable, you now cut acceptable round-trip latency requirements from 20 milliseconds to a handful of microseconds. This is already a challenge for anything other than dedicated hardware sitting right next to the display, let alone expecting to stream that over a shared network.
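    A back-of-the-envelope sketch of those bandwidth figures (the panel resolutions, refresh rates and 24 bits per pixel below are illustrative assumptions, not numbers from the post):

    ```python
    def raw_gbit_per_s(width, height, refresh_hz, bits_per_pixel=24):
        """Uncompressed video bandwidth for a display, in Gbit/s."""
        return width * height * refresh_hz * bits_per_pixel / 1e9

    # Assumed current-gen HMD: 2880x1600 combined across both eyes at 90 Hz.
    print(f"current HMD:  {raw_gbit_per_s(2880, 1600, 90):.1f} Gbit/s")       # ~10 Gbit/s, a full 10GbE link

    # Assumed next-gen HMD: one 4K panel per eye at 120 Hz.
    print(f"next-gen HMD: {raw_gbit_per_s(3840 * 2, 2160, 120):.1f} Gbit/s")  # ~48 Gbit/s
    ```

    Compression would bring those numbers down, of course, but as the post notes, the per-frame encoders that achieve big ratios add exactly the latency an HMD can't tolerate.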
     
    TheMadDutchDude likes this.
