Graphics Do you want to know what I learnt the other day?

Discussion in 'Hardware' started by Parge, 8 May 2015.

  1. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,648
    Likes Received:
    388
    I'm only guessing but would it be to do with how long it takes to go from design to finished product?
     
    Last edited: 9 May 2015
  2. wolfticket

    wolfticket Downwind from the bloodhounds

    Joined:
    19 Apr 2008
    Posts:
    3,556
    Likes Received:
    646
    Maybe the CPU development roadmap is like an oil tanker: once you push hard in one direction and it proves to be the wrong one, you have to ride it out, as it takes a long time to turn around. Maybe Netburst and Bulldozer are similar attempts to limit losses.
     
  3. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    Exactly. The main thing to take away from Intel and AMD making 6-year-long mistakes is that it takes about 6 years to take a new CPU architecture implementation (rather than an update of an existing architecture) from conception to release. With the amount of effort that gets put into the design, it's not something you can instantly drop and go "well, that's not as great as we thought, let's try something else", because that leaves you with a company not actually selling anything for 6 years.

    From that link: $5bn projected revenue for gaming market, $1.5bn projected revenue from enterprise vis (i.e. GPUs that are outputting visuals), $5bn for GRID (GPUs outputting visuals but over a network) and $5bn for HPC/supercomputing. As for what they have right now:
    [Image: Nvidia revenue by market segment, with a 'gross margin' overlay]
    (Sadly the 'gross margin' graph has only one labelled datapoint and no axis scale, so it tells us naff-all in terms of ACTUAL margin for other products beyond 'a bit more' or 'a bit less' than the average)
    Overwhelmingly, Nvidia are running on revenue from consumer GPU sales. And while HPC is growing fast, consumer GPUs aren't that far behind. And while HPC margin is high now, at least some of that is due to there not being much competition. CUDA got the early-mover advantage, and not many are willing to port otherwise-working code to OpenCL and suffer a potential performance drop until the next upgrade cycle rolls around (and cycles are longer in HPC than in the consumer or enterprise world). For new setups, though, AMD have the lower-priced FirePro cards, and Knights Landing is rumbling in and looking very attractive for simulation workloads that are parallel but still heavily computational. Nvidia are well placed to take on new customers with embarrassingly parallel workloads, but they have more competition now than in the past. Margins may need to shrink to remain competitive, or to keep a high enough market share that CUDA remains attractive.
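
    For anyone unfamiliar with the term, 'embarrassingly parallel' just means every element of the work is independent, so it maps straight onto thousands of GPU threads with no communication between them. A minimal CUDA sketch of the classic SAXPY kernel (purely illustrative, not anyone's production code) shows the shape of it:

    Code:
    #include <cstdio>
    #include <cuda_runtime.h>

    // SAXPY: y[i] = a * x[i] + y[i]. Every element is independent, so the
    // problem is "embarrassingly parallel": no thread waits on any other.
    __global__ void saxpy(int n, float a, const float* x, float* y)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)  // guard the last, partially-filled block
            y[i] = a * x[i] + y[i];
    }

    int main()
    {
        const int n = 1 << 20;              // ~1M elements
        const size_t bytes = n * sizeof(float);

        float *x, *y;
        cudaMallocManaged(&x, bytes);       // unified memory keeps the demo short
        cudaMallocManaged(&y, bytes);
        for (int i = 0; i < n; ++i) { x[i] = 1.0f; y[i] = 2.0f; }

        // One thread per element, 256 threads per block.
        int threads = 256;
        int blocks = (n + threads - 1) / threads;
        saxpy<<<blocks, threads>>>(n, 2.0f, x, y);
        cudaDeviceSynchronize();

        printf("y[0] = %f (expect 4.0)\n", y[0]);
        cudaFree(x);
        cudaFree(y);
        return 0;
    }

    Porting that one kernel to OpenCL would be trivial; the pain is the mountain of validated production code and CUDA-only libraries (cuBLAS, cuFFT and friends) built up around kernels like it, which is exactly the lock-in keeping those margins high.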

    tl;dr Nvidia is making their money from consumer GPUs now, and that market is not shrinking. There is the potential for more revenue from HPC and GRID, but that potential won't translate into substantial actual revenue for Nvidia for at least the next few years.
     
    Parge likes this.
  4. Ramble

    Ramble Ginger Nut

    Joined:
    5 Dec 2005
    Posts:
    5,596
    Likes Received:
    43
    I could see it happening, but the link you provided showed that Nvidia's bread-and-butter market is gaming, and it appears to be staying that way for at least a few years. HPC is high-growth for the reasons others have given in this thread, but I wouldn't expect Intel to let Nvidia take all of that market.
     
  5. Mister_Tad

    Mister_Tad Will work for nuts Super Moderator

    Joined:
    27 Dec 2002
    Posts:
    14,085
    Likes Received:
    2,451
    So, the point you're making is much the same as mine.

    Yes - nvidia is making some money from discrete consumer GPUs now. My initial point was with regard to Intel, and why we've not seen them jump into discrete consumer GPUs; and subsequently that, going forward, gaming GPUs aren't going to be top priority at nvidia.

    HPC is by no means uncompetitive; margins can remain high because it's about value, not cost (I've seldom seen outcomes of HPC tenders, or any IT tenders for that matter, where cost was the decider). At the end of the day, HPC customers don't buy nvidia, they buy the whole stack from IBM/Fujitsu/Bull/SGI/etc, probably through an integrator, so nvidia is really just a cog (if a significant one) in a pretty big machine. You suggest nvidia have legacy on their side, which undoubtedly they do in many scenarios; however, this can just as easily work against them.
     
  6. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    My point was that there IS money to be made in discrete consumer GPUs. Intel even tried to get into that market with Larrabee, and when the architecture proved uncompetitive they dropped it and repurposed it into MIC/Xeon Phi as a non-graphics-focussed coprocessor. Intel had tried once before with the i740, which DID get to market (and flopped). The problem isn't that there's no money to be made; the problem is that GPU design is hard, and Intel had (and still has) far less experience than Nvidia and ATI/AMD, meaning their designs weren't competitive.
     
  7. Mister_Tad

    Mister_Tad Will work for nuts Super Moderator

    Joined:
    27 Dec 2002
    Posts:
    14,085
    Likes Received:
    2,451
    I think we're still trying to make the same point...

    The two problems are one and the same - because it's hard, because Intel has less experience and because nvidia and AMD have the market effectively cornered, there's no money to be made. I don't doubt that with the might of Intel they could make a truly exceptional consumer GPU, perhaps after a few generations, but there's nothing in it for them to do so.

    Selling a consumer GPU relies on winning headlines, hearts and minds - if you're not comparable with the best-in-class, then pack your bags.

    Selling effectively the same bit of kit for parallel compute workloads relies more on partnering and bundling - you need to be the right cog in the right machine to win. It doesn't matter if you're not best-in-class with one widget, because as it happens you've got plenty of other widgets that make up for it (IEL, x86, Flash, Professional Services, etc.).
     
  8. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,909
    Likes Received:
    591
    I think the problem may be the use of the phrase 'money to be made'. There clearly is money to be made in discrete GPUs, because Nvidia and AMD are making it. Whether Intel can make money from a discrete GPU is more a problem of Intel designing a good one than a problem with the market for discrete GPUs.
     
  9. Mister_Tad

    Mister_Tad Will work for nuts Super Moderator

    Joined:
    27 Dec 2002
    Posts:
    14,085
    Likes Received:
    2,451
    Perhaps my comments were on the casual side, but I wasn't expecting the Spanish Inquisition.


    There's no money to be made = The smart money for Intel is elsewhere
    We've not seen them jump into discrete consumer GPUs = They've never fully committed and backed them as a strategic direction (see previous point)
    Gaming GPUs are no longer a priority for nvidia = they're going to see greater revenue growth from other markets

    So... I think we just agree to... agree? :lol:
     
    Last edited: 11 May 2015
  10. rollo

    rollo Modder

    Joined:
    16 May 2008
    Posts:
    7,887
    Likes Received:
    131
    Well, look at the smartphone market: no one doubts there's money to be made, but unless your name is Apple or Samsung you're not making huge cash; most are actually losing money.

    Could Intel make a top-class GPU? I think they could, and would if requested to by OEMs. Apple forced the HD 4000 and HD 5000 improvements when they threatened to take their business elsewhere.

    If AMD's Zen CPU succeeds and has a decent onboard GPU, you could see Apple once again pushing an AMD product. These are the sorts of places AMD needs to get into. That would then once again force Intel to push better graphics.

    If they ever want to get into mobile, their CPU/GPU combo has to be competitive.

    The people buying GPUs on forums are a tiny % of those sold; most are sold to system manufacturers for complete systems.

    There's money in discrete GPUs, just not enough to support 3-4 competitors, and barely enough for Nvidia/AMD. After R&D, I'm not sure either Nvidia or AMD has seen big profits.

    ATI cost AMD around $4bn; they have not made $4bn back since then, and some would say the root of their problems relates to that purchase.
     
