
Graphics Do you want to know what I learnt the other day?

Discussion in 'Hardware' started by Parge, 8 May 2015.

  1. Parge

    Parge the worst Super Moderator

    Joined:
    16 Jul 2010
    Posts:
    12,868
    Likes Received:
    552
    I went for a couple of drinks with Scott Wasson, Editor of The Tech Report


    We spoke about CPUs, VR, G-Sync, ARM, Intel, AMD and just about everything else... including GPUs.

    He was explaining how important it is for the whole GPU market that AMD put out a good card this year. I mentioned that I'd heard their market share was around 30%. He replied that that was absolutely true, but that the last time he was at Nvidia, they'd shared numbers showing that at the high end (think 'gaming' GPUs), AMD's market share wasn't anywhere near 30%. In fact, it was in single digits.

    Whether you are green team, red team, or don't care at all, you should find that scary.

    We cannot, under any circumstances, end up in a situation where there is no competition to drive costs down and performance up.
     
  2. bulldogjeff

    bulldogjeff The modding head is firmly back on.

    Joined:
    2 Mar 2010
    Posts:
    8,403
    Likes Received:
    634
    AMD versus Nvidia or Intel sounds like they're on a hiding to nothing, but you have to give them 10/10 for effort because they keep trying.

    Microsoft have got the financial muscle to get involved with the CPU/GPU market. I have often wondered why they haven't had a go.
     
  3. theshadow2001

    theshadow2001 [DELETE] means [DELETE]

    Joined:
    3 May 2012
    Posts:
    5,163
    Likes Received:
    141
    That's awesome - how did you end up meeting Wasson?

    Yes, weak competition sucks for everyone except the market leader, and AMD are losing a battle on two fronts. As much as I've turned into an Nvidia fanboy over the years, if AMD drops out of the market, custom PCs will stagnate quite quickly. They seem to be talking the talk on the CPU side in terms of getting back into the high-end market; whether they do or not is a different story.
     
  4. Otis1337

    Otis1337 aka - Ripp3r

    Joined:
    28 Nov 2007
    Posts:
    4,480
    Likes Received:
    117
    Anyone remember Intel Larrabee? That would have been fantastic for keeping Nvidia on its toes.
    Shame it got dropped.

    It would have looked something like this, maybe:
     
  5. Yadda

    Yadda Well-Known Member

    Joined:
    25 Jul 2003
    Posts:
    3,217
    Likes Received:
    49
    Last edited: 9 May 2015
  6. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,232
    Likes Received:
    300
    Because designing operating systems is very different from designing graphics processing hardware. Even Intel, who are by anyone's measure the world leaders in CPU design AND fabrication, are still pretty dire at designing GPUs.
     
  7. Madness_3d

    Madness_3d Bit-Tech/Asus OC Winner

    Joined:
    26 Apr 2009
    Posts:
    1,038
    Likes Received:
    34
    And Rys Sommefeldt too :)
     
  8. Corky42

    Corky42 Where's walle?

    Joined:
    30 Oct 2012
    Posts:
    9,302
    Likes Received:
    321
    QFT.
    Based on nothing more than my own perceptions, my sense of the current state of AMD is that they're very close to coming out of their dark years, and that if they make it they'll look back on these years with a sense of relief.

    The reason I think that is that for years (5+) they have been pushing an inherently flawed architecture in their CPUs, something reinforced in a news snippet from Mr Halfacree:

    "Mark Papermaster admitted that the Bulldozer design - an architecture similar in concept to Intel's failed NetBurst and using a shared-FPU modular design many found troubling in highly-parallel workloads - had some issues regarding performance."

    From my understanding, AMD have been struggling along with the equivalent of the P4's architecture, and we all know how awful those were; it wasn't until the Core series of CPUs that Intel fixed those problems. I'm hopeful that AMD's Zen will see a similar jump, if not a bigger one, to when Intel moved from the P4 to the Core series.

    I also believe that the 390X is going to be a taste of their next-gen GPU but on current-gen fabrication tech, something I believe they have been forced into by market forces and shareholders. Even though it may be in short supply, I believe it's not really intended to take market share but is more of a placeholder for what's to come in 1-2 years when they drop to 14nm and HBM2.

    Sorry if all that comes over as wild ramblings or if you don't agree with any of it, but hey, they're my thoughts on the state of AMD, current and future - if they make it to the future, that is. :worried:
     
  9. Mister_Tad

    Mister_Tad Super Moderator Super Moderator

    Joined:
    27 Dec 2002
    Posts:
    12,187
    Likes Received:
    691
    There's just no money in discrete consumer GPUs unfortunately - Xeon Phi on the other hand, lots of money in that.
     
  10. Instagib

    Instagib Well-Known Member

    Joined:
    12 Mar 2010
    Posts:
    1,415
    Likes Received:
    57
    I really hope AMD turn it around. I cut my overclocking teeth on AMD's CPUs and still fancy them over Intel's for the satisfaction when you get a clock just right.

    Similarly, some of my first serious GPUs were ATI, and it would be a shame to see them go "down with the ship", as it were.

    I think that is ultimately the problem for AMD: they are fighting on two losing fronts. Maybe their acquisition of ATI was the wrong move back then.

    Hopefully their capture of the console market will give them some respite and allow them to consolidate before more viable offerings can be made elsewhere.
     
  11. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    3,232
    Likes Received:
    300
    There is for some.

    It's going to be interesting to see how new GPU architectures (think release in 2017, with the roughly two-year timescale from design start to actual fabbing) change with the advent of VR. There's going to be a massive push for geometry throughput, because poly count is king when you can examine an object at very short distances from multiple angles with ease (see Valve's GDC talk on VR). Pixel throughput will be valuable if you want to start pushing twinned 7680x4320 (e.g.) panels at north of 100Hz. The entire architecture of the processing pipeline may need to change to optimise for latency rather than throughput, losing some die and power efficiency by 'overbuilding' stages to get operations done in one pass rather than two (and damn the idle time). There will probably be a lot of new API calls added to DX12 and Vulkan for triggering operations based on GPU stages rather than input throughput (e.g. the current dancing around and fancy timing tricks needed to trigger rendering Just In Time for the next VSYNC - again, check out Valve's GDC talk).
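    The "Just In Time for the next VSYNC" trick mentioned above can be sketched in a few lines. This is only a minimal illustration of the scheduling arithmetic, not Valve's or any driver's actual API; every name below is made up for the example:

```python
def jit_render_delay(now, next_vsync, predicted_render_time, safety_margin=0.001):
    """How long (in seconds) to wait before kicking off the render so the
    frame completes just ahead of the next vsync. The later the render
    starts, the fresher the head-tracking pose baked into the frame. If
    the start deadline has already passed, return 0 (start immediately)."""
    start_deadline = next_vsync - predicted_render_time - safety_margin
    return max(0.0, start_deadline - now)

# On a 90Hz display (vsync every ~11.1ms), a frame predicted to take 6ms
# can afford to sit on the latest tracking input for roughly 4ms:
delay = jit_render_delay(now=0.0, next_vsync=0.0111,
                         predicted_render_time=0.006)
```

    The hard part in practice is the prediction: underestimate the render time and the frame misses vsync entirely, which is why real implementations keep a safety margin and a running estimate of recent frame times.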
     
  12. Mister_Tad

    Mister_Tad Super Moderator Super Moderator

    Joined:
    27 Dec 2002
    Posts:
    12,187
    Likes Received:
    691
    Or is there? Just because there's revenue in it, doesn't mean there's money in it, and Y/Y net income shows a slight decline.

    There's Quadro and Tesla in there as well, both of which have much higher margins, then OEM/Mobile parts as well - would be very interesting to get a glimpse of the breakdown.

    I'd hazard a guess that what this forum would consider a mainstream GPU makes up a remarkably small portion of Nvidia's profits.
     
  13. theshadow2001

    theshadow2001 [DELETE] means [DELETE]

    Joined:
    3 May 2012
    Posts:
    5,163
    Likes Received:
    141
  14. Ramble

    Ramble Ginger Nut

    Joined:
    5 Dec 2005
    Posts:
    5,585
    Likes Received:
    40
    I'm not sure I'd agree. Take a look at the architecture of Tesla - it's clearly based on an architecture designed to play games, which suggests that when Nvidia design their new cores they're aiming for gamers rather than the computing community.
    Although Tesla has high margins, the market size will always be limited and smaller than the gaming market - not to mention that it's cannibalised by their own gaming GPUs, as evidenced by the intentional crippling of double-precision performance in the latest drivers. Even I've used their consumer GPUs for computation - they're pretty good.
     
  15. Yadda

    Yadda Well-Known Member

    Joined:
    25 Jul 2003
    Posts:
    3,217
    Likes Received:
    49
    I see, cheers. $1.7m?
     
  16. Pete J

    Pete J RIP Teelzebub

    Joined:
    28 Sep 2009
    Posts:
    5,310
    Likes Received:
    315
    I agree with you, Parge: there should never be a monopoly. However, since I first started building PCs for myself (8800 GTX days onwards), I've always been an Nvidia fan and am now too used to Nvidia's 'style' to change over. I may as well list why I choose Nvidia:

    • PhysX support: Okay, so Nvidia have now re-enabled PhysX with an AMD GPU present but why should I have to install an additional card for this when I can just have one? It also prevents any potential driver conflicts (and yes, PhysX is important to me).
    • SLI versus Crossfire support: I'm probably going to trigger a lot of rebuttals for this, but it seems to me that Crossfire needs a lot more jiggery-pokery to get working than SLI does. This is not based on personal experience though - merely what I garner on the intertubes.
    • AMD fanboys: Again, another inflammatory comment but AMD supporters really seem to be fanatical about their GPUs (anyone remember AMDAndy :D ). This came to a head for me with the release of the 290X - in the past, Nvidia had been slammed for releasing the hot running GTX 480, but AMD do it and suddenly 'oh, it'll be fine with a non-reference cooler or watercooling' :rolleyes: .

    Ultimately it comes down to this for me: I know Nvidia products work.

    *Prepares for GTX 970 comments*
     
  17. Otis1337

    Otis1337 aka - Ripp3r

    Joined:
    28 Nov 2007
    Posts:
    4,480
    Likes Received:
    117
    The GTX 480 was a piss take; AMD cards have never been as bad as that. Nvidia should never have released it in that state.
    AMD have always been about undercutting Nvidia to deliver similar performance at the cost of higher power consumption and heat.

    If you're not that bothered about those two factors then AMD is fine.

    My track record has been 4 nvidia cards and 3 AMD cards.
    Nvidia: 7950GT, 9800GTX, 9800GX2, 660Ti
    AMD: 2x 5850 CF, R9 280X

    Generally I would prefer to have Nvidia overall, but it just depends what's best at the time I have the extra cash to spend on a new GPU.

    I didn't have any problems with CF myself, but I would agree with you that it seems to be a little more problematic than SLI on the whole.
    Saying that, my 9800GX2 with internal SLI had some scaling issues that were unique to the GX2, which took Nvidia too long to fix, so I returned the card.
     
    Last edited: 9 May 2015
  18. Parge

    Parge the worst Super Moderator

    Joined:
    16 Jul 2010
    Posts:
    12,868
    Likes Received:
    552
    In rebuttal I would say:

    *PhysX games are few and far between, and from what I can see there isn't any support in major releases going forward at all. Hence Nvidia making the software side entirely open source (the GPU side is still locked).

    *I've used SLI (480s) which worked very well. I also had CFX (6950s). Both worked perfectly in terms of setting up, but also suffered from extreme microstutter. NVidia fixed this soon after the 480s, but it took AMD a year or two to catch them up on this. Nowadays there’s not a huge amount of appreciable difference.

    *There are just as many 'fanboys' on both sides, and being a 'fanboy' for a GPU manufacturer is utterly ridiculous. What you are doing is effectively saying "I'm no longer going to choose my GPU on price/performance/features/stability", which should be the ONLY things you base your purchase on. (To be clear, I'm not saying you are a fanboy, by the way - you've given reasons why you buy Nvidia cards, all of which are thoughtful and fair enough, even if I disagree with some of them.)

    *Regarding the 970 'issue' - a load of fuss about nothing. The card they paid for is the card they received.
     
    Last edited: 9 May 2015
  19. Mister_Tad

    Mister_Tad Super Moderator Super Moderator

    Joined:
    27 Dec 2002
    Posts:
    12,187
    Likes Received:
    691
    The Tesla market is significantly larger than "gamers" - hyperscalers, HPC, design and visualisation - all massive scale and huge growth areas.

    The Tesla architecture is based on "gaming" GPU architecture because that's where nvidia grew up and where they've sunk their costs, and it just so happens that they can cater for requirements across the board with what's essentially the same product.

    A quick google turns up some interesting facts and figures here, seems I'm not far off - http://www.theplatform.net/2015/05/08/tesla-gpu-accelerator-grows-fast-for-nvidia/

    Low margin and moderate growth in the gaming sector, high margin and high growth from HPC, hyperscalers, design/visualisation - simply put, gaming GPUs are absolutely not where the priority lies with Nvidia's board any more.
     
  20. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    12,573
    Likes Received:
    1,902
    I'm going a bit off-topic here ("so what else is new," say the peanut gallery), but I find it very interesting that Intel launched NetBurst and spent roughly six years trying to make it a thing before giving up; then AMD, which watched Intel do exactly that and benefited from NetBurst's poor performance compared to its own chips, did exactly the same thing with Bulldozer. Now, roughly six years later - the same length of time Intel was pushing NetBurst - AMD's shifting back to a more traditional design with Zen. I mean, couldn't AMD have just learned from Intel's mistake? Why did it do exactly the same thing, after watching Intel fail miserably?

    Strange stuff, that.
     
