
News China's Hygon chips outed as Epyc in disguise

Discussion in 'Article Discussion' started by bit-tech, 9 Jul 2018.

  1. bit-tech

    bit-tech Supreme Overlord Staff Administrator

    Joined:
    12 Mar 2001
    Posts:
    1,189
    Likes Received:
    19
     
  2. Anfield

    Anfield Well-Known Member

    Joined:
    15 Jan 2010
    Posts:
    3,639
    Likes Received:
    169
    So if I got this right, they effectively licensed it to a majority AMD-owned company (so they are selling the license to themselves), who then in turn have an agreement with the actual manufacturer, like AMD does with GloFo...

    Sneaky, but if it staves off bankruptcy for another couple of years, it's probably the only thing they could have done.
     
  3. edzieba

    edzieba Virtual Realist

    Joined:
    14 Jan 2009
    Posts:
    2,415
    Likes Received:
    118
    There is no way in which this could possibly go wrong.
     
    theshadow2001 likes this.
  4. Corky42

    Corky42 What did walle eat for breakfast?

    Joined:
    30 Oct 2012
    Posts:
    8,361
    Likes Received:
    184
    Didn't they turn that corner a few quarters ago? I thought they're now in the black.
     
  5. Anfield

    Anfield Well-Known Member

    Joined:
    15 Jan 2010
    Posts:
    3,639
    Likes Received:
    169
    Temporarily, and by a paper-thin margin, but AMD isn't making anywhere near enough to stay afloat:

    Ryzen has next to no market share in laptops and office PCs.
    The last hurrah of the Zen arch is launching in 2020, meaning they need a new CPU arch for 2021, which requires them to spend until they are penniless again.
    GPU mining is all but dead and won't help to keep them afloat.
    They need to bin their GPU arch and come up with a new one from scratch if they ever want to catch up to Nvidia again, and Intel is set to enter the dedicated GPU market as well. Not to mention all the G-Sync monitors currently selling due to Nvidia being the only game in town; those will lock plenty of people out of switching back to AMD GPUs.
    On top of that, AMD has massive debts they need to pay interest on, and they have basically no non-liquid assets they can sell off any more.
     
  6. fix-the-spade

    fix-the-spade Well-Known Member

    Joined:
    4 Jul 2011
    Posts:
    2,913
    Likes Received:
    128
    When you add up their withheld/deferred dividends, bank loans, and other debts, AMD are close to $2 billion in the hole.

    For the last couple of years they have successfully negotiated extensions or 'reduced maturities' on that debt, but sooner or later a creditor is going to demand what they are owed up front, at which point AMD will go bankrupt. Their renegotiations haven't solved anything either: in 2016 they managed to get $600 million of due debt reduced to $149 million, but the trade-off is that in 2026 they will owe $800 million on those same debts. They still owe the outstanding money and they're still paying interest.
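    A rough back-of-the-envelope sketch of what that restructuring costs, using only the figures quoted above ($600 million due cut to $149 million payable, $800 million owed in 2026). Treating the deferred portion as a single loan repaid ten years later is a big simplification; the real terms will differ.

```python
# Figures from the post above: ~$600M due was cut to ~$149M paid in 2016,
# with ~$800M owed on the same debt in 2026. Model the deferred amount as
# a single loan repaid as a lump sum ten years later.
deferred = 600 - 149          # $451M pushed out to 2026
repay_2026 = 800              # $M owed in 2026
years = 2026 - 2016

# Implied annualised cost of the deferral under this toy model.
implied_rate = (repay_2026 / deferred) ** (1 / years) - 1
print(f"Implied annual cost of the deferral: {implied_rate:.1%}")
```

    Under those assumptions the deferral works out to roughly a 6% annual cost, which is why "still paying interest" matters: the restructuring buys time, not relief.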

    AMD are really quite boned; my guess is they will be looking to sell the company by 2020-2022.
     
    Last edited: 9 Jul 2018
  7. Corky42

    Corky42 What did walle eat for breakfast?

    Joined:
    30 Oct 2012
    Posts:
    8,361
    Likes Received:
    184
    Wow, you two are being a little pessimistic. Having said that, I've not looked too deeply into their finances, so I couldn't disagree even if I wanted to.

    Although on the technical front I believe you're being overly pessimistic, Anfield. For starters, Intel's been using the same 'core' arch for the last 12 years, so I can't see why AMD would want to replace the Zen arch when it's a damn good design.

    Yes, their GPUs kind of suck for gaming, but that's only a tiny portion of the market, and figures from a few months ago show Nvidia's market share declining and AMD's growing, so they must be doing something right.

    And lastly, concerning FreeSync vs G-Sync, I'd say it's Nvidia who's flogging a dead horse, as FreeSync is seeing much more widespread adoption than G-Sync.
     
  8. Chicken76

    Chicken76 Member

    Joined:
    10 Nov 2009
    Posts:
    862
    Likes Received:
    21
    Is there a missing 'be'?
     
  9. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    10,474
    Likes Received:
    762
    Yup - that'll be where the extra "be" from t'other day came from! (Oh, and that's the lede, not the subhead - Bit doesn't use subheads on story pages.)
     
  10. DbD

    DbD Member

    Joined:
    13 Dec 2007
    Posts:
    404
    Likes Received:
    4
    I agree with them being too pessimistic - AMD have looked like going bust for as long as anyone can remember, and they are in a better place now than they have been for quite a while.

    That said, their discrete gaming GPUs are looking pretty doomed. Look at the last Steam survey - the numbers of modern AMD GPUs (480, 580, Vega) are tiny. In the real world it looks as if people buy Nvidia to game with, and AMD sold pretty well everything to those big Chinese mining operations. This will only get worse with the 11xx series launch and no AMD competition: not just the extra speed, but they'll bring new features that will lock out AMD further (tensor-core-powered GameWorks stuff is almost certain to appear).

    To counter that, AMD's CPU market is strong - Intel, for all their size, seem incapable of pulling off anything these days.

    As for shipping your CPU tech off to China - that's always been a pretty universal way to get ripped off. International law and copyrights get lip service at best.
     
  11. Anfield

    Anfield Well-Known Member

    Joined:
    15 Jan 2010
    Posts:
    3,639
    Likes Received:
    169
    AMD has backed itself into a corner by resurrecting the Core Wars, because on the trajectory they are on they will hit just about every scaling limit very quickly. And while there is nothing wrong with going to the limit, there is no purpose in going past it.
    On top of that, AMD has already said they will keep the AM4 platform alive until 2020; combine that with the inevitability of DDR5 and PCIe 4 (or even 5), both requiring support from the CPU as well, and it is rather obvious that AMD is planning to introduce some very big changes to their CPUs for 2021.
    Which would be fine, except that the ever higher R&D costs mean AMD doesn't really get any financial respite (which it desperately needs).

    AMD relies on the enthusiast market a lot: go to various online shops that sell PC components and sort CPUs by best selling and you'll see Ryzen doing just fine there; then look at how thin the presence of Ryzen is in PCs for normies (Dell, HP, Acer etc) - it's almost all Intel there. Plus, as I already mentioned, there's the very limited presence in laptops (which are a very large part of the normie market these days).
    The gaming market and the enthusiast market are very closely linked, so AMD needs a good reputation in the gaming market to sell consumer CPUs.
    The problem is that the bad AMD GPUs may well damage AMD's reputation, leading to crossover damage in CPU sales.

    While not entirely accurate (since it excludes all the Chinese LoL players with their RX560s), it is nonetheless telling that AMD has zero entries in the top 10 most popular GPUs:
    https://store.steampowered.com/hwsurvey/videocard/

    On top of that, the hoped-for benefit of AMD being in both consoles hasn't come to pass. Remember when people thought games would perform better on AMD GPUs than Nvidia GPUs, since AMD is present in three platforms versus Nvidia's one? Yet AMD is still losing hard on the performance front, and plenty of games still make use of Nvidia-exclusive tech.

    Worse, not only are they behind the competition in GPU performance; catching up to the current competition won't be enough, since new Nvidia cards are inevitable and Intel entering the market won't make things any easier.

    While I would prefer it if Nvidia dropped the silly games with the stupid proprietary G-Sync and adopted FreeSync instead, we all know that people are only paying the premium for G-Sync monitors because AMD GPUs are so uncompetitive. But once you have a G-Sync monitor you are effectively committed to buying future Nvidia GPUs over future AMD GPUs, so the potential market for a hypothetical competitive AMD GPU is reduced by the existence of G-Sync.


    The problem I see with the theory that they have survived against the odds in the past is that back then they had things they could sell off and staff they could cut, but what do they have left to shed if things ever get tight again?

    I want to be wrong, but that nagging doubt just won't go away that easily...
     
  12. Corky42

    Corky42 What did walle eat for breakfast?

    Joined:
    30 Oct 2012
    Posts:
    8,361
    Likes Received:
    184
    If anything they're more likely to win a core war; it's probably why they're pushing the 'you get more cores with AMD' line, as they know, as Intel does, that Intel's design is not best suited to high core counts. Without going into too many details, Intel's 'solution' to the many-core problem was to extend its current ring topology into loads of interconnected half-rings, so as core count increases so does latency, whereas AMD uses a mesh topology so latencies are more predictable; basically, AMD's design scales better than Intel's.
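    The general ring-vs-mesh scaling argument can be made concrete with a toy hop-count model. This is an illustrative sketch only - real interconnects (and either vendor's actual fabric) are far more complicated than this - but it shows why worst-case latency grows much faster on a ring than on a mesh as cores are added.

```python
import math

# Toy model: worst-case hop count between two cores on a bidirectional
# ring vs. a square 2D mesh. Only the growth rate matters here, not
# absolute latency.
def ring_worst_hops(n_cores: int) -> int:
    return n_cores // 2                # half-way round the ring

def mesh_worst_hops(n_cores: int) -> int:
    side = math.isqrt(n_cores)         # assume a square mesh of cores
    return 2 * (side - 1)              # corner to opposite corner

for n in (16, 64, 256):
    print(f"{n:3d} cores: ring {ring_worst_hops(n):3d} hops, "
          f"mesh {mesh_worst_hops(n):3d} hops")
```

    The ring's worst case grows linearly with core count, while the mesh's grows roughly with its square root - which is the scaling point being made above.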

    As for introducing a new socket, DDR5, and PCIe 4: they're not what you'd call big or expensive changes. If they were, Intel wouldn't have changed socket every two generations, and the change from one DDR or PCIe generation to the next would've been called out in the past as driving ever higher R&D costs.

    I mean, we're not talking about supporting an entirely new type of memory or a different type of bus; at most, all they'd need to do is swap out one type of memory controller and/or I/O controller for another.

    As I said, the 'enthusiast market' is a tiny percentage of the TAM, and yes, currently that's where AMD are seeing their biggest gains, but that's only because data centres, HPC, OEMs, and 'big' companies don't make million-dollar decisions on the same time scale as 'enthusiasts'. As can be seen in what this article is talking about, it took three years after Xeons were banned before we, the public, saw any reaction to that change in foreign policy, because these things take time.

    Sure, we're not seeing as big a number in sheer revenue terms in the enterprise space as in the 'enthusiast' segments, but when looking at quarter-on-quarter gains, both segments have seen the same 23% gain.

    I can't really speak much to the competitiveness of AMD vs Nvidia as, to be frank, I've not paid much attention to GPUs for the last 1-2 years; mining put them so far out of my reach that they may as well not exist. From what I've read of others talking about them, though, it's not so much that AMD cards are not worth buying, it's just that they're not worth buying at that price.

    Is what I've picked up correct - that AMD GPU prices have remained stubbornly high despite the decline in demand from miners, something about AMD being better at mining? IDK.
     
    Last edited: 10 Jul 2018
  13. Anfield

    Anfield Well-Known Member

    Joined:
    15 Jan 2010
    Posts:
    3,639
    Likes Received:
    169
    Yes, AMD is winning the core wars; the problem is what happens when the core wars stop working as a strategy, and given how quickly AMD is scaling the core count, the end of the core war is bound to happen soon.
    We may well get 12 or even 16 core Ryzen in the mainstream market, and in the HEDT space 64 cores in 2020 can't be ruled out either, but where do you go from there?
    Either they'll have to go back to improving IPC or come up with something all new, and that is on top of the new IMC and I/O controller.

    The problem with graphics card prices is we don't really have a breakdown available of who inflated prices by how much, i.e. how much of the increase is down to AMD / Nvidia, how much is down to Asus, Gigabutt etc, how much is down to distributors, and how much is down to shops. That makes it very difficult, and the issues with HBM supply constraints impacting Vega don't help either.
    Yes, in general when it comes to mining Eth AMD has the upper hand; the problem is that profits have collapsed for reasons that have nothing to do with the graphics cards themselves (on the flip side, of course, that means AMD and Nvidia can't "fix" the mining profitability issue either).
     
  14. Corky42

    Corky42 What did walle eat for breakfast?

    Joined:
    30 Oct 2012
    Posts:
    8,361
    Likes Received:
    184
    The only time a 'war' ends is when one side is defeated, and as AMD didn't go bankrupt during their dark years in the wilderness and there's next to no chance of Intel going out of business, there's not going to be an end to the core 'wars'. All that will happen is the same thing we've seen since the inception of the silicon transistor: they'll make them smaller and faster.

    Unfortunately Intel's found out there are some rather big hurdles along that path, and despite holding the highest clock speed crown for a long time, others are starting to appear in their rear-view mirror.
     
  15. MLyons

    MLyons Half dev, Half doge. Staff Administrator Super Moderator Moderator

    Joined:
    3 Mar 2017
    Posts:
    1,936
    Likes Received:
    511
    The increases were done in reverse of that order.
     
  16. Anfield

    Anfield Well-Known Member

    Joined:
    15 Jan 2010
    Posts:
    3,639
    Likes Received:
    169
    Except that moar cores doesn't work infinitely, even if you are still winning.
    Simply put: as more and more software stops scaling, AMD will have to stop adding cores long before Intel catches up in core count.
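    The "software stops scaling" point is essentially Amdahl's law: if any fixed fraction of a workload is serial, extra cores give rapidly diminishing returns. A minimal sketch, assuming a 5% serial fraction purely for illustration:

```python
# Amdahl's law: speedup on n cores when serial_fraction of the work
# cannot be parallelised. With 5% serial work, the speedup can never
# exceed 1/0.05 = 20x no matter how many cores you add.
def amdahl_speedup(cores: int, serial_fraction: float) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for cores in (8, 16, 32, 64):
    print(f"{cores:3d} cores -> {amdahl_speedup(cores, 0.05):5.1f}x speedup")
```

    Doubling from 32 to 64 cores buys only a little extra speedup in this example, which is why "moar cores" eventually stops being a selling point for most software.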
     
  17. fix-the-spade

    fix-the-spade Well-Known Member

    Joined:
    4 Jul 2011
    Posts:
    2,913
    Likes Received:
    128
    Well yes, I am an Englishman after all.

    More seriously, when a company's solution to a debt crisis is even larger debts, it does not fill me with enthusiasm. I doubt AMD will cease to exist, but I do expect it to change ownership sooner or later.
     
  18. Corky42

    Corky42 What did walle eat for breakfast?

    Joined:
    30 Oct 2012
    Posts:
    8,361
    Likes Received:
    184
    Tell that to the HPC market, who buy millions of high core count CPUs so they can have a computer with 4,981,760 cores. Like I said, the 'enthusiast' market is tiny in comparison, and it's not just China who's willing to buy millions of high core count CPUs.

    As much as we may not like it, the 'enthusiast' market isn't much more than a byproduct, as AMD's true prize is to get their hands on some of that data centre, HPC, and supercomputer market, where companies and governments are willing to spend many millions of dollars on hardware.

    Time for a cup of tea, methinks. :)
     
  19. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    10,474
    Likes Received:
    762
    I always seem to have to debunk this one: mainstream is by far the most important market to a semiconductor company. Let's look at Intel, because it has the lion's share of the market and is thus the most representative company: here's its last annual report so you can follow along at home.

    The company's Data Centre Group (DCG, responsible for everything from microservers to supercomputers) booked $19.1 billion in revenue for the year, up a nice 11 percent year-on-year. Not bad, right? Well, its Client Computing Group (CCG, responsible for desktops, laptops, tablets, and the rest of the mainstream market, and including its enthusiast HEDT products) booked $34 billion despite only 3 percent growth year-on-year. Even during the fourth quarter of 2017, when Intel's data-centric business units (DCG, IoTG, NSG, and PSG all rolled into one) hit an all-time revenue high, they accounted for less than half (47 percent) of Intel's revenue. Mainstream is, and always has been, the bulk of Intel's business.
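    Just to put the two annual figures quoted above side by side (this compares only CCG against DCG and ignores Intel's other business units):

```python
# Figures quoted from Intel's annual report: CCG (mainstream client)
# booked $34B for the year vs. DCG (data centre) at $19.1B.
ccg = 34.0    # $B, Client Computing Group
dcg = 19.1    # $B, Data Centre Group

ccg_share = ccg / (ccg + dcg)
print(f"CCG is {ccg_share:.0%} of combined CCG+DCG revenue")
```

    Nearly two-thirds of that combined revenue comes from the mainstream client side, which is the "mainstream is the bulk of the business" point in numbers.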

    Yes, a supercomputer customer will buy a hojillion chips from you. Trouble is, there aren't that many supercomputer customers - and you'll make more money selling individual chips to the eighteen hojillion end-users in the market than you will from the supercomputer market, every single time. The supercomputer customer also won't come back next year for more, which ain't great for your recurring revenue.
     
  20. Anfield

    Anfield Well-Known Member

    Joined:
    15 Jan 2010
    Posts:
    3,639
    Likes Received:
    169
    To make matters better / worse (depending on whether you are a customer or a manufacturer), there are also more competitors around than in the desktop space: IBM Power9, Talos II, etc...
     
    Gareth Halfacree likes this.