
News AMD to lay off 10 per cent of its workforce

Discussion in 'Article Discussion' started by Claave, 4 Nov 2011.

  1. mediapcAddict

    mediapcAddict New Member

    Joined:
    8 Sep 2010
    Posts:
    97
    Likes Received:
    2
Sorry, but AMD processors are just dying.

Here's my reckoning on why Llano is failing.

Firstly, Llano is failing because its graphics performance is very dependent on fast memory. AMD should have made an optional separate lane for faster DDR3 graphics memory; that would have kept the cost of a build down.
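For context on how bandwidth-starved an IGP sharing system DDR3 is, here's a rough back-of-envelope comparison (the DDR3 and HD 6850 GDDR5 figures are typical published specs, used here as assumptions):

```python
# Rough theoretical memory bandwidth: transfers/s x bus width x channels.
def bandwidth_gb_s(mtps, bus_width_bits, channels=1):
    """mtps: megatransfers per second (effective data rate)."""
    return mtps * 1e6 * (bus_width_bits / 8) * channels / 1e9

# Llano's IGP shares dual-channel DDR3 (64-bit per channel) with the CPU.
ddr3_1333 = bandwidth_gb_s(1333, 64, channels=2)   # ~21.3 GB/s, shared

# A discrete HD 6850 has a 256-bit GDDR5 bus at 4 GT/s effective.
hd6850 = bandwidth_gb_s(4000, 256)                  # 128 GB/s, dedicated

print(f"DDR3-1333 dual channel: {ddr3_1333:.1f} GB/s")
print(f"HD 6850 GDDR5:          {hd6850:.1f} GB/s")
```

With roughly six times the bandwidth, none of it shared with the CPU, it's easy to see why faster system memory moves Llano's graphics numbers so much.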

Secondly, they should have released it with separately configurable overclocking for the GPU and CPU. You have to spend £160+ with Intel to get overclocking. Intel is gifting the whole budget overclocking market to AMD, and AMD does nothing.

Thirdly, they should have had a faster graphics processor than the 6550D. The 6550D isn't good enough for a dedicated gamer or even a media PC (IMO), and who outside of gamers and a few others even knows which graphics cards matter?

Admit it: who saw Llano and thought, "Yeah, 6550D, that's my next build"? Had it been a 6850, they would have had a chip to challenge Intel - and a £40 cooler to include in the retail packaging - but nonetheless they would have been back in the game.

It sucks that people are getting laid off, but I can't say I'm shocked. Just saddened.
     
  2. The Infamous Mr D

    The Infamous Mr D Member

    Joined:
    11 Feb 2008
    Posts:
    147
    Likes Received:
    0
Given that desktop and laptop sales are considerably down, it's no surprise that 'traditional' processors aren't as much in demand. People are upgrading less, buying less new equipment and making do with their current kit for much longer. Aside from gamers, tablets and smartphones are soaking up most of the casual technology sales and usage. I've been running an overclocked Q6600 with an HD 5870 for over 18 months; it's doing me just fine, and I'm not planning an upgrade until it's really necessary.
     
  3. Snips

    Snips I can do dat, giz a job

    Joined:
    14 Sep 2010
    Posts:
    1,940
    Likes Received:
    66
You can't blame Intel. As some have put it, "Intel's 'dirty tricks' with OEMs are leaving a very sour taste now - for everyone..." - but that's old news: Intel was forced to pay AMD many millions of dollars after anti-competitiveness rulings in the US and Europe.

The problem now is purely in AMD's court. If they had lived up to their marketing department's hype, there wouldn't be a problem now. Under-promise and over-deliver: it's a simple strategy, AMD; you may as well try it, since nothing else is working.
     
  4. Bede

    Bede Well-Known Member

    Joined:
    30 Sep 2007
    Posts:
    1,340
    Likes Received:
    40
    I'm seeing a lot of lazy posting in this thread. One of the most amazing things humans create is the desktop CPU. They are now so outrageously complex and do so many things that we should be amazed that we even have them at all.

    I am no fanboy for AMD - I like Intel chips and nVidia graphics cards - but to say that AMD are stupid is ridiculous; their engineers are good. They don't have as much money as Intel to sink into R&D, and with something as complex as a new microchip architecture that is where the difference is made - one man does not design a chip.

It is sad that they have to make so many people redundant; however, with any luck, this will give them more money to put towards R&D.
     
    Last edited: 5 Nov 2011
    Paradigm Shifter and Lenderz like this.
  5. Lenderz

    Lenderz Member

    Joined:
    4 Nov 2010
    Posts:
    380
    Likes Received:
    15
Well said - you managed to put what I was trying to say much more eloquently. Thank you, sir. :clap:
     
  6. Krayzie_B.o.n.e.

    Krayzie_B.o.n.e. New Member

    Joined:
    2 Aug 2009
    Posts:
    427
    Likes Received:
    6
AMD CPUs suck! End of story.
They were in the game for a while, offering good performance at a good price, but now Intel has a $200 CPU that kills anything AMD has ever made.

AMD GPUs are still a great value, but soon (unless they get lucky and Apple buys them) AMD will be a GPU-only company. I'm sure those laid off can get jobs at Nvidia's Tegra division or at ARM.
     
  7. rogerrabbits

    rogerrabbits New Member

    Joined:
    24 May 2011
    Posts:
    577
    Likes Received:
    11
    So in reality, "we are seriously struggling :/"
     
  8. fluxtatic

    fluxtatic New Member

    Joined:
    25 Aug 2010
    Posts:
    507
    Likes Received:
    5
I think, if they get through to the second generations of Llano and BD, they'll be all right. Don't get me wrong - god knows I posted a couple of times here about how disappointed I was with the BD release. However, both archs show a lot of promise. Win 8's optimizations will help immensely - it has been benchmarked and wasn't that impressive, but keep in mind that was the Developer Preview. I had a chance to play with it over the past few days, and it's pretty rough. Vishera (2nd-gen BD) should be much better than BD has been so far. Same with Llano: Trinity will be based on the Piledriver core, same as Vishera, with a 6xxx GPU (the current Llano uses the Redwood core, so it's really a 5xxx GPU). Vishera will also have a competitor for Quick Sync. Ripping off Intel? Maybe, maybe not. All I know is, I want fast hardware transcoding, but I won't buy Intel to get it.

Not to call Lenderz wrong, but I doubt Nvidia's looking for an x86 licence now. If they really wanted one, they would buy VIA - they've got the cash to do it. But why jump into that? They have one of the most popular ARM processors available today, and they'll also be first to market with a quad-core ARM chip (it's actually got five cores - interesting reading if you haven't seen it yet). Unless you've got brilliant x86 engineers, it's not worth trying to compete with Intel in that arena. Nvidia obviously knows what they're doing with ARM, so why go in another new direction now?

This news just sucks, though. With the economy the way it is all over the world, I feel for these people. I wonder which positions are 'redundant', though? (God, I hate that term - just tell us you're laying them off. 'Being made redundant' sounds even worse... although then it kind of sounds like it's not your fault, right? If you're laying them off, you must be struggling. If they're 'redundant', it's their fault for being useless duplicates, yeah?)
     
  9. Snips

    Snips I can do dat, giz a job

    Joined:
    14 Sep 2010
    Posts:
    1,940
    Likes Received:
    66
If AMD don't have the money to do R&D properly, then they should quit what they're crap at and focus only on what they're good at. Remind me again: what's AMD good at?
     
  10. Marvin-HHGTTG

    Marvin-HHGTTG CTRL + SHIFT + ESC

    Joined:
    10 Oct 2010
    Posts:
    1,187
    Likes Received:
    58
    I can only assume you're deliberately trolling here, same as whoever thought Llano should have had a 6850 onboard.

    Consistently over the last few generations AMD's GPUs have been fast, well priced, and fairly frugal. Nvidia's have been generally slightly faster, but more expensive, hotter and with frankly ridiculous power consumption.

To many, that's far more important than having the absolute fastest card, as system builders would testify. Hence AMD's GPU market share is now well above Nvidia's for the first time.
     
  11. Paradigm Shifter

    Paradigm Shifter de nihilo nihil fit

    Joined:
    10 May 2006
    Posts:
    2,077
    Likes Received:
    37
    Agreed.

    ...

    Other than that, it's refreshing to see that someone at Bit-Tech understands the correct meaning of 'decimation' - it is used wrongly in far too many places... the mainstream media being particularly guilty of it over the last 15 years or so. :)
     
  12. Xir

    Xir Well-Known Member

    Joined:
    26 Apr 2006
    Posts:
    5,250
    Likes Received:
    88
Well, that's what I meant: they've already reduced themselves to R&D and chip design
(since manufacturing and manufacturing R&D are at GloFo).
Who are they cutting now that doesn't hurt their R&D or chip design?

Hmmm, possibly; all manufacturers seem to have overhead in the BS department :thumb:
     
  13. Guest-16

    Guest-16 Guest

I'm waiting for the announcement on the 9th. I highly suggest people read the Icrontic and AnandTech news posts on this too, and I would check to see whether AMD-exclusive GPU partners are nervous.
     
    Bede likes this.
  14. Bede

    Bede Well-Known Member

    Joined:
    30 Sep 2007
    Posts:
    1,340
    Likes Received:
    40
That is a little disturbing; AMD have been really winning on the GPU front recently (as far back as the 5870, IMO). I hope they don't do anything stupid.
     
  15. Lenderz

    Lenderz Member

    Joined:
    4 Nov 2010
    Posts:
    380
    Likes Received:
    15
Sorry, perhaps I wasn't clear: long before Nvidia started playing with ARM (I'm talking 5-10 years ago or so), there were a series of rumours that Nvidia wanted into the x86 market and was trying to get hold of a licence. Nothing ever came of those rumours, but Intel is notoriously difficult to get to play ball in this regard. I did wonder if something might come of this recently, when Nvidia and Intel signed a deal sharing certain technologies, but nothing has happened in that regard.

Anyway, I wasn't saying Nvidia is looking for the licence, just that I feel we need competition in the x86 market, that's all, and we need a capable team of engineers with a decent R&D budget to keep Intel on their toes. AMD did that for a long time, where Cyrix and VIA didn't manage it. I just don't see anyone other than Nvidia being able to move into the market, and I don't see them wanting to either, given their concentration on ARM. I was trying to say, "If that was a move they were going to make, they should have made it 5-10 years ago."

    I wonder if Intel might prop up AMD for a while just so that it has a competitor like MS did with Apple all those years ago.


Edit:
An example of the rumours I mention:

    http://www.bit-tech.net/news/hardware/2009/03/04/nvidia-reveals-plans-for-x86-cpu/1
     
    Last edited: 5 Nov 2011
  16. Arghnews

    Arghnews New Member

    Joined:
    6 Jul 2011
    Posts:
    129
    Likes Received:
    4
    There have been examples in the past where architectures have improved hugely with updates. Hopefully Bulldozer can match this.

The main problem, though, is that Bulldozer is a server architecture. It's just not designed for the consumer market. Simple.

Someone mentioned the possibility of Windows 8 utilising the threads better, but the problem is games. It's so difficult to design games to utilise eight cores; more raw processing power with fewer cores beats less power with more cores, IMO.
     
  17. rogerrabbits

    rogerrabbits New Member

    Joined:
    24 May 2011
    Posts:
    577
    Likes Received:
    11
Well, AMD have ups and downs. For a while nobody cared about them, and then they released the first mainstream dual-core, which was pretty popular. Then Intel took the lead again with Conroe, and then I think AMD took the lead again in low-energy chips for HTPCs and the like. So this could just be another slump, and maybe they will take the lead again soon. Or maybe this is the beginning of the end.
     
  18. Aragon Speed

    Aragon Speed Busily modding X3: Terran Conflict

    Joined:
    12 Jan 2009
    Posts:
    168
    Likes Received:
    1
TBH, the only thing that lets BD down is its low(er) IPC, as far as I can see in most reviews. An improvement in that area alone for the next chip should see it pulling its weight in the desktop arena, IMO.
     
  19. Guest-16

    Guest-16 Guest

It's not stupid from a business perspective. This man is all about cutting costs and improving profitability. Look at the list of people who have left or were made redundant recently, and look at the ROI of the GPU market over the last decade. The trend is not good for graphics.

Even if you look at the trends Nvidia and AMD have been pushing publicly over the last two years - it's ALL about general computing first, graphics second (it's just that graphics gets more attention in the media due to its relevance to our audience). The two have played friendly so far, but very soon I expect they will need to diverge as the GPGPU market grows and becomes more worthwhile than chasing gamers. In that sense, AMD could get away with only making "average" low-power CPUs with IGPs to act as I/O hubs for far more powerful GPGPU PCIe cards in servers.
     
  20. Bede

    Bede Well-Known Member

    Joined:
    30 Sep 2007
    Posts:
    1,340
    Likes Received:
    40
At the same time, though, Bindi (and I'm not really arguing with you, as TBH my knowledge of the market is quite limited :D), the market for discrete desktop graphics has grown. I think Nvidia's marketing before BF3 was very interesting; it seemed to me they were testing how much worth is left in the desktop market.

    We may yet see what you fear come to pass, and the consumer market be offered rubbish graphics cards, but I think it would be a fool who ignored the potential of our market - after all, those who can afford an £800 computer can probably afford to (and are interested in) upgrading their graphics card every year or two.

There is also the next generation of consoles to consider - if they can push console graphics to current desktop PC levels, then our systems will need to get more powerful, as the absence of direct-to-metal coding on PC means we are a lot less efficient. I'm actually quite looking forward to seeing how the new generation of consoles is put together :)
     