News AMD says Fusion CPU and GPU will ship this year

Discussion in 'Article Discussion' started by Sifter3000, 15 May 2010.

  1. Sifter3000

    Sifter3000 I used to be somebody

    Joined:
    11 Jul 2006
    Posts:
    1,766
    Likes Received:
    26
  2. Teq

    Teq What's a Dremel?

    Joined:
    11 Aug 2008
    Posts:
    95
    Likes Received:
    0
    I'm keeping an eye on this project; it could drop the cost of an HTPC a little, with possible performance gains. Too early to say, though, but I'm optimistic :)
     
  3. MrGumby

    MrGumby CPC 464 User

    Joined:
    27 Apr 2009
    Posts:
    1,437
    Likes Received:
    30
    Surely this whole CPU/GPU package concept is best consigned to the laptop/HTPC market?
     
  4. azrael-

    azrael- I'm special...

    Joined:
    18 May 2008
    Posts:
    3,852
    Likes Received:
    124
    Well, Fusion (and Fusion-like tech) will definitely spell the end for chipset-integrated graphics. Apart from that, I can't quite see what kind of impact it'll have on computer systems. It'll probably make entry-level systems cheaper to build, though. Right now, I'm mostly having a "meh" moment.
     
  5. NuTech

    NuTech Minimodder

    Joined:
    18 Mar 2002
    Posts:
    2,222
    Likes Received:
    96
    I can definitely see a use case for this technology in desktop PCs.

    If they make a chip that's great for gaming and lets you disable the on-board GPU, then when it comes time to upgrade your CPU/motherboard you can re-enable the GPU and turn the old parts into a server rig or second computer.

    Actually, I'd like to see more motherboard manufacturers integrate graphics on their high-end products for the same reason.
     
  6. Adnoctum

    Adnoctum Kill_All_Humans

    Joined:
    27 Apr 2008
    Posts:
    486
    Likes Received:
    31
    Am I excited? Hell yes!

    This is the new FPU: integrating a faster, more capable number-crunching unit into all AMD CPUs. If developers can rely on every CPU having decent GPGPU capability, can you imagine how software, and the way we use computers, will change?

    Best thing? Intel will be FORCED to create graphics that don't blow!
    Second best thing? AMD has been saying that the Fusion graphics core will be updated annually, so the core will stay near-current. No more ancient GMA 950 in your netbook.
     
  7. Adnoctum

    Adnoctum Kill_All_Humans

    Joined:
    27 Apr 2008
    Posts:
    486
    Likes Received:
    31
    I bet there were many people going "Meh" when the FPU was being integrated, but where would you be now without one?

    I think too many people are looking at this as an integrated graphics core, and not as the stream-processing core it is.
    The fact is we still don't really know where we're going with GPGPU and what we can do with it. It isn't all HD encoding and transcoding. I think the brake on development has been the poor state of integrated graphics (i.e. Intel's) in 75% of systems.

    Imagine every computer, from bottom to top, having a capable GPGPU core. Software developers would be able to count on it being there, just like they can count on an x86 CPU having an FPU.

    On a side note: what of Nvidia's GPGPU strategy when every full-fat Opteron has one or more of these cores on die? What of Intel's when such Opterons are spanking the Xeons in database operations (Larrabee to the rescue...)?

    I think this is the most exciting CPU development in a long while. The fact that we don't really know what is going to happen is great. It means that there is room for this to change everything, not just a speed bump or a process shrink.

    Or maybe it will fall flat on its face? :confused:
     
  8. rickysio

    rickysio N900 | HJE900

    Joined:
    6 Jun 2009
    Posts:
    964
    Likes Received:
    5
    Intel's current batch of graphics are already on the level of AMD's.

    I do wonder whether Sandy Bridge will launch earlier or not.
     
  9. LightningPete

    LightningPete Diagnosis: ARMAII-Holic

    Joined:
    2 Jul 2009
    Posts:
    307
    Likes Received:
    0
    AMD's integrated graphics solutions usually perform better than Intel's; take AMD's HD 3200 integrated chip versus Intel's X4500, for example.
    Could the high-end part of Fusion make for an entry-level gaming system?
    It would be nice to get entry-level systems down in price, though. System builders are still charging 350-500 for basic systems.
     
  10. Guest-16

    Guest-16 Guest

    The HD 3200 versus the X4500 wasn't far off; only drivers separated them, but in terms of video playback Intel's Clear Video is fantastic.

    As for GMA-HD, imo it's ahead of AMD's latest 880G overall, so I hope AMD pulls something great out of the bag with the Fusion GPU core.
     
  11. Pete J

    Pete J Employed scum

    Joined:
    28 Sep 2009
    Posts:
    7,226
    Likes Received:
    1,784
    Why do AMD make such a big deal about everything being on one bit of silicon? IIRC, the first quad cores from Intel were two separate dual-core dies in one package - and they destroyed anything AMD had to offer.
     
  12. firex

    firex What's a Dremel?

    Joined:
    15 May 2010
    Posts:
    1
    Likes Received:
    0
    Why are we talking about the Fusion GPU core as if it were just another integrated graphics core? I'm pretty sure I've read time and again that it will use an ATI 4000- or 5000-series core... that would definitely beat GMA-HD or any integrated graphics solution for the foreseeable future...

    AMD's approach of building multiple cores on a single piece of silicon brings lots of advantages (on paper). However, the original Phenoms lagged behind the Core 2 Quads because AMD 'reused' the old K8 architecture, whereas Intel used a brand-new architecture in Core 2. And AMD's very late launch of Barcelona made the performance difference look worse than it actually was.
     
  13. aussiebear

    aussiebear What's a Dremel?

    Joined:
    13 Nov 2008
    Posts:
    36
    Likes Received:
    8
    Well, let's look at AMD's first Fusion processor: Currently codenamed Llano.

    From what I know...

    (1) It will be aimed at the mainstream. In fact, it replaces the Athlon II line in 2011, which suggests it will be reasonably affordable for many.

    (2) Based on a highly tweaked version of the Phenom II for its CPU part (32nm process). They've dropped the L3 cache and upped the L2 cache to 1MB per core. It will start at 3GHz or higher, and it will come in dual-, triple- and quad-core versions, operating at 0.8V to 1.3V.

    (3) It will introduce power gating (similar to that of the Core i-series) and other power-saving features. I hear the whole processor is rated at a TDP of 20W to 59W. (Starts at 20W for notebook versions, while desktop versions will start at 30W.)

    (4) The IGP element of the processor is said to be based on the Radeon HD 5xxx series, consisting of 400 stream processors. So I'm guessing we can expect Radeon HD 55xx to 56xx performance from it, somewhere around there.

    (5) It will require a new motherboard, as the entire northbridge is now on the CPU. The motherboard will only house the "Hudson-D" southbridge.


    I'm excited for a number of reasons.

    * It's the first step towards an affordable heterogeneous processor that can actually do GPGPU work.

    Intel's HD Graphics (found in current Clarkdale CPUs) is really an enhanced X4500-series IGP; it offers very little GPGPU capability. Intel's next-generation "Sandy Bridge" uses an enhanced version of the HD Graphics found in Clarkdale, so again it has little GPGPU capability, but it will be very good in the HD playback role. (As that is what Intel is focusing on with their IGPs.)

    ...And while the 2nd-generation Larrabee is still being worked on (the first generation having missed its window of opportunity), I doubt we'll see an IGP variant for another two years or more.

    * This processor would be perfect for OpenCL, as OpenCL doesn't care what type of processor is available, only that it can be used... ATI's Stream SDK for software developers is being improved to support Llano for a reason. ;)

    * It's also the first step towards gradually reducing the FPU in favour of GPU-like stream processors. AMD's 2nd generation (2015?) will actually combine core elements of the GPU into the CPU; there won't be distinct GPU and CPU modules in the future.
    => http://www.xbitlabs.com/news/cpu/di...tion_of_AMD_Fusion_Chips_Due_in_2015_AMD.html

    The way the "Bulldozer" architecture is arranged, I'm guessing AMD will eventually replace the K10.5 cores in Llano with "Bulldozer" in the next 2 years.

    * While I don't expect Llano to best "Sandy Bridge" (let alone the current Intel Clarkdale processors) clock-for-clock, as it's still based on the Phenom II, I do expect AMD to raise the bar for IGP performance. That means Intel is going to have to up their IGP game... Result? End-users will benefit from improved IGPs! (Game developers will have more room to play with!)

    * AMD makes a better attempt at addressing its fundamental issue in the mobile market: power consumption and the resulting battery life.

    ...While I don't expect it to match Intel's notebook solutions in battery life; I do expect to see a notable improvement over the current AMD based notebook solutions.

    * Assuming AMD follows the current pricing trend they have with the Athlon II line, AMD's first Fusion processor will be affordable. It'll be a stepping stone to encourage software developers to start looking at OpenCL, DirectCompute, etc. more seriously.

    And lastly...
    * I'm still hanging on to this dinky little single-core 1.6GHz @ 2.0GHz Sempron (Socket 754, 65W).

    I want to upgrade to a quad-core (at least 3.2GHz) rated at 45W TDP. :)

    I think it's possible at 32nm, given that AMD will release the Athlon II X4 615e by the end of this year. (That's a 2.5GHz quad-core rated at 45W TDP.)
     
    Last edited: 15 May 2010
  14. aussiebear

    aussiebear What's a Dremel?

    Joined:
    13 Nov 2008
    Posts:
    36
    Likes Received:
    8
    Because AMD designs processors for server/supercomputing roles, where integration is especially important once you start scaling up to 4, 8, 16, etc. processor sockets.

    These features don't mean crap to the typical desktop user, because:
    (1) They only use one CPU socket.
    (2) They don't use their computers intensively enough to need huge bandwidth.

    Intel knows this, so it's cheaper/quicker to slap things together and push them to market.

    AMD tries to design things elegantly from an engineering standpoint, as they don't have resources to throw around. (With the K8/K10/K10.5 series they made one architecture, then trickled it down to different markets.)

    Whereas Intel dumps huge engineering resources/talent into brute-forcing a solution with the best features they can shove in (then pulls them out again to address the affordable/low-end markets). Of course, they also have enough resources to accommodate multiple architectures at the same time.

    AMD couldn't do this previously, but that looks set to change in 2011:
    Low end (netbook/nettop) => Bobcat
    Mainstream (Desktop/Notebook) => Llano
    Enthusiast/Performance/Workstation/Server => Bulldozer
     
  15. StoneyMahoney

    StoneyMahoney What's a Dremel?

    Joined:
    10 Jul 2009
    Posts:
    287
    Likes Received:
    13
    Intel's decision to integrate the CPU and GPU into the same package cemented that Intel one-two combination into the future of every PC sold to a business for the next god-only-knows-how-many years. That's where the real money is - it's how Intel sold the overwhelming majority of all GPUs last decade - and every initiative AMD has come up with to crack the volume corporate market has only got as far as a brief flirtation.

    How much of that is up to performance economics and how much is up to Intel being naughty (and thus pulling in a world record-breaking anti-competition fine in the EU courts) is questionable, but the fact remains that AMD has been hopping along behind Intel for some time now and can do nothing but react to Intel's releases.

    Integrating the CPU and GPU does nothing significant for performance, so it's purely a business/economics decision. Until some killer must-have GPU-accelerated business app appears (I'm thinking some kind of real-time business intelligence analysis software?), GPU performance will continue to be irrelevant to the majority of the PC market. And even when something does turn up, how will AMD take advantage of its appearance when the difference in performance between Intel's integrated GPUs and their own is so marginal, especially compared to the performance of an add-in card?
     
  16. Arj12

    Arj12 What's a Dremel?

    Joined:
    6 May 2010
    Posts:
    106
    Likes Received:
    0
    Well, seeing as the CPU and GPU are going to be on a smaller process than Intel's current ones (well, the GPU anyway!), the chip should be more power-efficient and produce less heat =D Can't wait for the release now, as I'm in the market for a new laptop soon!
     
  17. Autti

    Autti What's a Dremel?

    Joined:
    29 Oct 2009
    Posts:
    152
    Likes Received:
    3
    Sorry, but what is the difference between having the two chips on one piece of silicon, compared to having them on two different pieces that are still linked together?
    I don't get why this is such a big deal... In fact, unless there is an interconnect speed boost it's a very bad idea, as it's more expensive:
    the failure rates of GPU and CPU fabrication are now combined in a single piece of silicon, whereas with Intel's approach each part is independent, which gives higher yields during fabrication.
     
  18. rickysio

    rickysio N900 | HJE900

    Joined:
    6 Jun 2009
    Posts:
    964
    Likes Received:
    5
  19. l3v1ck

    l3v1ck Fueling the world, one oil well at a time.

    Joined:
    23 Apr 2009
    Posts:
    12,956
    Likes Received:
    17
    This has me worried. All AMD want to talk about is the integrated GPU. I thought part of Fusion was that they'd be ditching their K8-derived architecture (circa 2003) and bringing out a totally new one. The fact that they're not talking much about a new architecture makes me think there isn't one - or, if there is, it's not good enough to compete with Intel's Nehalem. Either way it's bad news for AMD and consumers.
     
  20. javaman

    javaman May irritate Eyes

    Joined:
    10 May 2009
    Posts:
    3,984
    Likes Received:
    186
    I'm excited about this, but I'm worried about upgrades. While integrating basic graphics into the CPU is a great idea for lower power usage, on higher-end gaming machines, if you want more GPU horsepower you'd have to upgrade the whole processor. I don't feel total integration is the way to go. I also wonder if these will offer something similar to Hybrid CrossFire.
     