
News AMD unites CPU and GPU development teams

Discussion in 'Article Discussion' started by CardJoe, 7 May 2009.

  1. CardJoe

    CardJoe Freelance Journalist

    Joined:
    3 Apr 2007
    Posts:
    11,346
    Likes Received:
    316
  2. D-Cyph3r

    D-Cyph3r Gay for Yunosuke

    Joined:
    31 Aug 2008
    Posts:
    925
    Likes Received:
    41
    Fusion baby.
     
  3. Star*Dagger

    Star*Dagger What's a Dremel?

    Joined:
    30 Nov 2007
    Posts:
    882
    Likes Received:
    11
    I have an ATI Radeon HD 4870x2. Riddle me this: how can you possibly put that on a motherboard?!
    If it is doable, do it; if not, I am happy to buy GPU and CPU separately.
    I daresay that the integrated CPU/GPU will be for the low-end casual gamer, while the Big Boys will still be taking the separate-CPU-and-GPU approach. I can't see how they would cool such a beast if it were a high-end solution.

    S*D
     
  4. Xtrafresh

    Xtrafresh It never hurts to help

    Joined:
    27 Dec 2007
    Posts:
    2,999
    Likes Received:
    100
    I think they are bang on the money here.

    Knowing how long it takes to develop discrete graphics cards (or any other form of high-end chip), I think we can safely assume that the HD 6xxx series is already on the drawing board. After that, I think miniaturisation will have reached a point where integrating CPU and GPU offers vast advantages over building the two separately. I think that by the time the first CPU/GPU unit from AMD hits the market, they'll have built a complete high-end PC on an mATX-sized piece of silicon.
     
  5. seveneleven

    seveneleven What's a Dremel?

    Joined:
    26 Dec 2007
    Posts:
    30
    Likes Received:
    0
    Folks in the 70s: A separate controller for graphics?! Surely there will never be a need for such a thing!
     
  6. bowman

    bowman Minimodder

    Joined:
    7 Apr 2008
    Posts:
    363
    Likes Received:
    10
    Of course it's just integrated. Fusion will be the equivalent of 780G, just on the CPU instead.

    Ssh, don't let the marketing people hear you, though. TEH FOOTOOR IS FUUSHUN!
     
  7. lp1988

    lp1988 Minimodder

    Joined:
    24 Jun 2008
    Posts:
    1,288
    Likes Received:
    64
    For the first many years we will probably only see these in laptops or business PCs.
     
  8. cheeriokilla

    cheeriokilla What's a Dremel?

    Joined:
    18 Feb 2009
    Posts:
    107
    Likes Received:
    1
    Awesome news! I have two rigs... this could put some interesting stuff in their future development.
     
  9. JyX

    JyX What's a Dremel?

    Joined:
    14 Apr 2009
    Posts:
    42
    Likes Received:
    0
    Think of this... a GPU on the CPU that handles WDDM (Windows), but in games you can switch to dedicated graphics... that's the idea. Now, since the CPU is properly cooled, compared to the northbridge, most of which are passively cooled, such a GPU could yield better performance in HD decoding and GPGPU applications, not to mention that powering off the discrete GPU could allow for lower overall power usage.

    This could also free up chipset development, removing the need to make separate IGPs: just standalone chipsets. This would be beneficial on mobile platforms and on servers as well.

    Besides, it's inevitable... Intel markets Westmere as a CPU+GPU multi-chip package, so people will follow and demand the same thing from AMD, even though AMD's should perform better at a lower power draw, based on the 780G/790GX TDP of 11W compared to Intel IGPs at 28W.
     
  10. mayhem

    mayhem Owner of Mayhems

    Joined:
    12 Sep 2008
    Posts:
    816
    Likes Received:
    44
    What would be good would be a motherboard with a socket for the CPU and a socket for the GPU. Then you get scaling of GPUs as well as CPUs, and you just throw in whatever GPU you fancy...
     
  11. EvilRusk

    EvilRusk What's a Dremel?

    Joined:
    23 Jan 2006
    Posts:
    110
    Likes Received:
    2
    Why stop there? You could put the GPU with its own dedicated RAM on its own board and plug it into a slot... oh wait...
     
  12. HourBeforeDawn

    HourBeforeDawn a.k.a KazeModz

    Joined:
    26 Oct 2006
    Posts:
    2,637
    Likes Received:
    6
    This isn't anything really new; in a way it's old news. Everyone knew this was what AMD was heading for to begin with, but it's nice to see them knuckling down and really pushing forward with it.
     
  13. JyX

    JyX What's a Dremel?

    Joined:
    14 Apr 2009
    Posts:
    42
    Likes Received:
    0
    That was before their stock dropped to its current levels... and as the article says, "AMD is now focusing its efforts on developing CPUs with integrated graphics, which was the original goal of the Fusion project before it got hijacked by the marketing guys." They turned Fusion into some application suite that optimises the current-gen AMD platform for games... when in fact it's this.
     
  14. Kudos

    Kudos What's a Dremel?

    Joined:
    18 Apr 2006
    Posts:
    7
    Likes Received:
    0
    Granted, in the early stages after (if?) this happens, it'll be low-end graphics... business/granny checking emails etc.

    But what's to stop them from, say, dropping two cores from a quad and replacing them with two graphics cores as the tech develops? Heat will surely be an issue, but this would likely be an enthusiast GPU/CPU, so they'd expect watercooling at the very least.

    Could be interesting to watch how it develops.
     
  15. DaMightyMouse

    DaMightyMouse What's a Dremel?

    Joined:
    1 Aug 2008
    Posts:
    49
    Likes Received:
    2
    LOL!
     
  16. Crazy Buddhist

    Crazy Buddhist Minimodder

    Joined:
    22 Aug 2007
    Posts:
    105
    Likes Received:
    2
  17. Turbotab

    Turbotab I don't touch type, I tard type

    Joined:
    4 Feb 2009
    Posts:
    1,217
    Likes Received:
    59
    It would be ironic if Intel, after spending a fortune on Larrabee, ended up with products with a superior CPU yet a vastly inferior GPU offering, just like in the days of Intel GMA.
     
    Last edited: 19 Nov 2009
  18. Crazy Buddhist

    Crazy Buddhist Minimodder

    Joined:
    22 Aug 2007
    Posts:
    105
    Likes Received:
    2
    Turbo
    ...

    I think that is the way it will be. Their knowledge of leveraging parallel processing at the software level was certainly improved when they bought Havok, but on the hardware side of parallel processing they are kids compared with both AMD and Nvidia, thanks to their competitors' outstanding GPU teams.

    It would not surprise me to see Intel hit a technology limit (transistors can only get so small) that slows their leading edge, only to be surpassed in the mid term by better integration technologies from Nvidia and AMD/ATI. Two to three years from now, not everyone might be buying Intel.

    Matthew
     