Old 15th May 2010, 11:10   #1
Sifter3000
I used to be somebody
Moderator
 
Join Date: Jul 2006
Location: The South Sea Bubble
Posts: 1,766
AMD says Fusion CPU and GPU will ship this year

http://www.bit-tech.net/news/hardwar...ip-this-year/1

__________________
The Rt Hon Alex Watson.
Dennis Publishing Emerging Platforms, bit and CPC alumni.

My weblog is The Wired Jester.
Photos on Flickr.
Old 15th May 2010, 11:35   #2
Teq
Multimodder
 
Join Date: Aug 2008
Posts: 93
I'm keeping an eye on this project; it could drop the cost of an HTPC a little, with possible performance gains. Too early to say, though, but I'm optimistic.
Old 15th May 2010, 11:36   #3
MrGumby
CPC 464 User
 
Join Date: Apr 2009
Location: Wakefield
Posts: 1,155
Surely this whole CPU/GPU package concept is best consigned to the laptop/HTPC market?
Old 15th May 2010, 11:37   #4
azrael-
I'm special...
 
Join Date: May 2008
Location: Aarhus, Denmark
Posts: 3,805
Well, Fusion (and Fusion-like tech) will definitely spell the end for motherboard-integrated graphics. Apart from that, I can't quite see what kind of impact it'll have on computer systems. It'll probably make it cheaper to build entry-level systems, though. Right now, I'm mostly having a "meh" moment.
__________________
It's not the end of the world ...but you can see it from here.

Xeon E3-1245 v2 3.4 GHz | ASUS P8C WS | 16GB Kingston ECC DDR3 | Samsung 830 256GB | eVGA GTX670 2GB | SB Audigy | Seasonic X-560 | Corsair Obsidian 550D | Dell 2209WA
Old 15th May 2010, 11:46   #5
NuTech
Mod Master
 
Join Date: Mar 2002
Location: London
Posts: 2,222
I can definitely see a use case for this technology in desktop PCs.

If they make a great gaming chip that lets you disable the on-board GPU, then when it comes time to upgrade your CPU/motherboard, you can re-enable the GPU and turn the old parts into a server rig or second computer.

Actually, I'd like to see more motherboard manufacturers integrate graphics on their high-end products for the same reason.
__________________
NuTech
Old 15th May 2010, 11:46   #6
Adnoctum
Kill_All_Humans
 
Join Date: Apr 2008
Posts: 482
Am I excited? Hell yes!

This is the new FPU: a faster, more capable number-crunching unit integrated into all AMD CPUs. If developers can rely on every CPU having decent GPGPU capability, can you imagine how software and its use will change the way we use the computer?

Best thing? Intel will be FORCED to create graphics that don't blow!
Second best thing? AMD has been saying that the Fusion graphics core will be updated annually, so the core will always be near-current. No more ancient GMA950 in your netbook.
__________________
Main Rig: Amiga A1200 - Motorola 68EC020@14.2MHz + 68030@50MHz Acc. Card - Lisa Graphics - 2+8MB RAM - 80MB HDD
LAN Rig: Amiga A500 - Motorola 68000@8MHz - Denise Graphics - 512+512KB RAM
Old 15th May 2010, 12:18   #7
Adnoctum
Kill_All_Humans
 
Join Date: Apr 2008
Posts: 482
Quote:
Originally Posted by azrael- View Post
Apart from that I can't quite see what kind of impact it'll have on computer systems. <snip> Right now, I'm mostly having a "meh" moment.
I bet there were many people going "meh" when the FPU was being integrated, but where would you be now without one?

I think too many people are looking at this as an integrated graphics core, and not as the stream-processing core it is.
The fact is we still don't really know where we're going with GPGPU or what we can do with it. It isn't all HD encoding and transcoding. I think the brake on development has been the poor state of integrated graphics (i.e. Intel's) in 75% of systems.

Imagine every computer, from bottom to top, having a capable GPGPU core. Software developers would be able to count on it being there, just like they can count on an x86 CPU having an FPU.
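
To make that "count on it being there" point concrete, here is a minimal sketch, assuming the pyopencl bindings and an installed OpenCL runtime (the device-picking policy is purely illustrative, not anything AMD has announced): probe for a GPU-class OpenCL device at runtime and fall back to the CPU if none exists.

[code]
# Sketch only: enumerate OpenCL devices, prefer a GPU if one is present,
# otherwise fall back to a CPU device. Assumes pyopencl and an OpenCL runtime.
import pyopencl as cl

def pick_compute_device():
    gpus, cpus = [], []
    for platform in cl.get_platforms():
        for dev in platform.get_devices():
            if dev.type & cl.device_type.GPU:
                gpus.append(dev)
            elif dev.type & cl.device_type.CPU:
                cpus.append(dev)
    candidates = gpus or cpus
    return candidates[0] if candidates else None

device = pick_compute_device()
if device is not None:
    print("Running GPGPU work on: %s" % device.name)
else:
    print("No OpenCL device found; using the plain x86 code path")
[/code]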

On a side note: what of Nvidia's GPGPU strategy when every full-fat Opteron has one or more of these cores on die? What of Intel's when such Opterons are spanking the Xeons in database operations (Larrabee to the rescue...)?

I think this is the most exciting CPU development in a long while. The fact that we don't really know what is going to happen is great. It means that there is room for this to change everything, not just a speed bump or a process shrink.

Or maybe it will fall flat on its face?
Old 15th May 2010, 12:21   #8
rickysio
N900 | HJE900
 
Join Date: Jun 2009
Posts: 964
Intel's current batch of graphics is already on the level of AMD's.

I do wonder whether Sandy Bridge will launch earlier or not.
Old 15th May 2010, 12:23   #9
LightningPete
Diagnosis: ARMAII-Holic
 
Join Date: Jul 2009
Location: My Many Houses :)
Posts: 304
AMD's integrated graphics solutions are usually better-performing chips than Intel's - the AMD HD 3200 integrated chip versus Intel's X4500, for example.
Could the high-end part of Fusion make for an entry-level gaming system?
Would be nice to get entry-level systems down in price, though. System builders are still charging something like 350-500 for basic systems.
Old 15th May 2010, 12:42   #10
Bindibadgi
I Mod, Therefore I Own
 
Join Date: Mar 2001
Posts: 34,814
The HD 3200 versus the X4500 wasn't far off; only drivers separated them, but in terms of video playback Intel's ClearVideo is fantastic.

As for GMA-HD, IMO it's ahead of the latest 880G from AMD overall, so I hope AMD pulls something great out of the bag with the Fusion GPU core.
Old 15th May 2010, 12:55   #11
Pete J
RIP Kidmod
 
Join Date: Sep 2009
Location: Blighty
Posts: 3,866
Why do AMD make such a big deal about everything being on one bit of silicon? IIRC, the first quad cores from Intel were two separate dual cores - and they destroyed anything AMD had to offer.
__________________
Lian Li PC-P80B / EVGA X79 Dark / i7 4930K @ 4.2GHz (no HT) / Corsair H80i / 4x4GB Avexir Core Blue Series @ 2400MHz 10-12-12-31-2T / 3x SLI EVGA 3GB GTX 780 @1075MHz Core 7000MHz Memory / X-Fi Titanium Professional / 2 x 250GB Samsung 840 / 120GB Vertex 2 / 480GB Vertex 2 / 2TB WD Caviar Green / Silverstone Strider 1500W / Asus PQ321QE / Dell 3007WFP-HC / Headphones TBA / Logitech MK710 and MX3200 / Sidewinder Force Feedback 2 / Datacolor Spyder3 / Win 7 Ultimate x64 / Gigabyte P34G V2
Old 15th May 2010, 13:03   #12
firex
What's a Dremel?
 
Join Date: May 2010
Posts: 1
Why are we talking about the Fusion GPU core as if it's just another integrated graphics core? I'm pretty sure I've read time and again that it will use an ATI 4000 or 5000 series core... that would definitely beat GMA-HD or any integrated graphics solution for the foreseeable future...

AMD's approach of building X cores on a single piece of silicon brings lots of advantages (on paper). However, the original Phenoms lagged behind the Core 2 Quads because AMD 'reused' the old K8 architecture, whereas Intel used a brand-new architecture in Core 2. And AMD's very late launch of Barcelona made the performance difference look worse than it actually was.
Old 15th May 2010, 13:05   #13
aussiebear
Minimodder
 
Join Date: Nov 2008
Location: Sydney, Australia
Posts: 36
Quote:
What do you think? Do you care that all the elements of a CPU are combined in a single piece of silicon or does it not matter as long as the thing works? And are you excited by the prospect of AMD's Fusion CPU?
Well, let's look at AMD's first Fusion processor, currently codenamed Llano.

From what I know...

(1) It will be aimed at the mainstream. In fact, it replaces the Athlon II line in 2011, which suggests it will be reasonably affordable for most people.

(2) The CPU part is based on a highly tweaked version of the Phenom II (32nm process). They've dropped the L3 cache and upped the L2 cache to 1MB per core. It will start from 3GHz or higher, and it will come in dual, triple, and quad-core versions operating at 0.8V to 1.3V.

(3) It will introduce power gating (similar to that of the Core i-series) and other power-saving features. I hear the whole processor is rated at a TDP of 20W to 59W. (Starts at 20W for notebook versions, while desktop versions will start at 30W.)

(4) The IGP element of the processor is said to be based on the Radeon HD 5xxx series, with 400 stream processors, so I'm guessing we can expect somewhere around Radeon HD 55xx to 56xx performance from it.

(5) It will require a new motherboard, as the entire northbridge is now on the CPU. The motherboard will only house the "Hudson-D" southbridge.


I'm excited for a number of reasons.

* It sets the first step for an affordable heterogeneous processor that can actually do GPGPU work.

Intel's HD Graphics (found in current Clarkdale CPUs) is really an enhanced X4500-series IGP and offers very little GPGPU capability. Intel's next-generation "Sandy Bridge" uses an enhanced version of the HD Graphics found in Clarkdale, so again it has little GPGPU capability, but it will be very good in an HD playback role (as that is what Intel is focusing on with their IGPs).

...And while second-generation Larrabee is still being worked on (the first generation having missed its window of opportunity), I doubt we'll see an IGP variant until 2+ years later.

* This processor would be perfect for OpenCL, which doesn't care what type of processor is available as long as it can be used. ATI's Stream SDK for software developers is being improved to support Llano for a reason. (There's a small illustrative sketch at the end of this post.)

* It's also the first step in gradually reducing the FPU in favour of GPU-like stream processors. AMD's 2nd generation (2015?) will actually combine core elements of the GPU into the CPU. There won't be any distinct GPU and CPU modules in the future.
=> http://www.xbitlabs.com/news/cpu/dis..._2015_AMD.html

The way the "Bulldozer" architecture is arranged, I'm guessing AMD will eventually replace the K10.5 cores in Llano with "Bulldozer" in the next 2 years.

* While I don't expect Llano to best "Sandy Bridge" (let alone the current Intel Clarkdale processors) clock-for-clock, as it's still based on the Phenom II, I do expect AMD to raise the bar for IGP performance. That means Intel is going to have to up their IGP game... Result? End users will benefit from improved IGPs! (And game developers will have more room to play with!)

* AMD makes a better attempt at addressing its fundamental issue in the mobile market... power consumption and the resulting battery life.

...While I don't expect it to match Intel's notebook solutions in battery life, I do expect a notable improvement over current AMD-based notebook solutions.

* Assuming AMD follows the pricing trend they currently have with the Athlon II line, AMD's first Fusion processor will be affordable. It'll be a stepping stone to encourage software developers to start looking at OpenCL, DirectCompute, etc. more seriously.

And lastly...
* I'm still hanging on to this dinky little single-core 1.6GHz@2.0GHz Sempron (Socket 754, 65W).

I want to upgrade it to a quad-core (at least 3.2GHz) rated at 45W TDP.

I think it's possible with 32nm, given that AMD will release the Athlon II X4 615e by the end of this year. (That's a 2.5GHz quad-core rated at 45W TDP.)
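
As a rough illustration of the OpenCL point above (a sketch under assumptions, not AMD or Khronos sample code: it presumes the pyopencl and numpy packages and any working OpenCL runtime), the same kernel source runs unchanged on whatever device the runtime exposes - an IGP, a discrete card, or the CPU itself.

[code]
# Sketch only: a trivial vector-add kernel; OpenCL doesn't care whether the
# device behind the context is a Fusion-style IGP, a discrete GPU or a CPU.
import numpy as np
import pyopencl as cl

a = np.random.rand(1024).astype(np.float32)
b = np.random.rand(1024).astype(np.float32)

ctx = cl.create_some_context()      # picks whatever OpenCL device is available
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags

a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

program = cl.Program(ctx, """
__kernel void add(__global const float *a,
                  __global const float *b,
                  __global float *out)
{
    int i = get_global_id(0);
    out[i] = a[i] + b[i];
}
""").build()

program.add(queue, a.shape, None, a_buf, b_buf, out_buf)

result = np.empty_like(a)
cl.enqueue_copy(queue, result, out_buf)
assert np.allclose(result, a + b)
print("Kernel ran on: %s" % ctx.devices[0].name)
[/code]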

Last edited by aussiebear; 15th May 2010 at 13:22.
Old 15th May 2010, 13:16   #14
aussiebear
Minimodder
 
Join Date: Nov 2008
Location: Sydney, Australia
Posts: 36
Quote:
Originally Posted by Pete J View Post
Why do AMD make such a big deal about everything being on one bit of silicon? IIRC, the first quad cores from Intel were two separate dual cores - and it destroyed anything AMD had to offer.
Because AMD designs processors for server/supercomputing roles first. Integration is especially important once you start scaling up to 4, 8, 16, etc. processor sockets.

These features don't mean crap to the typical desktop user because:
(1) They only use one CPU socket.
(2) They don't use their computers intensively enough to need huge bandwidth.

Intel knows this, so it's cheaper/quicker for them to slap things together and push them to market.

AMD tries to design things elegantly from an engineering standpoint, as they don't have the resources to throw around. (With the K8/K10/K10.5 series, they made one architecture and then trickled it down to different markets.)

Whereas Intel dumps huge engineering resources/talent into brute-forcing a solution with the best features they can shove in (then pulls them out again to address the affordable/low-end markets)... Of course, they also have enough resources to accommodate multiple architectures at the same time.

AMD couldn't do this previously, but that looks set to change in 2011:
Low end (netbook/nettop) => Bobcat
Mainstream (Desktop/Notebook) => Llano
Enthusiast/Performance/Workstation/Server => Bulldozer
Old 15th May 2010, 13:19   #15
StoneyMahoney
Multimodder
 
Join Date: Jul 2009
Location: Stanford-Le-Hope, Essex
Posts: 246
Intel's decision to integrate the CPU and GPU into the same package cemented the Intel one-two combination into the future of every PC sold to a business for the next god-only-knows-how-many years. That's where the real money is - it's how Intel sold the overwhelming majority of all GPUs last decade - and every initiative AMD has come up with to crack the volume corporate market has only got as far as a brief flirtation.

How much of that is down to performance economics and how much is down to Intel being naughty (and thus pulling in a world-record-breaking anti-competition fine in the EU courts) is questionable, but the fact remains that AMD has been hopping along behind Intel for some time now and can do nothing but react to Intel's releases.

Integrating the CPU and GPU does nothing significant for performance, so it's purely a business/economics decision. Until some killer must-have GPU-accelerated business app appears (I'm thinking some kind of real-time business intelligence analysis software?), GPU performance will continue to be irrelevant to the majority of the PC market. Even when something does turn up, how will AMD take advantage of its appearance when the difference in performance between Intel's integrated GPUs and their own is so marginal, especially compared to the performance of an add-in card?
Old 15th May 2010, 13:39   #16
Arj12
Multimodder
 
Join Date: May 2010
Location: Leicester, England
Posts: 106
Well, seeing as the CPU and GPU are going to be on a smaller die compared to Intel's current ones (well, the GPU anyway!), the chip should be more power efficient and produce less heat =D Can't wait for the release now, as I'm in the market for a new laptop soon!
Old 15th May 2010, 13:55   #17
Autti
Multimodder
 
Join Date: Oct 2009
Posts: 152
Sorry, but what is the difference between having the two chips on one piece of silicon compared to having them on two different pieces that are still linked together?
I don't get why this is so big... in fact, unless there is an interface boost it's a very bad idea, as it's more expensive.
The failure rates of GPU fabrication and CPU fabrication are now combined in a single piece of silicon, whereas with Intel each part is independent, which gives higher yields during fabrication.
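
A back-of-the-envelope illustration of that yield argument, with made-up numbers (illustrative only, not real AMD or Intel figures):

[code]
# Sketch only: why one big die tends to yield worse than two smaller dies.
cpu_block_yield = 0.90   # hypothetical: 90% of CPU blocks come out defect-free
gpu_block_yield = 0.90   # hypothetical: 90% of GPU blocks come out defect-free

# Monolithic die: a defect in either block scraps the whole chip
# (treating defects in the two blocks as independent events).
monolithic_yield = cpu_block_yield * gpu_block_yield

# Separate dies in one package: each die is tested on its own, so a bad GPU
# die never drags a good CPU die into the bin (packaging losses ignored).
print("monolithic die yield   ~ %.0f%%" % (monolithic_yield * 100))        # ~81%
print("separately tested dies ~ %.0f%% / %.0f%% good" % (cpu_block_yield * 100,
                                                         gpu_block_yield * 100))
[/code]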
__________________
Zen and the Art of Modding: an Amateur Foray
2 days since last hospital visit
Old 15th May 2010, 14:02   #18
rickysio
N900 | HJE900
 
Join Date: Jun 2009
Posts: 964
So I went back to: http://www.bit-tech.net/hardware/cpu...the-next-gen/1

Seems that 2011 will be the Year of Integrated Graphics. :/
Old 15th May 2010, 15:07   #19
l3v1ck
really joined on Dec 24th 2004.
 
Join Date: Apr 2009
Location: The Right Side of the Pennines
Posts: 12,895
This has me worried. All AMD want to talk about is the integrated GPU. I thought part of Fusion was that they'd be ditching their K8 architecture (or at least one derived from the circa-2003 K8) and bringing out a totally new one. The fact that they're not talking much about a new architecture makes me think there isn't one. Or if there is, it's not good enough to compete with Intel's Nehalem. Either way it's bad news for AMD and consumers.
__________________
Quote:
Originally Posted by Sifter3000
We swung the banhammer in his little stupid spamming face
The old Dennis Forums (CPC, PC Pro, Mac User etc) - Meeting Place lives on. You're welcome to visit it HERE
Old 15th May 2010, 15:27   #20
javaman
May irritate Eyes
 
Join Date: May 2009
Location: Belfast
Posts: 2,531
I'm excited about this, but I'm worried about upgrades. While integrating basic graphics into the CPU is a great idea for lower power usage, come higher-end gaming machines, if you want more GPU horsepower you have to upgrade the whole processor. I don't feel total integration is the way to go. I also wonder if these will offer something similar to Hybrid CrossFire.