
Hardware IDF Day 3 - 32nm Westmere Performance

Discussion in 'Article Discussion' started by Tim S, 25 Sep 2009.

  1. Tim S

    Tim S Well-Known Member

    Joined:
    8 Nov 2001
    Posts:
    18,879
    Likes Received:
    76
  2. Yoy0YO

    Yoy0YO Lurky Lurker

    Joined:
    20 Mar 2009
    Posts:
    100
    Likes Received:
    4
This definitely looks good for an SG05 super-portable LAN box build with DDR3 and 32nm. Thanks again, Bit-Tech, for the performance review.
     
  3. Darkraven

    Darkraven New Member

    Joined:
    23 Dec 2006
    Posts:
    120
    Likes Received:
    2
Wait-wait-wait, and BAM, your head gets dizzy. So much for napping till Sandy Bridge. It's going to get even more interesting by Christmas, I'll bet.
     
  4. perplekks45

    perplekks45 LIKE AN ANIMAL!

    Joined:
    9 May 2004
    Posts:
    5,229
    Likes Received:
    144
    3.33GHz against that competition? Of course it can keep up! :eyebrow:

    Still very nice to have it run at 70W max. But I think I'll wait for it to hit retail before I make a decision about whether I like it or not.
     
  5. Jack_Pepsi

    Jack_Pepsi Clan BeeR Founder

    Joined:
    24 Apr 2006
    Posts:
    646
    Likes Received:
    11
Ymmm... just imagine a low-powered 32nm quad-based setup. Now that's something I'd be interested in purchasing.
     
  6. [PUNK] crompers

    [PUNK] crompers Dremedial

    Joined:
    20 May 2008
    Posts:
    2,909
    Likes Received:
    50
You'd think they might do that: stick two of the dual cores together again, which, with Hyper-Threading enabled too, would mean eight logical cores. It would be a great chip.

The integrated GPU seems like a good idea if it can handle multimedia at low power. The chances of these being an option for gamers are fairly slim though, I should imagine.
     
  7. HourBeforeDawn

    HourBeforeDawn a.k.a KazeModz

    Joined:
    26 Oct 2006
    Posts:
    2,637
    Likes Received:
    6
Hmm, alrighty, this I think is what I'll go for when I build my carputer. It's coming out around the right time too, as I wanted to build it as a Christmas gift to my car and myself =p
     
  8. Andy Mc

    Andy Mc Well-Known Member

    Joined:
    23 May 2002
    Posts:
    1,722
    Likes Received:
    129
I'm pretty sure that's not the first x86 chip to have done that: Cyrix beat Intel to this back in 1996 with their MediaGX range of chips. I remember the local PC builder I worked for sold a load of cheap systems based on the MediaGXm chip.
     
  9. hiei-warrior

    hiei-warrior New Member

    Joined:
    26 Sep 2009
    Posts:
    7
    Likes Received:
    0
I wonder why it's always just after I upgrade that Intel releases a bunch of lower-nm CPUs :(
I think the next time I upgrade I'm gonna get a 22nm CPU :)
     
  10. TSR2

    TSR2 New Member

    Joined:
    19 Aug 2009
    Posts:
    160
    Likes Received:
    4
I don't really see the point of the integrated GPU; it's just another non-upgradable feature that adds cost. Although, fortunately, you say Intel has made it possible to turn it off, so it won't use power.
     
  11. aussiebear

    aussiebear New Member

    Joined:
    13 Nov 2008
    Posts:
    36
    Likes Received:
    8
Because the future direction of computing is a heterogeneous processor that combines various other processors in one package. There's a reason why software frameworks like DirectCompute and OpenCL exist. (OpenCL lets you grab whatever processors you have, CPU, GPU, etc., and use them in a unified manner. In later versions it will interoperate with OpenGL, so you'll be able to dynamically switch the GPU from graphics to physics or other GPU roles.)

Clarkdale's IGP is basically an evolutionary improvement over the X4500HD.
(Increased performance and some HD playback features... still no match for Nvidia's or ATI's IGPs in gaming performance.)

The next one, Sandy Bridge (2011?), also uses the same IGP technology, improved again.
(Some say this is when AMD will release Fusion... It's a big question mark whether they can achieve the rumoured 2011 date.)

The one after, Haswell (2012?), uses technology from their Larrabee project.
(Larrabee technology is, in layman's terms, just lots of highly vectorised x86 processors used for GPU and GPGPU roles. It won't mean much to the end user; it's more interesting for programmers, though, due to its flexibility.)

What makes Intel's eventual approach (2012) interesting is that their design will be completely x86. Not sure about AMD's approach, as they've revealed very little.
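The "unified manner" mentioned above can be sketched in miniature. This is a toy, plain-Python illustration of the OpenCL idea (one interface, many device types), not the real API; every class and function name here is invented:

```python
# Toy model of heterogeneous compute: enumerate whatever devices the
# platform exposes and run the same "kernel" on any of them through
# one common interface. (Real OpenCL does this via clGetDeviceIDs and
# compiled kernels; this is only an illustration of the concept.)

class Device:
    def __init__(self, name, kind):
        self.name = name
        self.kind = kind  # "CPU" or "GPU"

    def run(self, kernel, data):
        # A real runtime would compile the kernel for this device;
        # here every device just applies the function element-wise.
        return [kernel(x) for x in data]

def get_devices():
    # Stand-in for platform/device enumeration: pretend the system
    # exposes one CPU and one integrated GPU.
    return [Device("x86 quad-core", "CPU"), Device("integrated IGP", "GPU")]

square = lambda x: x * x  # the "kernel"

for dev in get_devices():
    # Same kernel, same call, regardless of device type.
    print(dev.name, dev.run(square, [1, 2, 3]))
```

The point of the frameworks is exactly this: the calling code doesn't care whether the work lands on the CPU cores or the IGP.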
     
  12. Chocobollz

    Chocobollz New Member

    Joined:
    25 Dec 2008
    Posts:
    122
    Likes Received:
    0
Though the cost would be so negligible you'd think Intel had given it to you for free! (I think :p)

And then, a few days after you get your upgrade, you'll end up here again saying the same thing, because Intel will have just released a 16nm CPU :p
     
  13. Tim S

    Tim S Well-Known Member

    Joined:
    8 Nov 2001
    Posts:
    18,879
    Likes Received:
    76
    Richard Huddy, AMD's head of worldwide developer relations, effectively said to me last year that "it doesn't make sense" to implement x86 on a GPU since virtually everything is API-based. For once, he agreed with David Kirk (Nvidia CTO), who told me that x86 "was a waste of die space" on a GPU.

    Of course, GPU compute is something that can be API based, but it doesn't have to be if there's a suitable compiler available (like the C compiler Nvidia uses with C for CUDA) - that's an area where x86 could be useful on a GPU (hell, it's why Intel thinks that is the right direction). Time will tell, I guess, and we'll have to see what happens when Larrabee eventually makes it out of the door.
     
  14. tqiw

    tqiw New Member

    Joined:
    29 Sep 2009
    Posts:
    1
    Likes Received:
    0
    Which case is pictured in the article?
     