
News Larrabee die size is massive

Discussion in 'Article Discussion' started by Guest-16, 13 Apr 2009.

  1. Guest-16

    Guest-16 Guest

    I agree, but Larrabee will simply be a GPGPU monster and Intel will rake in tons of money in the server space, where it's already strong and businesses will pay through the nose. I doubt it cares too much about the PC gaming market until it gets into the PS4 (my prediction), where it can lean on some game publishers with Sony's help.
     
  2. thehippoz

    thehippoz What's a Dremel?

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    lol it's not fanboyism.. just wait and see- well I am kinda an Intel fanboy.. but if you define fanboy as liking the best overclocking chips, then that's what I am!
     
  3. Panos

    Panos Minimodder

    Joined:
    18 Oct 2006
    Posts:
    288
    Likes Received:
    6
    Up to now Intel is all smoke and mirrors. Heh, I bet it will be a couple of generations old compared with the current market when it comes out.

    Nvidia and ATI are fine-tuning their current technologies and adding new stuff.

    Intel, meanwhile, has to design it, produce it, and put it on sale, and only then can it start trying to get on par with the competition.
    If they tried new technologies and tuning at the scale ATI and Nvidia do, they would never get the product to market.

    And if it's like the P4, with my blessings, the fanboys should downgrade their monitors now.
     
  4. thehippoz

    thehippoz What's a Dremel?

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    transistor count is through the roof though.. going to be something to contend with for sure- the driver thing only matters in that Nvidia and ATI have a huge head start in optimizing specific titles.. even if Larry releases without any optimizations..

    it should be able to handle 1080p gaming- if anything Nvidia has been sandbagging for years now.. I couldn't care less what they do nowadays because it's all sand.. maybe Larry will be the kick in the ass to get GPUs beyond what we have today and make engines like Crysis the norm
     
  5. perplekks45

    perplekks45 LIKE AN ANIMAL!

    Joined:
    9 May 2004
    Posts:
    7,552
    Likes Received:
    1,791
    Play nice, kids. Wait for the first tests and then start bashing/praising Larrabee please. There is no way anybody can really predict how it will perform.
     
  6. iwod

    iwod What's a Dremel?

    Joined:
    22 Jul 2007
    Posts:
    86
    Likes Received:
    0
    Not much of a problem. Since CPUs aren't selling well, Intel has lots of wafer space left. A huge Larrabee is going to be good for them: a way to iron out all the major software bugs as well as to test the market response.
     
  7. Krikkit

    Krikkit All glory to the hypnotoad! Super Moderator

    Joined:
    21 Jan 2003
    Posts:
    23,926
    Likes Received:
    655
    Quite.
     
  8. Turbotab

    Turbotab I don't touch type, I tard type

    Joined:
    4 Feb 2009
    Posts:
    1,217
    Likes Received:
    59
    What a bunch of highly intelligent, well-researched posts! Thank God for the Bit-Tech regulars and Bindi.
    To the Hippo: if you look at the increase in transistors from Nvidia's G80 to the GT200 core, you will see a 100% increase. Therefore the GT300 will probably end up well in excess of 2 bn transistors, not to mention a significantly higher number of (shader) cores; there's no way Larrabee will outperform it in terms of pure computing power. As Bindi stated, Larrabee's game-winner is its native x86 cores, perfect for running all those GPGPU and C / C++ etc. business apps that the cloud computing movement will hoover up.
    If the PS4 does go with Intel, I hope it does not result in a hot, noisy beast like the Xbox 360.
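    The doubling argument above is easy to check on the back of an envelope. The G80 and GT200 transistor counts below are the widely reported public figures (~681 million and ~1.4 billion); the GT300 number is pure extrapolation from the trend, not a leak:

    ```python
    # Extrapolate the per-generation transistor growth, as in the post above.
    g80 = 681e6            # G80 (8800 GTX), ~681 million transistors
    gt200 = 1.4e9          # GT200 (GTX 280), ~1.4 billion transistors
    growth = gt200 / g80   # roughly 2x per generation
    gt300_guess = gt200 * growth  # ~2.8 billion if the trend simply holds
    print(round(gt300_guess / 1e9, 1))  # ~2.9
    ```

    That lands comfortably "well in excess of 2 bn", as the post argues, assuming the trend holds for one more generation.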
     
  9. TreeDude

    TreeDude What's a Dremel?

    Joined:
    12 May 2007
    Posts:
    270
    Likes Received:
    0
    The different design of Larrabee is going to cause all kinds of issues for gamers out of the gate. It will take Intel some time to nail them all down: drivers will be updated, games will be patched, and it will be a mess at first.

    However, fanboy or not, we should all be excited. More competition means lower prices (at least until one of them goes under).
     
  10. thehippoz

    thehippoz What's a Dremel?

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    GT300 is not going to be anything great when you look at price/performance.. they would rather keep technology stagnant to milk money.. I'm not saying Larrabee will be the Nvidia killer.. but I do think it will have a big impact on what goes into dev from that point on, and it will hurt Nvidia's bottom line.. I'm not so concerned about ATI because they have AMD to work with

    I'm just sick of Nvidia's marketing.. they love to say how much better they are than everyone else.. when you look at what they offer compared to ATI, it's just rehashed with some clock plays.. as far as gaming goes, how can you say it can't compete with what's out! that's fanboyism at its finest, talking trash about things that haven't even been tested! I'm basing my opinions on what I see out there and what I've been following for the past couple of years since the G80.. I've owned these cards! :D I just want something better- a GT300 in excess of 2 billion transistors is yet to be seen, too
     
  11. Evildead666

    Evildead666 What's a Dremel?

    Joined:
    27 May 2004
    Posts:
    340
    Likes Received:
    4
    The idea is that Larrabee will be able to emulate DX, OpenGL and OpenCL, if not run some of them natively.
    There will be no issue for gamers, as this will either work or not.
    To be compliant, they will have to support the whole feature set, not just parts of it.
    So DX will work or it will not; there will not be specific problems with certain games.
    Optimisations, maybe, but not outright bug corrections, unless the problem appears on all graphics adapters.

    The first Larrabees will be for the business end of the stick, since they probably won't be good enough to run consumer-level games at an acceptable speed.
    GPGPU stuff will probably run like nuts on Larrabee though, right from the beginning...

    When the die shrink appears in 2010, the speeds and number of cores may or may not be sufficient for it to enter the consumer market.
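    The "emulate DX in software" idea boils down to running the graphics pipeline as ordinary per-pixel code on general-purpose cores, with no fixed-function rasteriser. A toy sketch of that (purely illustrative; the function names are made up, and this bears no relation to Intel's actual implementation) is a plain edge-function triangle rasteriser:

    ```python
    # Toy software rasteriser: the kind of per-pixel loop a many-core
    # x86 chip would run in software instead of fixed-function hardware.

    def edge(ax, ay, bx, by, px, py):
        # Signed area test: >= 0 when (px, py) lies on the left of edge a->b.
        return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

    def rasterise(tri, width, height):
        (ax, ay), (bx, by), (cx, cy) = tri
        covered = []
        for y in range(height):
            for x in range(width):
                px, py = x + 0.5, y + 0.5  # sample at the pixel centre
                w0 = edge(bx, by, cx, cy, px, py)
                w1 = edge(cx, cy, ax, ay, px, py)
                w2 = edge(ax, ay, bx, by, px, py)
                # Pixel is inside if it passes all three edge tests.
                if w0 >= 0 and w1 >= 0 and w2 >= 0:
                    covered.append((x, y))
        return covered

    pixels = rasterise(((0, 0), (8, 0), (0, 8)), 8, 8)
    print(len(pixels))  # 36 pixels covered by the half-square triangle
    ```

    In a real software pipeline the inner loop would be vectorised and the screen split into tiles across cores, which is exactly where Larrabee's wide vector units were supposed to earn their keep.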
     
  12. Evildead666

    Evildead666 What's a Dremel?

    Joined:
    27 May 2004
    Posts:
    340
    Likes Received:
    4
    Intel is also trying to pull the graphics market back to the CPU, since in X years we will have 16/32/64 x86 cores on our CPUs, and they will replace the GPU entirely.

    Making x86 a future standard for GPGPU, or just GPU, would be hard for Nvidia, and OK for AMD/ATi...
     
  13. n3mo

    n3mo What's a Dremel?

    Joined:
    15 Oct 2007
    Posts:
    184
    Likes Received:
    1
    I see it as more of an Itanium of the GPU world: a great concept, fantastic on paper, but fairly useless in reality. I don't see nVidia and AMD being afraid of it, especially given Intel's history of crappy GPUs.

    Theoretically Larrabee could grow and mature into something useful, but I don't think they have the time to do it. We're getting close to the end of what silicon-based technology can offer; multiplying cores doesn't give the expected advantages (memory access times and latency suffer greatly, not to mention heat output etc.).

    x86 is too old, slow and limited to be used for a GPU or GPGPU. While introducing high-performance double-precision computing would mean a revolution, it is not going to happen.
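    The "multiplying cores doesn't give the expected advantages" point is just Amdahl's law: if even a modest fraction of the work is effectively serial (memory stalls, shared caches, synchronisation), piling on cores gives rapidly diminishing returns. A quick sketch, with the 10% serial fraction picked purely for illustration:

    ```python
    # Amdahl's law: speedup on n cores when a fraction s of the work is serial.
    def speedup(cores, serial_fraction):
        return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

    # With 10% serial work, 64 cores deliver nowhere near 64x.
    for n in (4, 16, 64):
        print(n, round(speedup(n, 0.1), 1))
    ```

    With these numbers, 4 cores give about a 3.1x speedup but 64 cores only about 8.8x, which is the kind of scaling wall the post is gesturing at.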
     
  14. Faulk_Wulf

    Faulk_Wulf Internet Addict

    Joined:
    28 Mar 2006
    Posts:
    402
    Likes Received:
    6
    A question i always wondered but never asked.
    Learning +1. :clap:
     
  15. willyolio

    willyolio What's a Dremel?

    Joined:
    17 Feb 2007
    Posts:
    205
    Likes Received:
    11
    when you're as big as intel, you can always resort to brute-forcing your way into the market.
     
  16. Xir

    Xir Modder

    Joined:
    26 Apr 2006
    Posts:
    5,412
    Likes Received:
    133
    Haven't seen so few dies on a wafer since 200mm ;-)
     
  17. [USRF]Obiwan

    [USRF]Obiwan What's a Dremel?

    Joined:
    9 Apr 2003
    Posts:
    1,721
    Likes Received:
    5
    To me it just sounds like the "Matrox going 3D" situation again: excellent 2D imaging, but jumping on the 3D bandwagon far too late. A lot of hype, and in the end it was a total waste...
     