
News Larrabee more efficient than GPUs at rasterisation

Discussion in 'Article Discussion' started by Guest-16, 30 Mar 2009.

  1. Guest-16

    Guest-16 Guest

  2. Xtrafresh

    Xtrafresh It never hurts to help

    Joined:
    27 Dec 2007
    Posts:
    2,999
    Likes Received:
    100
    pics or it didn't happen! :D
     
  3. perplekks45

    perplekks45 LIKE AN ANIMAL!

    Joined:
    9 May 2004
    Posts:
    7,409
    Likes Received:
    1,549
    What about a counter-countersuit? :rolleyes:

    On topic:
    I just can't wait to see how Larrabee performs. Although I think it'll be designed to sell in the mainstream market rather than in the high-end segment. Still... interesting.

    //edit: Just a quick question: Who writes the articles at Venturebeat? 10-year-olds? :eyebrow: Or is it just aimed at people who have hardly any detailed knowledge of the technology behind graphics cards?
     
    Last edited: 30 Mar 2009
  4. Goty

    Goty Minimodder

    Joined:
    13 Dec 2005
    Posts:
    411
    Likes Received:
    4
    The biggest hurdle Larrabee is going to have to overcome is drivers. As anyone who's been around long enough to remember the Radeon 8500 can attest, drivers can make or break a video card, and driver support has never been Intel's forte.
     
  5. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,836
    Likes Received:
    639
    "can be", that's the most important part.

    I don't see how Intel, as a new player, can out-perform nVidia/ATI at their own game.
     
  6. bowman

    bowman Minimodder

    Joined:
    7 Apr 2008
    Posts:
    363
    Likes Received:
    10
    Apparently the low raw performance is due to the frequency of the part being at the performance/watt 'sweet spot'. So those of us who couldn't care less about energy efficiency should potentially be able to overclock it quite a bit.
     
  7. Skiddywinks

    Skiddywinks Minimodder

    Joined:
    10 Aug 2008
    Posts:
    932
    Likes Received:
    8
    It has already been said they won't be. People need to stop assuming Intel plan on usurping ATI and nVidia in one fell swoop.

    Give Intel some credit: what they have seems very interesting and has a lot of potential.
     
  8. Turbotab

    Turbotab I don't touch type, I tard type

    Joined:
    4 Feb 2009
    Posts:
    1,217
    Likes Received:
    59
    By the time Larrabee launches, we will have high-end 40nm DirectX 11 cards pushing out possibly over 4 TFLOPS in dual-GPU configurations. The biggest problem for Intel will be mainstream 40nm cards, retailing at the £100 mark, offering well over 1 TFLOPS. Hell, even ATI's current 40nm part, the RV740 series, produces nearly a TFLOP and, according to Fudzilla, is rated at only 80 watts. IIRC the TDP for Larry is rumoured to be quite high, so what market is Intel aiming for?
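    For a sense of scale, here's a rough back-of-envelope sketch in Python using only the figures quoted above (the per-watt and per-pound framing, and all the variable names, are mine, not from any source):

        # Rough efficiency figures from the numbers quoted in this post.
        rv740_tflops = 1.0        # "nearly a TFLOP"
        rv740_watts = 80.0        # Fudzilla's quoted power rating
        print(rv740_tflops / rv740_watts * 1000)   # ~12.5 GFLOPS per watt

        mainstream_tflops = 1.0   # "well over 1 TFLOPS"
        mainstream_price = 100.0  # pounds
        print(mainstream_tflops / mainstream_price * 1000)  # ~10 GFLOPS per pound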
     
  9. Furymouse

    Furymouse Like connect 4 in dagger terms

    Joined:
    4 Feb 2004
    Posts:
    621
    Likes Received:
    22
    Agreed with Wuyanxu. Maybe this is why they are in suits against both AMD and Nvidia. It "can be" the best if we get the others' money mired in lawsuits instead of R&D :D
     
  10. perplekks45

    perplekks45 LIKE AN ANIMAL!

    Joined:
    9 May 2004
    Posts:
    7,409
    Likes Received:
    1,549
    The whole point of Larrabee is flexibility and power efficiency, as mentioned before.
    I don't think Intel will fail on the latter. They might not be as efficient as ARM, as we've learned, but they're pretty good at it.
     
  11. nicae

    nicae What's a Dremel?

    Joined:
    25 Nov 2008
    Posts:
    128
    Likes Received:
    0
    We must remember that Larrabee will face an even greater challenge than PhysX. NVIDIA has ~65% of the market and still struggles to make developers adopt PhysX. How many developers will program their rasterisation for x86 Larrabee if it has 0% of the market?

    I'm sure Intel can really push Larrabee by integrating these cards just like they push their horrible IGPs, but how fast can they do that and will they get far enough among the hardcore crowd to convince major developers into going through all the development hassle? Development is already a very costly process and the financial crisis is making many studios close, suggesting they don't have spare cash to spend on such risky projects.
     
  12. perplekks45

    perplekks45 LIKE AN ANIMAL!

    Joined:
    9 May 2004
    Posts:
    7,409
    Likes Received:
    1,549
    nVidia has what?! 65% of what market?

    In 2007 it looked like this:

    [2007 graphics market share chart]

    Source: ArsTechnica

    And according to JPR's Q208 report:

    Source: i4u.com
     
  13. Coldon

    Coldon What's a Dremel?

    Joined:
    14 Oct 2006
    Posts:
    208
    Likes Received:
    10
    The discrete gfx card market... Intel has 0% in that department...

    That's if you remove all the crappy Intel IGPs that shipped with the bulk of the office machines out there. I think a GeForce 4 MX is maybe on par with the best IGP Intel has.

    The situation at the moment is quite scary with the current Intel lawsuits; it almost looks like Intel is trying to block the competition from using its technologies while still making use of theirs (SLI on X58, AMD64 extensions on C2/i7)...

    I'm holding thumbs that Intel loses those suits and is made to face the consequences of their anti-trust actions.

    On the Larrabee issue: Larrabee might be almost as efficient at rasterisation as newer GFX cards, but what will it matter? By the time Larrabee launches (assuming a delayed launch) we'll have GFX cards with hardware tessellation units and DX11.

    Never mind the insane power requirements a 32-core Larrabee card will need: estimates, assuming Intel uses a 45nm process, put it at around 300W just for the cores, and once you include memory and other subsystems you'll be looking at a 400W power draw. And from all accounts those cores will be kept pretty busy even for basic tasks.

    Personally I'm holding thumbs for Larrabee to be an epic failure, but then Intel will just strong-arm OEMs into supplying the cards to their customers, just like with their atrocious IGP chipsets that 45% of people are using today!
     
  14. nicae

    nicae What's a Dremel?

    Joined:
    25 Nov 2008
    Posts:
    128
    Likes Received:
    0
    I was speaking of discrete graphics, where Intel has no presence and S3 can be disregarded.

    Therefore, we're talking about a market of 31.4% + 18.1% = 49.5%.
    As such, NVIDIA has 31.4% / 49.5% ≈ 63.4% of that market, with the rest left to ATI. My memory of Valve's hardware surveys ain't that bad, eh? =)

    That's still approximate, as we would also need to split out NV and ATI IGPs, but it won't change my point.
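    A quick sanity check of that split, as a minimal Python sketch (the 31.4% and 18.1% figures are the ones quoted above; the variable names are just for illustration):

        # Split of the discrete market between NVIDIA and ATI,
        # derived from their shares of the overall market.
        nvidia_total = 31.4   # % of overall market
        ati_total = 18.1      # % of overall market

        discrete = nvidia_total + ati_total              # 49.5% of all shipments
        print(f"NVIDIA: {nvidia_total / discrete:.1%}")  # ~63.4%
        print(f"ATI:    {ati_total / discrete:.1%}")     # ~36.6%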
     
  15. Adnoctum

    Adnoctum Kill_All_Humans

    Joined:
    27 Apr 2008
    Posts:
    486
    Likes Received:
    31
    Just a thought I had (sorry if this has been asked elsewhere)....

    As Larrabee is based on x86 architecture and instructions, and a part of the agreement with AMD is to share x86 and developments from it (e.g. x86-64, otherwise known as AMD64), doesn't this mean that AMD could make their own Larrabee compatible part if this whole architecture takes off?

    And if it does, where does this leave nVidia, who doesn't have an x86 licence?
     
  16. Adnoctum

    Adnoctum Kill_All_Humans

    Joined:
    27 Apr 2008
    Posts:
    486
    Likes Received:
    31
    Double post, f*#@ing wireless.
     
    Last edited: 31 Mar 2009
  17. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,836
    Likes Received:
    639
    x86 is not as good as it sounds; it has dragged along a lot of rarely used legacy instructions since the days of the i386, and it's power-hungry and inefficient compared to RISC processors.

    Although they say it's going to be x86 compatible, I highly doubt that; it's probably just a subset of the x86 instruction set, just enough for ray tracing and current-generation rendering.
     
  18. nicae

    nicae What's a Dremel?

    Joined:
    25 Nov 2008
    Posts:
    128
    Likes Received:
    0
    The question is: Will they want to make x86 graphics cards? It's a very risky change and there's a lot of market inertia that should hold the architecture back from industry-wide adoption. If it were a two-man race (out of the three men NV, AMD and Intel), or if x86 was guaranteed to be better, then they would be scrambling to get it done as well. Yet, if only Intel tries the push, AMD and NV could remain comfortably earning cash on current rasterisation cards and leave the costs of change to Intel. Once it does change the industry (if it does), they can jump in. With a bit of delay, for sure, but at a tiny fraction of the cost and with almost no risk. That sounds like a more reasonable plan when you have a multibillion-dollar company at stake.

    Assuming x86 turns out to be better, yes, AMD could hop in, as could VIA, I'd guess. NVIDIA would be left out, as they kind of already are, which would justify their pursuit of x86 through VIA and internal development.
     
  19. DXR_13KE

    DXR_13KE BananaModder

    Joined:
    14 Sep 2005
    Posts:
    9,139
    Likes Received:
    382
    I feel that this graphics card will be an utter epic fail...
     
  20. gagaga

    gagaga Minimodder

    Joined:
    14 Dec 2008
    Posts:
    193
    Likes Received:
    10
    I disagree - that's nearly 10W a core. You can run a full Core 2, cache and all, at that power level. Remember, these are dumb, simple cores with in-order execution and a lot of the power-hungry bits missing, just like Atom. I'm guessing they'll be looking at around 2W a core, giving a total of around 80-90W all up. Still a lot, but it depends on what the load is like and how much useful work you're getting out of the other end.
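    In rough numbers, a minimal Python sketch (the 300W-for-32-cores figure is the estimate from earlier in the thread; the 2W-per-core and uncore overhead values are only my guesses):

        # Per-core power implied by the 300 W core-only estimate,
        # versus an Atom-like 2 W/core guess plus some uncore overhead.
        cores = 32
        print(300.0 / cores)                           # ~9.4 W per core ("nearly 10W a core")

        per_core_guess = 2.0                           # W, in-order core assumption
        uncore_guess = 20.0                            # W, memory/IO assumption
        print(cores * per_core_guess + uncore_guess)   # 84 W, in the 80-90 W ballpark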
     