
Hardware IDF Day 1 - Sean Maloney demos Larrabee

Discussion in 'Article Discussion' started by Claave, 23 Sep 2009.

  1. Claave

    Claave You Rebel scum

    Joined:
    29 Nov 2008
    Posts:
    691
    Likes Received:
    12
  2. eek

    eek CAMRA ***.

    Joined:
    23 Jan 2002
    Posts:
    1,600
    Likes Received:
    14
    Hmm, doesn't look like it's shaping up to be a card for the enthusiasts.
    Once combined CPU/Larrabee chips arrive, they'll hopefully spell the end for the completely useless graphics on netbooks - after all, Intel's pricing policy makes Ion pretty prohibitive!

     
  3. thehippoz

    thehippoz What's a Dremel?

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    come to think of it.. larrabee sounds like something my dad would run :D

    This reminds me a lot of the Volari line of cards that came out a few years back.. they were so weak, what was the point?
     
  4. mrb_no1

    mrb_no1 Pie Eater

    Joined:
    15 Sep 2007
    Posts:
    394
    Likes Received:
    1
    Six cores, hmmm, I still don't get full use out of my four! But then I'm just a lonely gamer - what might professionals use to get proper use out of them?!

    I suppose Intel can't win every time, that demo must have really sucked!

    peace

    f.
     
  5. Dave Lister

    Dave Lister Minimodder

    Joined:
    1 Sep 2009
    Posts:
    880
    Likes Received:
    12
    Does anyone know if there are any videos of that demo online? Or of the green Nvidia car? I'd be interested to see how real-time raytracing looks.
     
  6. Emon

    Emon What's a Dremel?

    Joined:
    14 Jun 2004
    Posts:
    680
    Likes Received:
    0
    "I still don't get Intel's obsession with ray tracing"

    Because ray tracing or similar techniques are the "right way" to do things. Raster graphics are a huge collection of hacks. To my knowledge, any decent solution for radiosity, reflection and refraction with raster is either a huge hack or starts approaching tracing techniques. You get all that for free with ray tracing.

    Also, ray tracing requires no special-purpose hardware, and it's extremely easy to parallelize. Intel are the guys that want to make 80-core general-purpose CPUs, remember? Massively parallel, general-purpose hardware is within Intel's grasp (5-10 years?), so it makes sense.
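    To make the "easy to parallelize" point concrete, here's a minimal sketch (my own illustration, nothing to do with Intel's or Larrabee's actual code): every pixel's primary ray is independent of every other pixel's, so the outer loop can be spread across cores with something as simple as an OpenMP pragma. The single-sphere scene and image size are made-up values.

```cpp
// Minimal ray tracing sketch: each pixel traces its own ray, so the work
// splits trivially across cores. Compile with -fopenmp to parallelise.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec { double x, y, z; };
static Vec sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Ray/sphere intersection: distance along the ray to the hit, or -1 on a miss.
static double hitSphere(Vec origin, Vec dir, Vec centre, double radius) {
    Vec oc = sub(origin, centre);
    double a = dot(dir, dir);
    double b = 2.0 * dot(oc, dir);
    double c = dot(oc, oc) - radius * radius;
    double disc = b * b - 4.0 * a * c;
    return disc < 0.0 ? -1.0 : (-b - std::sqrt(disc)) / (2.0 * a);
}

int main() {
    const int W = 640, H = 360;            // made-up image size
    std::vector<double> image(W * H, 0.0);
    const Vec sphere{0.0, 0.0, -3.0};      // one sphere in front of the camera

    // Every pixel's primary ray is independent, so this loop parallelises
    // with no locking at all - the core of the "embarrassingly parallel" claim.
    #pragma omp parallel for
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            Vec dir{(x - W / 2.0) / H, (y - H / 2.0) / H, -1.0};
            double t = hitSphere({0.0, 0.0, 0.0}, dir, sphere, 1.0);
            image[y * W + x] = (t > 0.0) ? 1.0 : 0.0;  // white where the sphere is hit
        }
    }
    std::printf("centre pixel: %.1f\n", image[(H / 2) * W + W / 2]);  // prints 1.0
    return 0;
}
```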
     
  7. GigaMan

    GigaMan GIGABYTE UK

    Joined:
    30 Oct 2008
    Posts:
    25
    Likes Received:
    1
    Oh boy... poor Intel
    Very disappointing indeed :-(
    I was hoping Intel would WOW everyone.

    Maybe, just maybe things will change right at the end - hehe
     
  8. azrael-

    azrael- I'm special...

    Joined:
    18 May 2008
    Posts:
    3,852
    Likes Received:
    124
    There's a video of the demo here: http://news.cnet.com/8301-13924_3-10359276-64.html

    For what it's worth, I'm somewhat impressed, considering it's realtime raytracing, which is a fundamentally different (and computationally intensive) approach to 3D rendering from the rasterization we're used to. Have to admit that I'd like to see how Larrabee would perform using that approach.
     
  9. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    It might be "the right way" to do things, but please tell me a game shipping today which uses ray tracing. For a product shipping in the middle of 2010, ray tracing is a pointless demo... in fact I'd go so far as say it's "the wrong way" to show Larrabee off. Please show me, and other fellow gamers, how it plays the games I play today.

    Secondly, when it runs as slowly as it did in the demo, it does nothing but prove that a chip supposedly designed for ray tracing can't even render a basic scene (that looks dated) in "the right way"... that smells of "the wrong way" to me again. Raster graphics may be a huge collection of hacks, but it is what games use today - the benefit is that performance is uncompromised, and that's why developers use it. Performance is hugely compromised in a completely ray traced environment, as proven by Intel's demo - the water and the ripples look much more dated. Not even film animators use ray tracing exclusively, and they measure performance in frames per day, not frames per second.

    I'm not against ray tracing, but I don't see the obsession with ray tracing every part of a scene - it's hugely expensive and there is, frankly, no benefit. There are benefits to ray tracing curved surfaces, transparent surfaces and much more, but compute power needs to increase by a factor of thousands before ray tracing is viable in a real-time environment.
     
  10. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    Intel has been demoing real-time ray tracing for years at higher and higher resolutions, previously running on the CPU alone. The fact that a 6-core, 12-thread Gulftown, PLUS Larrabee, ran the demo worse than an 8-core, 16-thread box with ATI/Nvidia graphics did last year (60fps at 720p, fwiw) is very, very disappointing. Of course, Intel won't disclose the resolution it was running at - I'd guess 1080p - but it wasn't running at anything close to 30fps, and that was with a completely static camera, unlike previous demos.
     
  11. Cupboard

    Cupboard I'm not a modder.

    Joined:
    30 Jan 2007
    Posts:
    2,148
    Likes Received:
    30
    So are you saying that Larrabee is running slower than the 8-core, 16-thread processor of last year, or was the ray tracing running on the graphics card? Either way, that really sucks.
     
  12. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    I believe the demo was running at a higher resolution, though I don't know exactly what (Intel won't say) - but it wasn't running at 30fps, never mind the 60fps we saw last year. I'm guessing it was 1080p based on the size of the screen relative to other stuff around it, which is effectively 2.25x the resolution of 720p (0.92MP vs 2.07MP)... the core count (on the CPU) was reduced by 25 per cent, but they've added Larrabee on top, which has many more 'cores' - it would have been an impressive ray tracing demo if it had hit 60fps at 1080p with early Larrabee silicon, but it didn't. The fact that it didn't, with a static camera, just compounded the disappointment.

    As a result, you've not only got the let-down of not finding out how Larrabee is going to handle Direct3D applications (the big question mark for most of us, I'm sure... even if it's just smooth vs. not smooth), but also the compounded disappointment of the demo looking dated, using a static camera angle and not running particularly well (which isn't a great proof of concept for real-time ray tracing being 'just around the corner'). The water seriously doesn't look that much better than the water in Far Cry - a game released on DX9 in 2004. :)
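    For anyone wanting to check the resolution figures quoted above (remember the 1080p figure is only a guess), the arithmetic is straightforward:

```cpp
// Quick check of the 720p vs 1080p pixel counts quoted in the post above.
#include <cstdio>

int main() {
    const double pixels720p  = 1280.0 * 720.0;   //   921,600 pixels (~0.92 MP)
    const double pixels1080p = 1920.0 * 1080.0;  // 2,073,600 pixels (~2.07 MP)
    std::printf("1080p / 720p = %.2fx\n", pixels1080p / pixels720p);  // prints 2.25x
    return 0;
}
```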
     
  13. Dave Lister

    Dave Lister Minimodder

    Joined:
    1 Sep 2009
    Posts:
    880
    Likes Received:
    12
    Thanks for the link!
     
  14. [PUNK] crompers

    [PUNK] crompers Dremedial

    Joined:
    20 May 2008
    Posts:
    2,909
    Likes Received:
    50
    flop alert!
     
  15. mute1

    mute1 What's a Dremel?

    Joined:
    16 Sep 2009
    Posts:
    124
    Likes Received:
    2
    To me, your comments on ray-tracing are spot on, Tim. It may in a sense become more efficient than rasterisation at higher resolutions, but when will we be playing games at such resolutions and have the hardware to do so? No time soon, for certain. When ray-tracing is introduced for games, it will be in tandem with rasterisation without doubt.
    I really don't see how Larrabee can succeed. With respect to games, I know that Tim Sweeney has said he sees games being rendered in software in the future, but even then, Larrabee would surely not be the hardware to do it. If he were to commit himself to designing a games engine along these lines at this stage, I think it would be a waste. I don't see why he would anyway, because surely it couldn't be used for consoles (this generation or the next), even if there are reports that the PlayStation 4 will use Larrabee for its graphics.
    With respect to general computational work, surely GPUs are more efficient in terms of speed and power anyway, whether or not they are harder to code for.
    No matter what it is used for, Larrabee will be too big and hot to be worthwhile, or so it would seem.
    (Hope you're feeling better too).
     
  16. chicorasia

    chicorasia What's a Dremel?

    Joined:
    8 Jan 2008
    Posts:
    84
    Likes Received:
    0
    Intel appears to be treading the path of disruptive innovation with Larrabee - a path where lower performance is expected in the early stages of the technology, but where - hopefully - incremental development will allow the new tech to catch up with the average needs of the GPU market.

    However, if the initial performance isn't even capable of appealing to the lower end of the market, and no developers adopt ray tracing (and I agree with Tim - it's not about the "right" way to do things, but rather the appropriate way, the way that gets the job done), I cannot see much of a future for Larrabee.

    Intel is certainly not dumb. Larrabee is a huge gamble that may pay off in the long run. But "underwhelming" is certainly not the word we were all expecting after a year and a half of hype around this new architecture.
     
  17. DeXtmL

    DeXtmL What's a Dremel?

    Joined:
    7 Sep 2007
    Posts:
    50
    Likes Received:
    0
    I see many people discussing ray-tracing.
    Yes, it is indeed one interesting aspect of Larrabee, but not the whole story.
    It's just one offshoot of Larrabee's software rendering revolution.

    Larrabee will be the first graphics card built on the x86 architecture. The big difference between Larrabee and traditional GPUs is programmability.
    Larrabee lets us code the whole render pipeline almost entirely in software, which could dramatically increase the complexity of the output images while maintaining reasonable performance.
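    As a rough sketch of what "the pipeline is just software" could mean (hypothetical, nothing to do with Larrabee's actual renderer): the per-pixel stage becomes an ordinary function you can swap out, rather than a fixed-function hardware block.

```cpp
// Hypothetical illustration of a software-programmable pixel stage.
#include <cstdio>
#include <functional>

struct Pixel { float r, g, b; };

// Any callable taking screen coordinates can act as the per-pixel stage.
using PixelStage = std::function<Pixel(int, int)>;

// Run one stage over a row of pixels; a real renderer would write into a
// framebuffer, here we just print a crude ASCII image.
static void renderRow(int y, int width, const PixelStage& stage) {
    for (int x = 0; x < width; ++x) {
        Pixel p = stage(x, y);
        std::printf("%c", p.r > 0.5f ? '#' : '.');
    }
    std::printf("\n");
}

int main() {
    // Swapping the shading model is a one-line change when it's all software:
    PixelStage checker = [](int x, int y) -> Pixel {
        float v = ((x / 4 + y / 4) % 2) ? 1.0f : 0.0f;
        return {v, v, v};
    };
    for (int y = 0; y < 8; ++y) {
        renderRow(y, 32, checker);
    }
    return 0;
}
```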

    Sadly, Intel is doing its best to keep all the technical details under wraps,
    so its actual performance in next-gen games is unknown.
    As for ray-tracing, surely it will be slower than rasterization if only local, direct lighting is taken into account.


    Dex
     
  18. Fod

    Fod what is the cheesecake?

    Joined:
    26 Aug 2004
    Posts:
    5,802
    Likes Received:
    133
    That presentation was a load of maloney! (rimshot)
     
  19. dec

    dec [blank space]

    Joined:
    10 Jan 2009
    Posts:
    323
    Likes Received:
    12
    Wow, I'm surprised. I was expecting Larrabee to be demoing something like the demo ASUS had for the Mars. It seems like Intel is putting all their eggs into the ray tracing basket - kinda risky move.
     
  20. Nikumba

    Nikumba Minimodder

    Joined:
    29 Aug 2001
    Posts:
    658
    Likes Received:
    11
    OK, so the demo was a bit poor, but it is nice to see they are trying something new. They may have accepted they cannot compete easily in the D3D arena with Nvidia and AMD. Those two aren't really trying anything new either - other than making the chips faster and giving them more memory, the cards are still essentially doing the same thing they have done for years.

    Raytracing will happen at some point, and I can see that if Intel crack it and release good tools for it, then Nvidia/AMD might be left stuck. However, since Larrabee is a parallel processing card, the theory would be that once Intel have it all working, they should be able to release an API - like CUDA, or whatever the AMD version is called - to run ray tracing.
     