News Larrabee will feature rasterisation units

Discussion in 'Article Discussion' started by CardJoe, 24 Apr 2008.

  1. CardJoe

    CardJoe Freelance Journalist

    Joined:
    3 Apr 2007
    Posts:
    11,343
    Likes Received:
    292
  2. zero0ne

    zero0ne Member

    Joined:
    19 Jul 2004
    Posts:
    117
    Likes Received:
    0
    who cares? I'm confident that Nvidia will crush their hardware!

    They will then begin to produce GPUs that can also act as your computer's main CPU via an OS written in CUDA!!!!

    haha
     
  3. bowman

    bowman Member

    Joined:
    7 Apr 2008
    Posts:
    363
    Likes Received:
    10
    Haha. NvidiOS. I like that.

    Anyways, more competition is good. Nvidia should buy VIA for the x86 license. That way we'll have three CPU/GPU manufacturers.
     
  4. [USRF]Obiwan

    [USRF]Obiwan New Member

    Joined:
    9 Apr 2003
    Posts:
    1,721
    Likes Received:
    5
    If Intel manage to make a GPU that is 2x faster than the current top-ranked GPU, that would be incredible. Anything less than that and it's just another wannabe GPU maker...
     
  5. chicorasia

    chicorasia New Member

    Joined:
    8 Jan 2008
    Posts:
    84
    Likes Received:
    0
    They should start by making integrated graphics that don't suck as badly as their current offerings do. And they must do it NOW.

    It's all about branding and developing a solid user/fan base.

    I'm sorry, Tim - the promised tenfold increase is just not enough. Unless you mean to compete with GeForce 7-series cards, two years from now!
     
  6. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    10-fold was for integrated graphics :)

    I think Larrabee will be quite a bit faster than the current generation hardware from Nvidia/ATI (at least theoretically - there's more to it than that though as you know).
     
  7. Nikumba

    Nikumba Member

    Joined:
    29 Aug 2001
    Posts:
    647
    Likes Received:
    11
    To be honest, Intel's integrated gfx has mostly been for laptops doing office/internet work, so why should they make it more powerful? If you want to play games, buy a more expensive laptop with a better gfx card.

    Of course, it would be really good if the gfx card from Intel had the ability to upgrade the GPU/memory like we do with a normal PC.
     
  8. MrMonroe

    MrMonroe New Member

    Joined:
    27 Dec 2007
    Posts:
    195
    Likes Received:
    0
    I just can't see how this is going to pan out for Intel. They have no experience in this industry and they are going to try breaking into it by using a combination of untested tech and tech they aren't any good at. And what's to stop nVidia or ATI from pushing in a little ray-tracing for certain effects?
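    What "a little ray-tracing for certain effects" might look like in practice is a hybrid renderer: rasterise primary visibility as usual, then cast a per-pixel shadow ray against scene geometry. A minimal sketch of the shadow-ray part, assuming a single sphere occluder and a point light (all names and values here are illustrative, not anything from Intel, Nvidia or ATI):

```python
import math

def shadow_ray_t(origin, direction, center, radius):
    """Smallest t > 0 where origin + t*direction hits the sphere, or None."""
    # Substituting the ray into |p - center|^2 = r^2 gives a quadratic in t.
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 1e-6 else None  # only count hits in front of the origin

def in_shadow(point, light_pos, center, radius):
    """True if the sphere blocks the segment from the shaded point to the light."""
    direction = tuple(light_pos[i] - point[i] for i in range(3))
    t = shadow_ray_t(point, direction, center, radius)
    # With an unnormalised direction, t in (0, 1) means the hit lies
    # between the shaded point and the light, i.e. the point is shadowed.
    return t is not None and t < 1.0
```

    The rasteriser would still produce the pixel; the ray test only decides whether that pixel receives direct light, which is roughly how a bolt-on ray-traced effect could coexist with a conventional pipeline.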
     
  9. TreeDude

    TreeDude New Member

    Joined:
    12 May 2007
    Posts:
    270
    Likes Received:
    0
    I have no doubt Intel will have awesome hardware, specs-wise. I think we all know what is going to make or break this new line: drivers.

    Personally I think both Nvidia and ATI have terrible driver quality. ATI has much more structure to their releases, but CCC is bloated and unnecessary. Nvidia is sporadic. One driver is great, then one gets released because of a new game and it kills the performance on your other games. It is hit and miss with their drivers.

    If Intel puts out a decent set of drivers I will be very surprised. I say it'll take them at least a year before their drivers come even close to ATI's or Nvidia's.
     
  10. D3s3rt_F0x

    D3s3rt_F0x New Member

    Joined:
    28 Oct 2004
    Posts:
    719
    Likes Received:
    6
    It's all trash talk from Intel, and to those saying it'll be a great product: wait and see. It's the first time Intel have dipped their toes in this area, and for all you know it could be a shambles, which tbh I could possibly see it being, but give them more time and you never know.

    I'm not going to take anything they say as more than a pinch of salt until I see people with the chips running the programs people actually run and benchmarking them for true figures.
     