News Nvidia CEO reveals Fermi architecture

Discussion in 'Article Discussion' started by Tim S, 30 Sep 2009.

  1. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
  2. Isitari

    Isitari Active Member

    Joined:
    6 May 2009
    Posts:
    353
    Likes Received:
    26
    Hmm, games are definitely not going to need THIS much power. The ATI 5870 is already well overpowered for anything out now; that's why AMD are pushing multiple displays, as this is the only thing that actually pushes the hardware!

    Isitari.
     
  3. glaeken

    glaeken Freeeeeeeze! I'm a cawp!

    Joined:
    1 Jan 2005
    Posts:
    2,041
    Likes Received:
    50
    There really isn't any such thing as too much power. This much power is great for other areas besides games: offline computing, scientific visualization, etc.
     
  4. Threefiguremini

    Threefiguremini New Member

    Joined:
    13 Sep 2009
    Posts:
    521
    Likes Received:
    19
    How do people ever buy hardware for their PC? Every time I come close to buying a part, there's a new one just over the horizon!
     
  5. RotoSequence

    RotoSequence Lazy Lurker

    Joined:
    6 Jan 2004
    Posts:
    4,588
    Likes Received:
    7
    It seems strange to position yourself in the market of co-processors that do double duty as graphics cards. It's expensive and there are hundreds of millions of transistors that will inevitably be useless to the majority of their market...

    Do they really think that selling millions of co-processors to people will keep their company afloat against more dedicated solutions? It seems a bit hard to imagine that this will work out for them. Oh well, it's an interesting thing to see. :)
     
  6. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    One of the demos they've shown is speeding up early tumour detection for breast cancer... going from approx 9 hours to less than 30 mins. Nvidia's moving beyond just graphics and has been for some time... it looks like this time they've not forgotten about graphics like they seemed to with GT200.
     
  7. tejas

    tejas New Member

    Joined:
    30 Sep 2008
    Posts:
    101
    Likes Received:
    0
    This is looking good, particularly for CUDA programming close to the metal. This is definitely geared more towards compute than gaming, which is not necessarily a bad thing.

    Folding anyone???
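
    For anyone who hasn't poked at CUDA yet, "close to the metal" mostly means you write the per-thread kernel yourself and manage GPU memory by hand. Just as a rough illustration (my own sketch, nothing to do with the article or any actual Folding code - the kernel name and numbers are made up), the basic shape of a CUDA program looks like this:

    // Illustrative CUDA sketch: an element-wise kernel plus the explicit
    // host<->device copies that hands-on GPU programming revolves around.
    #include <cstdio>
    #include <cuda_runtime.h>

    __global__ void scale_add(const float *x, const float *y, float *out, float a, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;   // one thread per element
        if (i < n)
            out[i] = a * x[i] + y[i];
    }

    int main()
    {
        const int n = 1 << 20;
        const size_t bytes = n * sizeof(float);
        float *hx = new float[n], *hy = new float[n], *hout = new float[n];
        for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

        float *dx, *dy, *dout;                           // device buffers, managed explicitly
        cudaMalloc((void**)&dx, bytes); cudaMalloc((void**)&dy, bytes); cudaMalloc((void**)&dout, bytes);
        cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

        const int threads = 256, blocks = (n + threads - 1) / threads;
        scale_add<<<blocks, threads>>>(dx, dy, dout, 3.0f, n);
        cudaMemcpy(hout, dout, bytes, cudaMemcpyDeviceToHost);

        printf("out[0] = %f\n", hout[0]);                // expect 5.0
        cudaFree(dx); cudaFree(dy); cudaFree(dout);
        delete[] hx; delete[] hy; delete[] hout;
        return 0;
    }

    The interesting Fermi bits for compute are supposed to sit on top of this - proper C++ support, caches, ECC - rather than changing the basic model.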
     
  8. Diosjenin

    Diosjenin Thinker, Tweaker, Et Cetera

    Joined:
    14 Jul 2008
    Posts:
    777
    Likes Received:
    54
    Anand put up a relatively in-depth look at the new architecture earlier today, including why they're emphasizing compute. Interesting stuff.
     
  9. Saivert

    Saivert New Member

    Joined:
    26 Mar 2005
    Posts:
    390
    Likes Received:
    1
    lol, it's been a long time since NVIDIA was purely a graphics company. They are into CPUs (ARM), GPUs, CUDA and all that now.
    This way they have a foot in more markets, and that will help them stay afloat.

    ATI can ride the cheap graphics bandwagon as long as they want, of course (we like cheap graphics).
     
  10. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,575
    Likes Received:
    189
    The sad thing is that they aren't focusing on graphics, which is pretty much the focus of the GPU industry.
     
  11. l3v1ck

    l3v1ck Fueling the world, one oil well at a time.

    Joined:
    23 Apr 2009
    Posts:
    12,945
    Likes Received:
    17
    Ever since the 3Dfx Voodoo, GPU development has always been about making games run faster with better graphics.
    Reading this is the first time I've felt that's no longer true.
    I think Nvidia can see future profits coming from large business rather than PC gaming.
     
  12. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,575
    Likes Received:
    189
    Makes sense, although I want GPUs to be like the 6800GT ---> 7800GTX. Or the 7800GTX ---> 8800GTX.

    I mean, the evolution was incredible. Now it's still a bit disappointing. The 8800GTS 640/320MB was almost 2x the performance of the previous generation, and remember, that was a mid-range card.
     
  13. DiegoAAC

    DiegoAAC New Member

    Joined:
    27 Nov 2007
    Posts:
    20
    Likes Received:
    0
    I'm guessing that Fermi will be 10-20% faster than Evergreen, but nVidia will make ~30% less money per GPU than AMD (2x10^9 vs. 3x10^9 transistors at 40 nm TSMC). I'm also guessing that the capability to execute code written in C++ comes through LLVM.
     
  14. Itbay

    Itbay New Member

    Joined:
    8 Aug 2009
    Posts:
    12
    Likes Received:
    0
    Confused...

    Whenever you decide to buy a device, the next day you'll find another one better than it... just confusing. :confused:
     
  15. thehippoz

    thehippoz New Member

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    Huang said he's going to start offering suki suki with each Nvidia card.. this reminds me so much of when the G80 was released - everyone was like holy cow, waiting for ATI's response.

    ATI and that guy at the Inquirer were hyping up ATI's phantom card for the longest time.. it released, and everyone who was waiting, like myself, was like wtf - it was just a bunch of chicken ****.

    I'm not holding my breath.. real-time ray tracing.. smoke up my ass.
     
  16. Adnoctum

    Adnoctum Kill_All_Humans

    Joined:
    27 Apr 2008
    Posts:
    486
    Likes Received:
    31
    I always take Charlie's nVidia predictions with a grain of salt, but I'm starting to think he may be right about a Q1 2010 launch for "GT300" this time.
    The rumour floating around is that GT300 samples are already out with developers and so on. To quote Johnny Rotten: "Horses, Horses, Horses**t!"
    Do we all seriously think that Jen-Hsun "Whoop-Ass" Huang would present Fermi to the public and NOT show it in action? Especially when he is attempting to dull the Evergreen launch?
    And no, I don't count an image of it supposedly ray-tracing a Bugatti (are we to take nVidia's word on this?), or him holding up a black card that does nothing.
    The problem I have at the moment is that I don't trust nVidia even when they may be telling me the truth.
    Too much FUD, not enough action.
     
  17. Adnoctum

    Adnoctum Kill_All_Humans

    Joined:
    27 Apr 2008
    Posts:
    486
    Likes Received:
    31
  18. mclean007

    mclean007 Officious Bystander

    Joined:
    22 May 2003
    Posts:
    2,035
    Likes Received:
    15
    I don't think you can put much stock in those figures. First, a very large portion of the unit cost of a GPU, especially in its early production phases, must relate to amortisation of research and development costs.

    Secondly, I would expect the cost to produce a 3bn transistor chip will be significantly more than 50% more than the cost to produce a 2bn transistor chip - if yields were perfect, that might be the case, but remember that even one dud transistor can spoil a chip entirely, and the likelihood of a 3bn transistor chip having at least one dud is substantially greater than for a 2bn transistor chip.
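
    To put rough numbers on that (a back-of-envelope sketch using the textbook assumption that defects land independently at a fixed rate per unit of silicon, which is my simplification, not something stated in the article): if a 2bn-transistor die yields Y = exp(-2D) for some defect factor D, then a 3bn-transistor die on the same process yields exp(-3D) = Y^1.5. So a hypothetical 60% yield on the smaller chip becomes roughly 0.6^1.5 ≈ 46% on the bigger one, before any die-harvesting, which pushes the cost per good chip up by more than the ~50% difference in transistor count alone.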

    Finally, even if your assumptions on cost of production were right, that is simply cost of production. It has to be assumed that nVidia will charge a premium for its GPU if it is faster than AMD's offering, so revenue per unit will be higher. This higher price would be expected to have a knock-on effect on volumes.

    There are too many unknowns in this for you or me to make any kind of estimate as to which architecture will be more profitable.
     
  19. tejas

    tejas New Member

    Joined:
    30 Sep 2008
    Posts:
    101
    Likes Received:
    0
    Screw crap x86 CPU garbage from AMD and Intel.

    Fermi and Cypress are beautiful works of art and mark the start of the GPU age. Kudos to ATI and Nvidia for keeping well ahead of Moore's Law and giving developers and programmers new toys to play with!
     
  20. general22

    general22 New Member

    Joined:
    26 Dec 2008
    Posts:
    190
    Likes Received:
    1
    Well, they seem to have slightly more than doubled the compute power, so for games it should end up reasonably faster than the 5870. Too bad it's still a little while off. Hopefully NVIDIA aren't being completely stupid and abandoning 3D graphics performance, since that is basically 99% of their revenue.

    I also read somewhere that the demo Fermi card shown was just a mock-up, so I'm guessing a release will not happen this year.
     