
News Rumour: Nvidia GT300 architecture revealed

Discussion in 'Article Discussion' started by Sifter3000, 23 Apr 2009.

  1. Sifter3000

    Sifter3000 I used to be somebody

    Joined:
    11 Jul 2006
    Posts:
    1,766
    Likes Received:
    26
  2. wuyanxu

    wuyanxu still wants Homeworld 3

    Joined:
    15 Aug 2007
    Posts:
    10,562
    Likes Received:
    225
Interesting, but MIMD is a lot more complex to design, so unless they get it perfect, they'll end up with another FX series on their hands.
     
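The SIMD-vs-MIMD trade-off above can be sketched with a toy model (my own illustration, not anything from the article or Nvidia's actual design): in a SIMD group, lanes run in lockstep, so a divergent branch makes the whole group pay for both paths, while MIMD cores each execute only the path they actually take.

```python
# Toy cost model: cycles to execute one divergent branch across 4 lanes/cores.
# COST_A and COST_B are made-up cycle counts for the two branch paths.
COST_A, COST_B = 5, 2

def simd_group_cycles(lane_takes_a):
    """Lockstep (SIMD) execution: if any lane diverges, the group
    serializes both paths (a simplified predication model)."""
    paths = set(lane_takes_a)
    cycles = 0
    if True in paths:
        cycles += COST_A
    if False in paths:
        cycles += COST_B
    return cycles

def mimd_group_cycles(lane_takes_a):
    """Independent (MIMD) cores: the group finishes when the slowest
    core finishes, but no core runs a path it didn't take."""
    return max(COST_A if a else COST_B for a in lane_takes_a)

lanes = [True, False, True, False]   # a divergent branch across 4 lanes
print(simd_group_cycles(lanes))      # 7: both paths serialized
print(mimd_group_cycles(lanes))      # 5: just the slowest path
```

On branchy code the MIMD model wins, which is the appeal; the catch, as the post says, is that independent cores need far more control logic per core, which is where the design risk lives.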
  3. Flibblebot

    Flibblebot Smile with me

    Joined:
    19 Apr 2005
    Posts:
    4,619
    Likes Received:
    140
Absolutely: parallelism is very hard to code for, especially if you come from a non-parallel background. Unless nVidia have gone out and hired a bunch of coders with real parallel-programming experience, it may take some time for the drivers to become usable to any degree.

    Also, how is the parallelism going to be handled by DirectX?


    (Forum link missing in article, btw)
     
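A small example of why parallel code is hard in the way the post describes (my own sketch, not from the article): even something as simple as a running sum has to be restructured before SIMD lanes or threads can share the work, because the obvious loop makes every step depend on the previous one.

```python
def scan_sequential(xs):
    """Inclusive prefix sum, the obvious way: each iteration
    depends on the one before it, so nothing can run in parallel."""
    out, total = [], 0
    for x in xs:
        total += x
        out.append(total)
    return out

def scan_parallel_style(xs):
    """Hillis-Steele scan: log2(n) passes, and within each pass
    every element can be computed independently (one lane/thread each)."""
    data = list(xs)
    step = 1
    while step < len(data):
        data = [data[i] + (data[i - step] if i >= step else 0)
                for i in range(len(data))]
        step *= 2
    return data

print(scan_sequential([1, 2, 3, 4]))      # [1, 3, 6, 10]
print(scan_parallel_style([1, 2, 3, 4]))  # [1, 3, 6, 10]
```

Same answer, completely different algorithm shape, and the parallel version does more total work to buy independence between elements; multiplying that rethink across an entire driver stack is why it takes time.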
  4. Turbotab

    Turbotab I don't touch type, I tard type

    Joined:
    4 Feb 2009
    Posts:
    1,217
    Likes Received:
    59
It sounds exciting. Big and expensive, but exciting. I only hope that game developers don't waste the potential of these cards by watering down their graphics engines to suit the consoles.
     
  5. V3ctor

    V3ctor Tech addict...

    Joined:
    10 Dec 2008
    Posts:
    583
    Likes Received:
    2
Hope nVidia nails it this time... Don't get me wrong, I have a 4870, but nVidia can't just live on renamings and badly chosen architectures (GT200 is good, but too big)...

If nVidia fails with this one, they'll be in trouble... :s
     
  6. Goty

    Goty New Member

    Joined:
    13 Dec 2005
    Posts:
    411
    Likes Received:
    4
I've already got a CPU; I don't need another one!
     
  7. Flibblebot

    Flibblebot Smile with me

    Joined:
    19 Apr 2005
    Posts:
    4,619
    Likes Received:
    140
    But if the GPU is capable of managing "the world", including physics and anything else that affects that world, then the CPU could be freed for other tasks such as more realistic AI.
     
  8. SuperNova

    SuperNova New Member

    Joined:
    13 Nov 2007
    Posts:
    19
    Likes Received:
    0
Interesting approach, but I wonder if nVidia isn't focusing a bit too much on Larrabee... It will take some time for everything to be programmed to use all this power (just look at when dual-core CPUs came out). To focus on this now might be a bit optimistic, and could take focus from other, say, purely game-oriented parts. They usually work on new architectures in parallel, so I wonder how recently they changed their plans to match Larrabee (if they did).
I bet AMD is focusing a lot on that shared frame buffer for their GPUs. If AMD solves that and makes a great, cheap gaming card, they will steal market share and gain more money, making it easier to invest in future architectures once the software is closer to the market.

But it will be the software (and its development) that decides which architecture wins, and unfortunately that's often a slow process. If AMD, nVidia and Intel all had MIMD as a main focus there wouldn't be much of a problem, though, because it would be equal for everyone and would speed up development.
     
  9. Goty

    Goty New Member

    Joined:
    13 Dec 2005
    Posts:
    411
    Likes Received:
    4
    More extraneous processing means less power for raw graphics, though.
     
  10. Redbeaver

    Redbeaver The Other Red Meat

    Joined:
    15 Feb 2006
    Posts:
    2,054
    Likes Received:
    34
Funny, the way I look at it, ATI... err... AMD is the one in trouble if they don't come up with something groundbreaking. I'm not talking about the "gamers" community, but PC users overall... they're all on nVidia, thanks to the marketing budget.

Since the Radeon 9800 family I haven't seen a single ATI pr... er... AMD product that can take a clear shot at nVidia's lineup.

From my perspective, nVidia can afford to tinker with this technology, and Larrabee isn't even a threat. So here's to their best success in (finally) getting some fresh stuff out of the oven... I'm bored.
     
  11. thehippoz

    thehippoz New Member

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
Interesting read on that blog.. either way nVidia is going to be in trouble.. I know if I had the choice to put everything under one heatsink, that's the way I'd go. I'm afraid nVidia is gonna get owned in the long run..

But I am glad to see this changed the R&D mindset from milking it to "we gotta get it in gear".. might see some good stuff next year.
     
  12. sheninat0r

    sheninat0r What's a Dremel?

    Joined:
    28 May 2007
    Posts:
    696
    Likes Received:
    7
    The HD 4850 and 4870 are definitely very competitive with nVidia's stuff.
     
  13. j_jay4

    j_jay4 Member

    Joined:
    23 Apr 2009
    Posts:
    515
    Likes Received:
    14
I'm pretty sure the 4870 X2 held the performance crown for a while, too.
     