
Hardware Nvidia Analyst Day: Biting Back at Intel

Discussion in 'Article Discussion' started by Tim S, 14 Apr 2008.

  1. BlueOcean

    BlueOcean What's a Dremel?

    Joined:
    5 Feb 2008
    Posts:
    157
    Likes Received:
    0
    This person is the CEO. He runs Nvidia Corporation.
     
  2. xtremeownage

    xtremeownage What's a Dremel?

    Joined:
    27 Jan 2008
    Posts:
    13
    Likes Received:
    0
    Bladestorm, you are so right. It's annoying how they release cards every three months or so, each one better than the last. I stayed with my 8800GTS 640MB for a long time and overclocked it slightly, but only got marginal gains. I think games should be made for the first high-end card to carry the GTS and GT marks from Nvidia, e.g. the 8800GTS. That way everyone can enjoy the game, with a GTX optional for better frame rates. The GTX should be more powerful; better yet, they should drop the GTX after a year and replace it with dual-GPU solutions (GX2s). The reason being that consoles stay with the same graphics hardware for years, and console gamers enjoy all their games with no system spec problems. The lifespan of a console is 4-6 years, and in that time the number of games produced is massive and they're pretty good. For example, the upcoming Star Wars: The Force Unleashed looks pretty good on the Xbox 360's GPU, which is almost two years old. It's sad to see some games come out that can't be played with all the visuals turned up, even on high-end systems.

    When they say SLI is not it for games, I clearly agree, because I can't afford SLI; I would need a new motherboard. However, the 9800GX2 gave me the chance to experience a level of gaming I've always wanted: it resembles SLI and is faster. In terms of performance, I got twice the performance from one card. I was pretty impressed, and this card is my best buy from Nvidia to date.

    SLI should be replaced by single-GPU solutions because they are more efficient at processing data. Those opting for quad can treat it as optional, but buying two cards should at least mean we get a 15% discount. OMG Nvidia, your stuff is too expensive.
     
  3. metarinka

    metarinka What's a Dremel?

    Joined:
    9 Feb 2003
    Posts:
    1,844
    Likes Received:
    3
    So the war is Intel vs Nvidia? Integrated graphics vs discrete? Whatever happened to ATI/AMD? We need all the extra competition and innovation we can get. They've already mentioned sandbagging on Nvidia's part; sometimes being on top can make you lazy (like when AMD trounced Intel with the X2).

    At any rate, I take most things Intel says with a grain of salt. They have a really smart PR team and can create buzz where nothing exists, such as Centrino for laptops, which was what? An Intel CPU + chipset + wireless controller (I'm forgetting the third part). Big deal. Same with how hard Intel pushed BTX, a terrible idea.
     
  4. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    Intel will be releasing a discrete GPU in a couple of years (that's what all this Larrabee talk is)... Basically, Intel will play down the viability of current GPUs until it's convenient for it to say otherwise. Having said that, Nvidia have done the same (we won't make a CPU, we won't make a CPU... then they launch the APX 2500, which is a system-on-a-chip for mobile phones and includes CPU logic).

    Basically, they're both as bad as each other. :)
     
  5. sbenrap

    sbenrap What's a Dremel?

    Joined:
    17 Apr 2008
    Posts:
    2
    Likes Received:
    0
    I'm quoting the last part of the article:
    "Huang doesn't seem fazed by Intel's push into his territory at the moment, but he said he remembers the scars he got following the release of the GeForceFX architecture"

    Can someone please elaborate on what "scars" he's talking about?
    I don't remember any big deal between nVidia and Intel when the GeForceFX was released...

    Thanks :)
     
  6. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    There were no scars between Intel and Nvidia... it's just that the GeForceFX could quite easily have been the end of Nvidia, had it not pulled itself out of a hole with the 6-series. :)
     
  7. sbenrap

    sbenrap What's a Dremel?

    Joined:
    17 Apr 2008
    Posts:
    2
    Likes Received:
    0
    Thanks Tim S.

    I guess I wasn't too much of a gamer back then to realise the problems of the FX series (just read up on this over at Wikipedia).
    Guess I was too busy studying at Uni to notice...
     
  8. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    No problem :thumb:
     
