
Hardware Does Nvidia have a future?

Discussion in 'Article Discussion' started by Sifter3000, 20 Aug 2009.

  1. Sifter3000

    Sifter3000 I used to be somebody

    Joined:
    11 Jul 2006
    Posts:
    1,766
    Likes Received:
    26
    In the first of a regular series of articles looking at the giants of the tech industry, we focus on Nvidia. It's had a torrid time over the past 18 months, but in the past it's always come back stronger when under pressure - so what does the future hold for Nvidia this time?

    :)
     
  2. wuyanxu

    wuyanxu still wants Homeworld 3

    Joined:
    15 Aug 2007
    Posts:
    10,579
    Likes Received:
    229
    link to article would help :)

    Currently, with ATI's DX11 cards set to release in time for Win7, it seems nVidia has lost this round.

    But the new era of ray tracing is coming, and I think nVidia will win the first round of ray tracing, just like with the 8800GTX.

    It's cycles, and currently it's the red team's advantage.
     
  3. Krikkit

    Krikkit All glory to the hypnotoad! Super Moderator

    Joined:
    21 Jan 2003
    Posts:
    23,448
    Likes Received:
    368
    An interesting article, I look forward to the next ones.

    Personally I can't wait for a Tegra-equipped smartphone. The Zune HD looks like a special bit of kit too.
     
  4. dec

    dec [blank space]

    Joined:
    10 Jan 2009
    Posts:
    315
    Likes Received:
    7
    Nvidia had better have a future - a de facto monopoly for ATI/AMD would be bad for prices, and they'd have a reason to "let off the gas".
     
  5. Skiddywinks

    Skiddywinks Member

    Joined:
    10 Aug 2008
    Posts:
    929
    Likes Received:
    8
    I never knew quite what nVidia went through in terms of TWIMTBP. But having read about what they actually do to get that badge on games, I have to say I am impressed. Especially if they lose millions on it in the short term.
     
  6. pullmyfoot

    pullmyfoot superbacon

    Joined:
    1 Oct 2008
    Posts:
    198
    Likes Received:
    0
    It would be interesting if NVIDIA got an x86 licence. I would assume that they could come up with a decent CPU pretty quickly with all the know-how they have. I am also looking forward to the day when VIA actually starts being able to compete in the low-to-mid range for CPUs.

    That would make for a nice 4-way CPU arena (and 3-way with Intel's Larrabee). Dream on.
     
  7. Adnoctum

    Adnoctum Kill_All_Humans

    Joined:
    27 Apr 2008
    Posts:
    486
    Likes Received:
    31
    I take exception to the 8-series being regarded as great, for the simple reason that the 8800GTX was the only worthwhile card to have. The 8800GTS was average, and the 8600-level cards were worthless. As the article states, I consider the 8800GT to be a different generation (8.5-series, maybe?). An indication of just how average the 8-series was is that I hung on to my 7-series until the HD4870, and I never felt the upgrade prickles.
    The 6-series was incredible after the abomination of the 5-series (FX). It was so exciting: the generation performed all the way from the 6800 Ultra down to the 6600GT. My 6600GT was the best graphics purchase I had ever made, until the HD4870.

    I think the RV770 was an incredibly risky move that paid off brilliantly. As a long time nVidia man, I have to just admire the balls of whoever made the strategy decision. It could so easily have been the biggest bonehead move of the decade, but it caught nVidia hopping and Whoop-Assing instead of producing.

    Maybe I've become too informed about the graphics industry (I know - how is that possible?), but I've gone off nVidia as a company recently. Is it just me, or does the new nVidia sound like the bragging, school-yard ******s of the industry? I just wish Jen-Hsun would put his cans of Whoop-Ass away and get his act together. Oh, and put the sledgehammers away as well, Jen-Hsun - the die sizes are doing you no favours.

    As a Linux fan, and someone who wishes ill on Microsoft, I hate to say that people have been writing Microsoft off for quite some time and none of the predictions have come true. I see a disconnect between mobile device OSes and PC systems: an OS and software written for one won't mean people switch to the other as well.

    I can't see Tegra making nVidia bags of money either, for the simple reason that nVidia can't gouge the OEMs for large margins (only Apple can do that :) ). These aren't large-margin items, and the market, while potentially significant, isn't going to be that huge. Most people still don't want/need HD-media-capable smart devices. At most, nVidia can hope for a couple of bucks per chip sold. How many iPhone/iPod/Zune/netbook chips will they have to sell to make up for the $200m dump they took over the dying G86s?

    CUDA and GPGPU need a breakout application to be successful.
    PhysX hasn't done it, and with DirectX physics coming it doesn't look like it's going to.
    Folding isn't it. The cause may be worthy, and I don't want to offend participating forum members, but it is little more than an e-peen competition for uber-enthusiasts.
    Even multimedia encoding isn't it.
    When it does come, I don't think it will come labelled CUDA either. It'll come labelled OpenCL, and nVidia isn't going to have a lock on that, are they? It isn't just cross-platform; it's cross-processor as well. The same competition nVidia has in the GPU space they will have in the GPGPU space, and more. nVidia has just gotten ahead at this point, which won't matter in the long run because there is nothing of substance yet that they can use to take advantage of their lead.

    Don't get me started with nVidia chipsets, which peaked with the nForce 2 and have slowly gone downhill (don't get me started on Soundstorm, I'm still livid over what nVidia did).
    nVidia, get a strategy and stick to it! Do you want to be a part of this market?
    If the answer is yes, then commit to putting out more than relabelled chipsets that half-heartedly tick feature boxes. Try harder. For the love of Zombie Jesus, put some passion back into it!
    If the answer is no, then do something to save that all-important Chinese face, and then put a bullet into your chipset team. Maybe you could move them to working on GPUs, or Tegra, or whatever your latest marketing name is this month.

    AMD may be broke, pay for their R&D with the change they find in payphones, and wear cricket boxes in their pants so that when Intel comes around and kicks them in the nuts it doesn't hurt so much, but they give the impression they have a plan for the future, the drive and passion to make it happen, and the flexibility to innovate. If only they had more money. Imagine what they could do with Intel's R&D budget.
     
    stonedsurd likes this.
  8. Sifter3000

    Sifter3000 I used to be somebody

    Joined:
    11 Jul 2006
    Posts:
    1,766
    Likes Received:
    26
    LOL, awesome description :)
     
  9. [PUNK] crompers

    [PUNK] crompers Dremedial

    Joined:
    20 May 2008
    Posts:
    2,909
    Likes Received:
    50
    Great article - the stuff about The Way It's Meant To Be Played is definitely the most interesting part. Perhaps another article on what companies do to get their badge up at the beginning of games? I'd like to see what Intel have done to get "Runs Best On..."
     
  10. Adnoctum

    Adnoctum Kill_All_Humans

    Joined:
    27 Apr 2008
    Posts:
    486
    Likes Received:
    31
    $$$
     
  11. francois

    francois New Member

    Joined:
    13 Jul 2009
    Posts:
    21
    Likes Received:
    1
    I've gone off nVidia recently as well, especially after hearing about the design flaws which led to the high failure rate of the G86. ATI have put out cards which have proven to be better value for money for the most part over the last 12 months, and nVidia is still relying on the G92 to counter them (which has done a good job, I'll admit).

    They really are taking a massive risk with GT300 if it's as big as the article says. I'd have thought they'd learned their lesson, but seemingly not...
     
  12. tejas

    tejas New Member

    Joined:
    30 Sep 2008
    Posts:
    101
    Likes Received:
    0
    I run a 4870X2 in my main rig and I do love it. But I still have warmer feelings for the GTX285 in my other rig. The TWIMTBP programme is head and shoulders above anything ATI have, and CUDA is superior in implementation to ATI Stream.
     
  13. B1GBUD

    B1GBUD ¯\_(ツ)_/¯ Accidentally Funny

    Joined:
    29 May 2008
    Posts:
    3,235
    Likes Received:
    375
    It's the re-badging that gets up my nose... we want bleeding edge and we want it now - not some re-marketed hype with little substance.
     
  14. yakyb

    yakyb i hate the person above me

    Joined:
    10 Oct 2006
    Posts:
    2,063
    Likes Received:
    30
    A full-on interview with a TWIMTBP engineer/developer would be great, with some more details on the exact support given.
     
  15. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,575
    Likes Received:
    189
    Always a purchaser of ATI, if only because their prices were cheaper.

    That said, my 2nd card was an HD4850 and my first was an X1550... But Nvidia's risk-taking is perhaps excessive: it's not revolutionary, and it's not going to help gaming performance, which is what most people who buy GPUs want.
     
  16. Vergil_117

    Vergil_117 New Member

    Joined:
    22 Jun 2006
    Posts:
    71
    Likes Received:
    0
    wuyanxu: ATI releasing a DX11 card at Win7's launch (if they do) won't really do much damage to NVIDIA, since there are zero DX11 games being released, and most sane people don't go out and get a brand new $500 card just because it's new (note: sane).
     
  17. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,575
    Likes Received:
    189
    Ah but having an early footing surely helps.

    Look at the 8800GTX. It was the first DX10 card, and there were barely any DX10 games at the time. But it won out. Why? Because it came out first, and because it did, the competition had to work faster and rush their design... so what happened?

    The 2900XT came into being.
     
  18. knyghtryda

    knyghtryda New Member

    Joined:
    2 Jan 2006
    Posts:
    101
    Likes Received:
    0
    I've bounced between ATI and Nvidia for quite a few years. My first "real" graphics card was a GeForce 2 GTS, then a GeForce 3, then a Radeon 9500 Pro (one of the best bang-for-buck cards of all time...), then a 7800GT, then a 7950GT (yeah... sold that 7800GT pretty quick...), and now an 8800GT. At this point, if I decide to build another machine I'm probably going to go back to ATI, simply because for the money they have the better card. Given that I'm not gonna be spending GTX 295 money, everything in the ~$200 range is pretty much a toss-up, and ATI just happened to land slightly ahead (4870X2, 4890, etc.). I do hope both companies stay competitive, though, because that's the only way we'll see real innovation keep coming from them.
     
  19. wuyanxu

    wuyanxu still wants Homeworld 3

    Joined:
    15 Aug 2007
    Posts:
    10,579
    Likes Received:
    229
    As per Elton's reply: with these new technologies, it's more about whoever wins the race. nVidia kept boasting about their "first DX10 card" in advertising for a while, right up to the 9800GTX.

    To expand on my ray tracing point: the reason I think nVidia will win is that their shader pipelines are more general-purpose than ATI's specialised approach, so it will be easier for nVidia to adapt to a newer kind of rendering technique.
     
  20. dicobalt

    dicobalt New Member

    Joined:
    21 Mar 2009
    Posts:
    169
    Likes Received:
    2
    Why can't Nvidia get licenses? Sounds like an antitrust case is brewing.
     