
Hardware Does Nvidia have a future?

Discussion in 'Article Discussion' started by Sifter3000, 20 Aug 2009.

  1. SNiiPE_DoGG

    SNiiPE_DoGG Engineering The Extreme

    Joined:
    14 Apr 2009
    Posts:
    533
    Likes Received:
    39
    I'm sorry, but anyone who praises TWIMTBP as a good thing is a short-sighted fool.

    You know what Nvidia does to "support" games with this program? They force devs to code in ways that favor older, less efficient architecture in exchange for Nvidia money. Instead of coding for ATI's more efficient model of 5 ops per cycle, they code for the 3 Nvidia does, and 2/5 of the ATI GPU's computing power is left sitting on its arse. Essentially TWIMTBP does nothing but slow down the industry and let Nvidia keep rehashing its inferior architecture into more and more monolithic dies - the apex of this, thank god, was the GT200, and maybe they will actually innovate for once since the G80.
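    To make the arithmetic behind that claim concrete (taking the 5-ops vs 3-ops figures above at face value, as the poster states them, rather than as verified hardware specs), a quick illustrative sketch:

        // Illustrative only: if shader code is written to issue 3 ops per clock
        // on a 5-wide ALU, 2 of the 5 lanes sit idle -- the "2/5 wasted" figure above.
        #include <cstdio>

        int main()
        {
            const double lanes_per_alu = 5;   // claimed ATI issue width
            const double ops_scheduled = 3;   // claimed ops the code actually uses
            std::printf("utilisation = %.0f%%, idle = %.0f%%\n",
                        100 * ops_scheduled / lanes_per_alu,
                        100 * (1 - ops_scheduled / lanes_per_alu));
            return 0;                         // prints 60% used, 40% idle
        }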
     
    Last edited: 23 Aug 2009
  2. tejas

    tejas What's a Dremel?

    Joined:
    30 Sep 2008
    Posts:
    101
    Likes Received:
    0
    Hmm, it's called business... If ATI don't like the state of affairs, they can pay developers themselves to use their hardware. They are already doing it: the new FPS Wolfenstein is an AMD/ATI game, as is the upcoming Command & Conquer 4. I don't see you crying foul about that.

    Using your logic, I suppose Intel should have been punished when Core 2, with its "old, inferior FSB technology", was destroying Athlon X2 and Phenom I on all performance fronts, since the FSB is an old and inefficient technology that is still handing Phenom II its arse. Oh, and yes, Intel are also pretty aggressive with their game developer relations, like Nvidia. I suppose that annoys you as well.

    Funny how Nvidia's "old" technology is still as powerful as ATI's so-called "more efficient" technology. Maybe ATI should get off its behind and be more proactive with devs to expose their teraflops of power in games and keep their 800 VLIW ALUs always fed with data.
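    As an aside, the teraflop figure referred to there is just the ALU count multiplied out. A back-of-the-envelope sketch, assuming a ~750MHz core clock (HD 4870-class) and counting a multiply-add as two FLOPs - illustrative numbers, not a measured result:

        // Rough peak-throughput estimate for an 800-ALU VLIW part.
        // Assumptions: ~750 MHz core clock, 2 FLOPs per ALU per clock (one MAD).
        #include <cstdio>

        int main()
        {
            const double alus     = 800;     // RV770-class shader ALUs
            const double clock_hz = 750e6;   // assumed core clock
            const double per_clk  = 2;       // multiply-add counts as 2 FLOPs
            std::printf("peak ~ %.2f TFLOPS\n", alus * clock_hz * per_clk / 1e12);
            return 0;                        // prints roughly 1.20 TFLOPS
        }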

    Simply put, TWIMTBP shows that Nvidia actually gives a damn about their customers and their gaming experience. ATI are following suit and starting to support games again as well. Have a look at the AMD Game website, then go to the Wolfenstein site and see that it is an AMD game.
     
  3. SNiiPE_DoGG

    SNiiPE_DoGG Engineering The Extreme

    Joined:
    14 Apr 2009
    Posts:
    533
    Likes Received:
    39
    So paying developers to use 3 ops per shader isn't wrong? When, if Nvidia hadn't been there, the game would have been coded to use 5, because that's the highest standard (look at GRID for an example of that). You can see it in the games where ATI absolutely destroys Nvidia: those games have been written for ATI hardware, and it's not that the Nvidia hardware does any worse than usual, it just doesn't have an unspoken advantage.

    Tell me this: would it be OK if the maker of Louisville Slugger bats for the major league got paid by the Yankees to put lead in the bats of the Red Sox so that they can't hit the fastest pitches?

    It's exactly the friggin' same. I don't give a damn about business, really; the TWIMTBP tactic is:

    a) detrimental to the advancement of GPU technology and performance

    b) a downright dirty, lying, cheating tactic with no other motive than to line Nvidia's pockets and let them be the lazy GPU maker they have always been (although ATI is doing a good job of taking advantage of their lethargy)
     
    Ficky Pucker likes this.
  4. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,577
    Likes Received:
    196
    I can't really say that it's a terrible idea or that it's a cheating tactic. I mean, they're here to make money, and it's a sound investment - a very, very sound investment. And if you don't give a damn about business then you're not getting the point: this IS business. They're not here to make you happy, they're here to get as many advantages as possible to beat the competition. In reality this is a good thing, as it spurs competition and makes ATI even stronger, because they will have to work harder to get the upper hand.

    What does that mean? Well, simply, more innovation. If ATI has survived this long and has shown they can beat Nvidia against unfair odds, it's clear that they have the ability to do so. Making it harder for ATI will ultimately just drive even more innovation as they have to catch up.
     
  5. tejas

    tejas What's a Dremel?

    Joined:
    30 Sep 2008
    Posts:
    101
    Likes Received:
    0
    @SNiiPE_DoGG - Frankly, GPU advancement only occurs along economic lines, particularly in light of the economic problems in the West. Nvidia's Tesla GT200 architecture is just as advanced as ATI's RV770 architecture; both achieve similar levels of performance in different ways. GPUs will only advance as long as there is a need for that power. Moore's Law is not as applicable here as it is in CPUs. Also, you are talking garbage, as Nvidia Quadro and Tesla solutions are the market leader in GPU computing and the playing field is totally fair there. You need to admit that AMD have had their finger up their arse and they are to blame for the position they are in...

    Thank you, Elton. What this chap above us does not understand is that Nvidia and AMD are businesses out to make profit, not to make you happy. They have shareholders to think about, not just a tiny minority of gamers who want more speed for their games. Nvidia are doing nothing wrong. This fellow above (not Elton) should really complain more about Intel, as they have been convicted of anti-competitive behaviour.

    The discrete GPU market is far fairer than the CPU one, and AMD are getting back into their game dev relations with renewed purpose. Frankly, we will see a spate of AMD-optimised games and Nvidia TWIMTBP games, and at least we have a decent choice of GPUs. In the CPU sphere it is a lot more one-sided.

    Ultimately ATI will have to convince more users to go with them when RV870 shows up next month. If they can wrestle back more market share then I don't think developers can afford to ignore them anymore.
     
  6. DbD

    DbD Minimodder

    Joined:
    13 Dec 2007
    Posts:
    519
    Likes Received:
    14
    Mmm, there's a lot of anti-Nvidia, pro-ATI sentiment around here. IMO those "lame-ass 3D glasses" will prove more important than DX11 in the next year, because the glasses work with console ports (i.e. 95% of games) whereas DX11 won't (the game is a console port, i.e. DX9c). Once we all have 120Hz monitors (active-shutter 3D needs 120Hz to give each eye 60Hz, and they're better for gaming even if you don't want 3D) and the cost of the glasses comes down a bit, the cost of entry is very low.

    PhysX - obviously hardware physics will take off in the end, but it hasn't yet. Right now it just adds a few pretty effects; that said, it's still more effects than DX11 is likely to add in the short term (mainly because it seems to be pretty easy to add a few PhysX effects to your console port, and Nvidia are actively helping developers do just that).

    TWIMTBP - obviously a good thing for Nvidia and game developers. Devs want their game to run well, and most gaming computers have Nvidia graphics, so Nvidia providing a programme for it suits everyone but ATI. However, that is ATI's fault for not producing an equivalent. What it mostly means is that you can expect 3D glasses and PhysX to be well supported in games, because Nvidia will have helped developers add them.

    As for ATI being first to DX11: that seems very likely, but it's also likely that it'll only be by a couple of months, and the Nvidia card will be faster when it arrives. This is very different from the 8800 vs 2900 case, where the 8800 was out much earlier, and when the 2900 finally arrived it was slower and more power-hungry.

    TBH I don't think ATI is the real threat to Nvidia; it's more the way the market develops as the CPU and GPU change (merge). For that you need to look at Intel. Nvidia have fired the first shots with CUDA and gained the initiative, but Intel are very big and very powerful, and they might manage to shut Nvidia out yet. That said, it might go the other way, with more machines using small CPUs plus a GPU (i.e. Tegra/Ion), in which case ARM + Nvidia win and Intel are the ones with their backs to the wall.

    In either case AMD/ATI is looking weak: with huge debts they just don't have the money to invest. That millstone seems to mean they don't have the same opportunities to spend on developing new technologies like the latest Intel CPUs, or Tegra, or CUDA. Sure, they react and produce something decent, but there's too much chasing and not enough taking the lead.
     
  7. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,577
    Likes Received:
    196
    Interestingly, you're a bit too pro-Nvidia. To be honest, I think Tegra and CUDA, while awesome additions, are an absolute waste. They should be focusing on GPUs; it's what they do. Meanwhile ATI is an even bigger threat now that it can freely use x86-64, since it's merged with AMD. And while SNiiPE was a bit extreme, the TWIMTBP program is there for advertising purposes and perhaps as a bit of incentive for developers to program for them.

    As for the glasses, they're flat-out useless. There's no point: console games already look a bit crap as it is, and the glasses cost a hell of a lot of money. You also have to realize that Nvidia cannot make a CPU as it is - well, not one for regular PCs anyway. And perhaps you might not realize this, but the PC market is quite big, very big - big enough that if they were to ignore DX11 and go with the glasses, they'd lose enough market share to be down to the same pennies ATI uses for R&D.

    In short: this article isn't about Nvidia declining, but rather about them losing focus and just making a bunch of random, if not mediocre (as far as I can tell), products.
     
    TheMusician likes this.
  8. ChainsawBunny

    ChainsawBunny What's a Dremel?

    Joined:
    28 Jul 2009
    Posts:
    7
    Likes Received:
    0
    For those saying there are no DX11 games in production: yes there are.

    1. DIRT 2 by Codemasters has been delayed on PC because it is getting DX11 support.
    2. I bet there are quite a few more games in production that will be using DX11 that just haven't been announced yet.
     
  9. Byron C

    Byron C Multimodder

    Joined:
    12 Apr 2002
    Posts:
    10,003
    Likes Received:
    4,629
    Interesting article, I thought (yeah, I'm a bit slow to catch up on these things!).

    Ever since the arse fell out of 3dfx, I've pretty much always been an nVidia man. At first it was purely a financial decision - IIRC, the Riva TNT2 was simply cheaper than the Voodoo3. Up until that point, 3D gaming for me had consisted of Quake and its variants/sequels rendered in software, so the TNT2 opened up a whole new world for me. ATI cards of the time couldn't compete with nVidia (IMO at least), although ATI excelled with video in/out cards, such as the All-in-Wonder series. I think I later upgraded to the GeForce2 MX, then the GeForce4 MX 440 - purely on the strength of the brand. After that I got myself a laptop with a Mobility Radeon 9700 - by that time I had moved on to console gaming and 3D graphics weren't really an important requirement, I just wanted a laptop. I stuck with that for many years...

    It was only earlier this year that I decided I needed to build a PC again, and my first choice for graphics was nVidia - purely because it was a brand that I trusted. I did a little research online and decided on the 9600/9800 series, as I thought they'd offer good performance (compared to what I was used to, which was nothing!), good value for money and DX10 support. I'm not quite sure I'll be upgrading any time soon (I'm still not hardcore enough to part with over £150 for a graphics card!), but it is looking like I might be jumping ship from nVidia.
     
  10. Krayzie_B.o.n.e.

    Krayzie_B.o.n.e. What's a Dremel?

    Joined:
    2 Aug 2009
    Posts:
    427
    Likes Received:
    6
    Nvidia is now playing second fiddle to ATI.
    While Nvidia was patting itself on the back and tooting its own horn, ATI was in the gym preparing for war.
    Nvidia made great graphics cards, then got the idea that they could sell anything because they were Nvidia.
    Not to mention a cult of fanboys.
    They bought Ageia and then had the balls to tell the industry "this is the next big thing". Nobody listened. PhysX is OK, and can now be done through software rather than a $300 card.

    Nvidia, while searching for other markets to capture, forgot its core and just kept Frankensteining its older cards to make new cards. For the most part all excellent products, but the industry and technology change. AMD buys ATI.
    Nvidia acts like a king amongst peasants, but as far as Intel is concerned there is only one KING. Intel sues Nvidia and the blissful relationship is now sour. I guess Intel, like ATI, got tired of Nvidia laughing at its video cards. Tegra and whatever else Nvidia is doing is cool, but it won't make Nvidia rich like the 8800 GT or its GTX series did. Nvidia won't survive without Intel, so kiss and make up.

    AMD/ATI unleashed the 4870, 4890, 4770 and 4870 X2, and showed not only Nvidia but the world that they are serious contenders in the video card market. Then AMD/ATI release the HD 5000 series and essentially become the NEW leader in graphics, while Nvidia has yet to answer back - and has yet to lay out a future plan for its video cards and the industry.

    All the while, as Nvidia, the once giant, scrambles to get itself out of the same hole it used to piss in, ATI has been hard at work and dedicated to its core product. Nvidia still has the money to make great products, but the industry has changed, and unless Nvidia either makes CPUs or strengthens its relationship with Intel, Nvidia will just get bought out by Intel and cease to exist.

    Nvidia, instead of bribing developers and hardware review magazines, you should have been making new technology. Being the company that championed the GPGPU idea, you should have finished it; now Nvidia may legally be locked out of the new technology. I'm sure Nvidia will come through, but will their attitude towards the business change so as to respect their core product and audience? If not, then ATI's HD 6000 series will end Nvidia forever, seeing how ATI is just correcting fabrication problems before full-scale production begins on its new HD 6000 series cards built around GPCPU technology.
     
  11. Anakha

    Anakha Minimodder

    Joined:
    6 Sep 2002
    Posts:
    587
    Likes Received:
    7
    NVidia is in the same position ATI was in after NVidia released the 6000-series, which is the same position that NVidia was in when they released the GeForceFX series, and ATI kicked their ass with the Radeon 9800. The GPU industry is cyclical, with each manufacturer working on their own timescale.

    ATI's 5000 series is, essentially, a souped-up version of their older cards (you know, like NVidia did with the 8800/9800/GTX 280). They've added "Eyefinity", a handy little framebuffer-splitting trick, but there is nothing really revolutionary there.

    NVidia are building something totally different, a GPGPU card that happens to also do graphics, rather than a graphics card that happens to do GPGPU, and these things take time.

    I believe NVidia's stance now is "Welcome back, ATI, you've finally managed to get yourself into the lead again. But watch your back, 'cause we're coming to blow you out of the water again, just like we did with the G80".
     
  12. Krayzie_B.o.n.e.

    Krayzie_B.o.n.e. What's a Dremel?

    Joined:
    2 Aug 2009
    Posts:
    427
    Likes Received:
    6
    And those lame 3D glasses... Do you realize 3D is a niche fad that comes and goes every ten years? Multiple industries have been trying to make money off 3D since the early 1930s. Nvidia needs to apply the technology to Eyefinity, which shows a lot of promise.
    Being able to see so much of the playing field (like in that movie Gamer) is an awesome experience. 20-inch monitors are around $100, so Eyefinity is becoming a reality not only for games but for viewing live sports.
     
  13. Anakha

    Anakha Minimodder

    Joined:
    6 Sep 2002
    Posts:
    587
    Likes Received:
    7
    Oh yes, let's laud triple-head gaming. Sorry, Eyefinity. It's amazing: Matrox came out with the whole "TripleHead gaming" schtick waaay back with their Parhelia cards, and nothing ever happened with it. Big surprise.

    If 3D is a "Niche fad", then what's with Dolby 3D and RealD-3D getting in on the act? How comes almost all the huge blockbuster movies are coming out in 3D? Why do you think LCD TVs are pushing to 120 and 240hz? What standard has just been ratified (with NVidia on the board) with the Blu-Ray standards council? I'll give you a clue, it's not Triplehead gaming.

    Oh, and incidentally, Eyefinity actually breaks part of how Windows is meant to work. It creates one "virtual" screen across the three displays, which means things like maximising a window to a single screen no longer work.
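    For anyone curious, the behaviour described there shows up in the standard Windows monitor enumeration. A small vendor-neutral C++ sketch that just counts what Windows reports:

        // Counts the logical monitors Windows exposes. With three separate heads
        // it reports three; with an Eyefinity-style single large surface it
        // reports one, so anything keyed to per-monitor work areas (maximising,
        // snapping) treats the whole span as a single screen.
        #include <windows.h>
        #include <cstdio>

        BOOL CALLBACK PrintMonitor(HMONITOR, HDC, LPRECT r, LPARAM count)
        {
            ++*reinterpret_cast<int*>(count);
            std::printf("monitor: %ldx%ld at (%ld,%ld)\n",
                        r->right - r->left, r->bottom - r->top, r->left, r->top);
            return TRUE;  // keep enumerating
        }

        int main()
        {
            int count = 0;
            EnumDisplayMonitors(nullptr, nullptr, PrintMonitor,
                                reinterpret_cast<LPARAM>(&count));
            std::printf("logical monitors seen by Windows: %d\n", count);
            return 0;
        }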

    NVidia cards support 3D acceleration on all "heads". Look at Total Commander - that works properly with Windows' screen system, by (correctly) opening a window on each screen, or opening a window that spans the screen borders.

    So if you want "Eyefinity" on an NVidia card, buy yourself a Matrox TripleHead2Go box, and you're done. Wooo... Groundbreaking... *sigh*.
     
  14. Krayzie_B.o.n.e.

    Krayzie_B.o.n.e. What's a Dremel?

    Joined:
    2 Aug 2009
    Posts:
    427
    Likes Received:
    6
    Not this time around. The 5000 series already implements AMD Stream technology, where programs other than games can offload some computations to the GPU of an HD 5000 card. This means some developers (Microsoft, Apple) have already played around with this and are programming for future ATI releases. Plus, if Nvidia doesn't acquire a licence from Intel and AMD then its GPCPU will be useless, because it won't be legal for it to work on any NEW x86 or x64 motherboards. I'm sure they will get a licence and pay royalties to AMD and Intel.

    Intel is working on a GPCPU and so is AMD/ATI; both are CPU manufacturers and are holding all the keys to any new CPU or GPCPU technology. Nvidia has to go through Intel or AMD to get GPCPU technology out to the market. Either they reform their bond with Intel or pay AMD royalties.

    That's why Nvidia has been so busy going after alternative markets: Tegra, 3D glasses, hospital imaging machines. They thought PhysX was going to work, but it can be done through software or a chip added to a video card.

    So no, this time around it's totally different, because it's not just about improving graphics any more but about CPU/GPU integration, and unless Nvidia starts making CPUs they are going to have to PAY AMD.
     
  15. Anakha

    Anakha Minimodder

    Joined:
    6 Sep 2002
    Posts:
    587
    Likes Received:
    7
    Point by point:
    - Hello, CUDA; greetings, DirectCompute. Glad to see you're finally catching up, ATI.
    - You mean like the DXVA conversion acceleration that already exists in Win7 for NVidia cards (thanks, again, to DirectCompute)?
    - That is not decided as of yet. AIUI, there is still a court case going on between Intel and NVidia over the scope of their licence.
    - And as HyperTransport and PCI Express are "open" standards (with NVidia on the founding boards for both), there is little need for a licence there.
    - You mean Larrabee? Don't make me laugh!
    - Really? I'd not heard of that.
    - Not quite. Tegra is a sure sign that NVidia don't *need* anything more. NVidia don't *need* to be in the CPU or GPCPU field: they're a graphics card maker. As long as PCI Express exists, they will be more than able to compete. And while they may not be able to make motherboard chipsets any longer (and, be honest, apart from Ion, just how many chipsets have NVidia made over the last few years?), they really don't need to.
    - NVidia already has a GP*PU technology out there, and Fermi looks to be the next step in that evolution. Why restrict the amount of processing you can do to what you can fit in the CPU socket?
    - It's not just PhysX; that was just a showcase of what they could do. And while Ageia thought their PPU was the cat's ass, NVidia knew differently. They knew PhysX would already run on the CPU, but it didn't HAVE to. Offloading that work from the CPU to the unused capacity on the GPU meant the CPU was free to do other things, like AI or sound.
    - Besides, NVidia have already got GPGPU working for many, many applications, with CUDA and DirectCompute leading the way - the latter being an MS-created standard API (like DirectX) meant to work across all cards, though only NVidia have drivers that support it so far.
    - For Intel it may be about CPU and GPU integration, but for NVidia it's not. They don't care what you use as your main CPU; their technology is not dependent on it. All they require is a standard interface (PCI Express, say), and they're golden.

    Oh, and incidentally, it sounds like Fermi has the processing power to be able to run, say, Bochs or QEMU "natively" - which means you could have your "graphics card" emulating a CPU and running an OS. There was another company that was quite successful doing things like that; their name was Transmeta. Perhaps you should look them up.
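    On the offloading point above, a minimal CUDA sketch of the idea - nothing to do with Nvidia's actual PhysX code; the kernel, buffer names and numbers are made up for illustration - handing a toy physics step to the GPU so the CPU stays free for other work:

        // Toy example: integrate a million "particles" on the GPU while the CPU
        // is free to run AI, sound, etc. Kernel launches are asynchronous.
        #include <cuda_runtime.h>
        #include <cstdio>

        __global__ void integrate(float* pos, float* vel, int n, float dt)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i < n) {
                vel[i] += -9.81f * dt;   // crude gravity
                pos[i] += vel[i] * dt;   // Euler step
            }
        }

        int main()
        {
            const int n = 1 << 20;
            float *pos, *vel;
            cudaMalloc((void**)&pos, n * sizeof(float));
            cudaMalloc((void**)&vel, n * sizeof(float));
            cudaMemset(pos, 0, n * sizeof(float));
            cudaMemset(vel, 0, n * sizeof(float));

            // The CPU queues the kernel and immediately moves on.
            integrate<<<(n + 255) / 256, 256>>>(pos, vel, n, 1.0f / 60.0f);

            // ... CPU-side game logic (AI, sound) would run here ...

            cudaDeviceSynchronize();     // wait for the GPU before using results
            std::printf("done: %s\n", cudaGetErrorString(cudaGetLastError()));
            cudaFree(pos);
            cudaFree(vel);
            return 0;
        }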
     
  16. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,577
    Likes Received:
    196
    To be fair, Nvidia's just heading down the GPGPU route because they *hope* it's going to be the future.

    A smart move perhaps, but at the rate it's going they're going to have some massive losses unless they can convince the consumer populace that it's worth it.

    I like CUDA and all, but there still aren't enough programs to justify buying a more expensive GPU.
     