
News Nvidia: Without TWIMTBP, PC gaming would be dead

Discussion in 'Article Discussion' started by Tim S, 3 Oct 2009.

  1. Saivert

    Saivert Member

    Joined:
    26 Mar 2005
    Posts:
    390
    Likes Received:
    1
    Well, because it's so much more fun to type out "the way it's meant to be played". Because, you know, gamers have never seen that before. Everybody uses hacks to remove those splash screens from their games, and they put a sticker over the logo on the DVD case. </sarcasm>

    Anyway... people will always buy the card that offers the best value for money (except blatant fanboys, of course, but those are morons).
    This swings from ATI to NVIDIA and back to ATI all the time. So who really cares? Just be happy you get to witness graphics evolution.

    You can be pissed all you want, it doesn't change a thing.
     
  2. SimoomiZ

    SimoomiZ Member

    Joined:
    2 Feb 2008
    Posts:
    65
    Likes Received:
    2
    Nvidia clearly believe this movement to consoles is now unstoppable... the battle with the studios effectively lost. Who can really blame them, with PCGA members like Epic? Hence the repositioning towards compute-intensive revenue streams. If their strategy succeeds, what are the odds of their expensive TWIMTBP support surviving, and with it PC gaming itself? Whatever the result of Nvidia's strategy, clearly the next-gen consoles can't come soon enough, as PC hardware is now running way ahead of PC gaming titles, which for the most part seem to be simply console ports.

    I mean, the very idea of SLI'd Fermi cards seems quite ludicrous at this point in time, based on the meagre PC game offerings around right now, most of which run quite happily on (now old) SLI'd 8800 GTXs. Speaking of which, an 8800 GTX was around £450 at launch and seemed like a good, albeit expensive, deal at the time. Looking at the current state of PC gaming, a £450+ Fermi card looks like a very poor investment unless new (free) Nvidia applications and next-gen games appear fast, so all that Fermi computing power can actually be used in desktop computing.
     
    Last edited: 4 Oct 2009
  3. HourBeforeDawn

    HourBeforeDawn a.k.a KazeModz

    Joined:
    26 Oct 2006
    Posts:
    2,637
    Likes Received:
    6
    Honestly, I think nVidia, if anything, is just hurting the market in general with how much bribing, I mean funding, they do to corner markets and bias them towards their cards. Not to mention the tech hold-ups they cause, case in point DX10.1 ~_~ That whole 'the way it's meant to be played' thing needs to end. The gaming industry would be better off, as it could unify on standards that work on all cards, like DX11 and OpenCL, instead of proprietary stuff like PhysX.
     
  4. SimoomiZ

    SimoomiZ Member

    Joined:
    2 Feb 2008
    Posts:
    65
    Likes Received:
    2
    Tend to agree. In its current predicament, the last thing PC gaming needs is its own equivalent of the recent HD format war. It's really in their interests to row together, or their boat is sunk.
     
  5. general22

    general22 New Member

    Joined:
    26 Dec 2008
    Posts:
    190
    Likes Received:
    1
    Disagree, since ATI could set up a similar system where developers are able to test with a variety of ATI configurations and send in an engineer to work with the developers. But they don't have such a system, or they don't advertise its existence.
     
  6. cadaveca

    cadaveca New Member

    Joined:
    4 Oct 2009
    Posts:
    7
    Likes Received:
    0
    Oh, Ashu Rege, the Director of Nvidia's developer tech team,

    I've got news for you, little man. You VERY little man. How about you stop spitting lies and start telling the truth?

    TWIMTBP is KILLING PC gaming, not saving it.

    Proof:

    Since nVidia released their first PhysX driver, stand-alone PCI PhysX cards stopped working...but only when installed alongside ATI cards.

    How is that "saving pc gaming"?

    A perfect example of how this statement is nothing but lies:

    Oh really, Mr. Tamasi? What about PCI PhysX cards with ATI graphics cards, and the games released prior to the purchase of AGEIA?

    You broke THOSE GAMES.

    Shall I start the list?

    Ghost Recon: Advanced Warfighter.

    Oh yes, every single game since the first one now does not even run, period, with a PCI PhysX card and an ATI card, even though it all worked up until the 7.13 PhysX driver. If hardware PhysX is selected, the game refuses to start. But there's a catch...

    Because newer games that use PhysX update the API, forget about trying to smooth this one over...you're busted.

    I posted about the problem over a year ago...no one listens. Let's step back to August 13th, 2008, when I found the first driver you nVidia guys deliberately shipped to ruin gaming:


    http://www.xtremesystems.org/forums/showpost.php?p=3214282&postcount=2


    Hard proof that nVidia purposely coded software to ruin the gaming experience...undeniable evidence that nVidia does the exact opposite of what they represented in this public statement.

    I submitted a ticket to nVidia on the issue. Response?

    lol. Guess what. It's still broken. For over a year, nVidia has knowingly sat on the problem and failed to address it. Since then, every game that features PhysX breaks other games when you install it. ON PURPOSE. The only way to fix it is to reinstall Windows and install no driver or game bearing the "TWIMTBP" logo since the first released nV PhysX driver.


    Using your influence in the software market to affect sales in the hardware market is what got Microsoft fined. Guess who has lawyers prepping a case? Guess who's gonna go out of business?

    You know the only reason Intel is making graphics cards is to get rid of you guys at nV? As soon as Larrabee is released, and AMD isn't left with a monopoly when you're gone...bye bye, nVidia!


    I can't wait...I've got tons of evidence here to back up AMD's lawsuit. And believe me, it's coming.

    AMD just wasn't complaining publicly...that was a smoking gun.





    lol. Liar.
     
    Last edited: 4 Oct 2009
  7. ssj12

    ssj12 Member

    Joined:
    12 Sep 2007
    Posts:
    689
    Likes Received:
    3
    I'll stick with Nvidia GPUs. I've never had an issue with them; with ATI, I always have. So truthfully, as long as Nvidia makes GPUs, I'll play PC games. If they stop, well, PC gaming can rot in hell.

    If they feel like improving my overall experience by helping developers, I'm all for it.
     
  8. cadaveca

    cadaveca New Member

    Joined:
    4 Oct 2009
    Posts:
    7
    Likes Received:
    0
    Let me just say, buy what you want...it doesn't matter. nVidia is going to disappear, and very soon.

    As much as nVidia wants to spout this 'helping developers' baloney, why don't we bring up Eyefinity?

    You do realize that for any game to support those high resolutions...never mind how many are working now, just how many let you select them...it had everything to do with AMD working with developers? It's nice for nV to say AMD just isn't doing what they are...unfortunately for them, they don't realize just what AMD DOES do, nor do they have the ability to appeal to the programmers actually doing the typing at the keyboards, building the engines.

    Their strategy is aimed at the suits, not the guys actually programming. Fact of the matter is that no one on the nV team is actually trained to program for the future, and they are running scared.

    Next-gen is in development...we are beginning to see it now... Fermi isn't centered on graphics, but on GPGPU. Do you wonder why?

    It's far too late for nV even to hire the people, because they're in short supply. Most are working on Larrabee.

    Just how is nVidia helping the next gen, anyway? Where's the future thinking? Have you watched the Fermi presentation? Biggest joke on the market. Who's buying? If PC gaming would die without them, and the largest market is consoles, where's the next console GPU?

    PS3 and 360 are stale...the 360 is how many years old? The PS3? Both have just hit refreshes...their time is limited. nVidia has already missed the boat in that market...all they have left is PCs. Running Fermi, the GPGPU chip.

    lol. 80 people just wasn't enough. Not that those nV people aren't working hard...they'll get jobs at Intel and AMD when nV goes under. Other people, though...better start looking now.
     
  9. s3v3n

    s3v3n MMO Cold Turkey -fail

    Joined:
    23 Jun 2008
    Posts:
    68
    Likes Received:
    0
    Without PC gaming, Nvidia would be dead, not the other way around.
    Now let's assume TWIMTBP has a big effect on the PC gaming industry:

    - Kill TWIMTBP
    - Fewer new graphics-pushing games
    - Fewer people out to buy new video cards
    - Fewer high-margin video cards sold by Nvidia/ATI
    - People still making PC games, just ones that make better use of current tech

    TWIMTBP is meant to push adoption of high-end gaming hardware requirements and give consumers a reason to buy new hardware.
     
  10. gavomatic57

    gavomatic57 Active Member

    Joined:
    23 Apr 2009
    Posts:
    5,091
    Likes Received:
    10
    You do realise that Nvidia are the only one of the two GPU manufacturers with an established GPGPU infrastructure - armies of people developing apps that work with CUDA and eventually OpenCL - while ATI/AMD are busy racing after the next thing that may come along? They did try releasing Stream, but that didn't really get anywhere, did it?

    Nvidia were the first to bring working OpenCL drivers and development tools to market. Clicky

    You also need to realise that Apple's market share is growing, and every Mac in the Apple store comes with an Nvidia GPU.

    You also need to realise that AMD are up against Intel too, losing quite catastrophically, and carrying millions of dollars of long-term debt. AMD are apparently on the brink of bankruptcy Clicky

    You also need to realise that 65% of all GPUs used by Steam users are Nvidia, while only 27% are ATI. 68% of CPUs are Intel; 31% are AMD, and falling.

    Eyefinity is a joke, surely? There are only so many people who are going to spend £300 on a graphics card to play a dwindling number of games, and even fewer who are going to buy six identical monitors and stick them all together. Just buy one bigger monitor! If you miss the black lines breaking up the image, you can use duct tape!

    As for Larrabee...Nvidia are already developing ray-tracing applications Clicky. They don't need to hire anyone because they already have the staff they need. They're far from running scared; they're actually ahead of the curve. Larrabee is Intel's pipe dream, and they have yet to show anything that looks like a product. Meanwhile, Nvidia can achieve the same things Larrabee is aiming for using CUDA...with any GPU from the 8800 GTS onwards.

    If PC gaming stopped tomorrow, they would still have Tesla and their Quadro line.
     
  11. DOA Draven

    DOA Draven New Member

    Joined:
    4 Jul 2008
    Posts:
    16
    Likes Received:
    0
    It's in Nvidia's interests to support PC games: if PC gaming dies, then so does the need for these high-end gaming graphics cards, which no doubt generate considerable profit. The same is true for ATI, of course. Otherwise all they are left with is the 3D professional market and integrated GPUs, which will handle most desktop work for the majority of people.
     
  12. Phil Rhodes

    Phil Rhodes Hypernobber

    Joined:
    27 Jul 2006
    Posts:
    1,415
    Likes Received:
    10
    I wonder whether Nvidia make more money out of selling graphics cards, or out of selling GPUs to manufacturers of other devices. The former, I would expect, on the basis that they can charge us an arm and a leg for a GPU on a circuit board that they'd have to discount heavily in any B2B transaction.
     
  13. dreamhunk

    dreamhunk New Member

    Joined:
    1 Sep 2009
    Posts:
    46
    Likes Received:
    0
    After Crysis Warhead, all hardware stocks went crashing, and with them the PC market.


    Let's see here: AMD has 80% of its hardware in consoles and they are still going bankrupt! In fact, AMD had to take money from Intel just to stay alive. Yeah, those consoles are helping hardware companies a lot...what a joke!


    Both the gaming industry and the hardware companies will learn the hard way who is at the top of the food chain around here!

    We made you and we can break you. You don't own us; we own you!
     
    Last edited: 4 Oct 2009
  14. ssj12

    ssj12 Member

    Joined:
    12 Sep 2007
    Posts:
    689
    Likes Received:
    3
    I wonder whether nV can survive now that they're making processors... lol
     
  15. thehippoz

    thehippoz New Member

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    Bet Huang is still living back in the G80 days.. like I heard he was surrounded by Japanese chicks 24/7 after the G80 release and he used to pop Perrier-Jouët and take a bath in it.. but nowadays (I heard lolol) he roid-raged on the 5870 release - beat those chickens off with a bath towel

    Smithers came in and said you're gonna have to do something, Huang - he went into the rich man's stare and said - 3 billion! You heard the nTune guy say WTF out loud
     
  16. tejas

    tejas New Member

    Joined:
    30 Sep 2008
    Posts:
    101
    Likes Received:
    0
    Nvidia are right. PC gaming would be dead as a dodo without them.

    AMD still haven't made a profit since they acquired ATI. AMD is hardly a company that I trust with PC gaming's future. The future is GPGPU, as is made clear by Windows 7, Linux and Mac OS X Snow Leopard. With the Tegra, Tesla and Fermi lines, I don't think Nvidia will need the GeForce line in the coming years for revenue.

    Frankly, AMD can go to hell. Intel make the best CPUs, and when Larrabee and Nvidia's Fermi come next year, Intel will have a superior platform to AMD as well.

    Having said that, I know that British people are jealous and don't like it when others like Nvidia do well. Always like to support the underdog asshole with his crap products....
     
  17. cadaveca

    cadaveca New Member

    Joined:
    4 Oct 2009
    Posts:
    7
    Likes Received:
    0
    Why would AMD quash their CPU lines with GPUs? Are you silly?

    Really? Then how come Stanford was folding on ATI GPUs first? nVidia is YEARS behind AMD, hardware-wise...but they've got some solid programming...

    Larrabee will be replacing nV in Apple products. Apple would rather have a complete platform and deal with just one hardware developer...it's so much easier and more cost-effective. Intel is not far off from having exactly what Apple is asking for. In fact, everything they are doing now is directly related to meeting Apple's needs. Or didn't you know that?:D

    You don't know who is financially backing AMD, then. You're not aware of what "asset-light" means...nor how companies are run. AMD is far from bankruptcy...sure, they are a bit short on cash, but guess who is taking over when nV dies? And they've been developing hardware for Microsoft, and have Microsoft's support...while nV's nose is in the air when it comes to DX advancements. Guess who provided the DX11 GPUs for M$ to develop DX11 on? Who is working in the consoles? Who is busy developing for those console titles that work on the competitor's hardware? LoL. You're pretty comical, you know?

    So? Even Gabe himself will tell you that Steam does not accurately depict the market...just their little segment of it...and guess who built up Steam by buying Valve titles and putting coupons in the box of every new DX-compliant hardware update? Why don't you ask Gabe why he thinks Steam shows those metrics, and what those numbers really mean? Not good with statistics, are ya?

    Sure, with many monitors, and bezels, and TODAY's market, it seems foolish...however...here's AMD running the resolutions of the future...today! Working with the panel makers today, to ensure that things will work in the future! Patents are filed for next-gen panels, and nV holds how many of them? Again...lol..

    BUT...nV is NOT getting developers working on ray tracing...how can they be ahead of the curve when they're heading in a straight line, down the same path they've always been on?

    First it was console GPUs. Then motherboard chipsets. Seems to me Tesla and Quadro don't sell enough yet to be truly viable. What's left for nVidia? Ion? GeForce? Tesla? Quadro? Too bad they are losing licensing agreements left and right...the more they lose, the less important they are. Tesla will replace Quadro...or didn't you know that? That statement alone shows that you're not even aware of what's going on...you are too busy looking at today, when the day's half over! Looks like they've got one product...and nobody is really interested! And what are they working on to succeed Fermi? They barely have enough Fermi cards as it is, and had to use mock-ups this week.

    If Fermi were good...ready...we would have seen a real one this week. Huang is visiting TSMC later this week, isn't he? Do you know why? What's he taking over there that he doesn't want anyone else to see?

    Seems you're a victim of their marketing. At least they've got that part right! :D
     
    Last edited: 4 Oct 2009
  18. Silver51

    Silver51 I cast flare!

    Joined:
    24 Jul 2006
    Posts:
    2,962
    Likes Received:
    287
    Dude, did you just raagggee quit from a Left 4 Dead VS match just to post here?
     
  19. cadaveca

    cadaveca New Member

    Joined:
    4 Oct 2009
    Posts:
    7
    Likes Received:
    0
    Does my passion for honesty make you uncomfortable? :sigh: I just hate liars.


    :naughty:

    I don't play PC games much anymore...nV killed that fun.

    :D
     
  20. chizow

    chizow New Member

    Joined:
    12 Dec 2008
    Posts:
    24
    Likes Received:
    1
    Tim, you might be interested in some of the recent news about Nexus shown at GTC. It's basically Nvidia's GPGPU plug-in for Visual Studio that provides an all-in-one debugger and compiler for Nvidia hardware across all of the relevant gaming APIs: CUDA C, OpenCL, DirectCompute, Direct3D and OpenGL. It's pretty clear Nvidia's efforts and support for all things GPGPU far surpass AMD's, and Nexus is just another huge step in that direction.

    http://developer.nvidia.com/object/nexus.html
    http://developer.nvidia.com/object/nexus_features.html

    It's amazing that AMD and their supporters somehow feel that a value-add feature implemented by Nvidia for their own hardware detracts from or cripples AMD hardware. It's equally laughable to think AMD should automatically benefit from others' efforts without any of their own. Is computer hardware the only industry with such a misguided sense of entitlement?

    Perhaps AMD should do a better job of allocating resources and "Get In the Game", as their own program title suggests. They do have similar efforts; the only problem is they're busy implementing features no one cares about in relatively obscure titles: DX10.1 and DX11 in BattleForge, HAWX, Stormrise, STALKER..... DiRT 2 is a good title but still relatively obscure.
     