nVidia's pants are lowered! PhysX-posed...

Discussion in 'Hardware' started by fingerbob69, 7 Jul 2010.

  1. fingerbob69

    fingerbob69 Minimodder

    Joined:
    5 Jul 2009
    Posts:
    801
    Likes Received:
    16
    Charlie gives his usual Nvidia-friendly summation here:

    http://www.semiaccurate.com/2010/07/07/nvidia-purposefully-hobbles-physx-cpu/?c=684

    While the original article is not entirely a morass of techno treacle here:

    http://www.realworldtech.com/page.cfm?ArticleID=RWT070510142143

    Personally I'd like the Bit-tech staff to do an analysis of both articles and give a pronouncement on what Nvidia have been/are up to. Why? Because in my opinion, such hobbling of code and hardware goes against the spirit of a site like Bit-tech. Bit-tech may reach a different conclusion. Either way, I for one would like to know what Bit-tech think.
     
  2. chrisb2e9

    chrisb2e9 Dont do that...

    Joined:
    18 Jun 2007
    Posts:
    4,061
    Likes Received:
    46
    I'd like to see bit take a stab at that as well. There appeared to be some bias in the article by semiaccurate.
     
  3. shoxicwaste

    shoxicwaste Minimodder

    Joined:
    2 Jul 2010
    Posts:
    248
    Likes Received:
    9
    Mhmm, Physx is complicated
     
  4. Jipa

    Jipa Avoiding the "I guess.." since 2004

    Joined:
    5 Feb 2004
    Posts:
    6,363
    Likes Received:
    125
    Thank god the whole Physx-malarkey is just a marketing trick to begin with. Otherwise I might actually be upset by just how low Nvidia can go with their bs.
     
  5. Rofl_Waffle

    Rofl_Waffle What's a Dremel?

    Joined:
    24 Mar 2010
    Posts:
    504
    Likes Received:
    12
    They can't market the product if they make PhysX work better on a CPU. A quad core has spare cores sitting idle while the GPU is busy crunching graphics numbers. If the CPU path were optimised and ran on a modern instruction set, it would destroy the Nvidia GPUs.

    Not that GPU-based PhysX is particularly fast anyway; it already takes a big chunk out of frame rates.
     
  6. Bloody_Pete

    Bloody_Pete Technophile

    Joined:
    11 Aug 2008
    Posts:
    7,916
    Likes Received:
    724
    PhysX is just a pain. I've lost count of the number of games I've installed where it's thrown a paddy and I've had to hunt for a fix... I find Havok works a lot better, with the same results...
     
  7. Guest-16

    Guest-16 Guest

    I will read! Real World Tech is usually a fantastic resource for the nitty-gritty :thumb: Just a shame they don't do more articles
     
  8. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,577
    Likes Received:
    196
    An interesting article. They do have a semi-legitimate reason, but it still looks terrible in the public eye regardless.
     
  9. Rofl_Waffle

    Rofl_Waffle What's a Dremel?

    Joined:
    24 Mar 2010
    Posts:
    504
    Likes Received:
    12
    At the end of the day, they still have to make money.
     
    Last edited: 8 Jul 2010
  10. Xtrafresh

    Xtrafresh It never hurts to help

    Joined:
    27 Dec 2007
    Posts:
    2,999
    Likes Received:
    100
    Charlie...

    I'm by no means an nVidia fanboy, but even I fail to see why nVidia should devote time and effort to making software that they wholly own (and use as a selling argument for their hardware) run better on a competitor's product.

    This is like getting mad at that "will it blend" douchebag. It's his own iPhone, and if he wants to blend it up, that's his own choice to make, even if other people could have better use for it.

    From a technical standpoint it's interesting to see that PhysX could run much better on the CPU, but nVidia doesn't have CPUs. They sell ~~toasters~~ GPUs, so they make it work better on ~~waffle irons~~ GPUs.
     
  11. gavomatic57

    gavomatic57 Minimodder

    Joined:
    23 Apr 2009
    Posts:
    5,091
    Likes Received:
    10
    +1
     
  12. Guest-16

    Guest-16 Guest

    While I see your point, there are levels of "being a dick" about it. Just like Intel's "Genuine Intel" trick in its own compiler.

    For starters - if you want to make the best product for your customer base, and promote the fact that your product is cross-platform, it's just dick-ish not to optimise it for every platform - after all, it has to compete with other products that ARE optimised. They put all that effort into their drivers making games fast, so why not the PhysX DLLs?

    Secondly - the tools to use SSE etc. are out there already, and so is CPU support. You'd have to purposely avoid them not to compile with them. That's being a dick.

    Would you now buy a game with PhysX, or DEVELOP a game using PhysX, knowing you'll get purposely shitty performance on a CPU (no matter what CPU, since only two x87 instructions can be in flight at once), or pay for an extra GPU that barely any of the market has?
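    Bindi's point about the tooling is easy to demonstrate. As a hedged sketch (this is hypothetical illustrative code, nothing from the actual PhysX source): the instruction set a compiler emits for ordinary scalar float code is a build-time choice, so by 2010 you had to pass a legacy option on purpose to keep getting x87.

    ```c
    /* Hypothetical sketch, not PhysX source: a classic physics-style
       inner loop (SAXPY: y[i] += a * x[i]).  The C source never
       mentions x87 or SSE; the compiler flags decide which unit runs it:
         gcc -m32 -mfpmath=387 saxpy.c         -> legacy x87 stack FPU
         gcc -m32 -msse2 -mfpmath=sse saxpy.c  -> scalar SSE
       and every x86-64 compiler emits SSE by default. */
    #include <stdio.h>

    void saxpy(int n, float a, const float *x, float *y) {
        for (int i = 0; i < n; i++)
            y[i] += a * x[i];  /* one multiply-add per element */
    }

    int main(void) {
        float x[4] = {1.0f, 2.0f, 3.0f, 4.0f};
        float y[4] = {0.0f, 0.0f, 0.0f, 0.0f};

        saxpy(4, 0.5f, x, y);

        for (int i = 0; i < 4; i++)
            printf("y[%d] = %.2f\n", i, y[i]);
        return 0;
    }
    ```

    Same source, different back end: with GCC, `-mfpmath=387` forces the legacy x87 stack FPU, while `-mfpmath=sse -msse2` (the default on x86-64, and MSVC's `/arch:SSE2` likewise) emits scalar SSE with no code changes at all - which is why shipping x87-only binaries in 2010 reads as a deliberate choice rather than a limitation.
    
    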
     
  13. barndoor101

    barndoor101 Bring back the demote thread!

    Joined:
    25 Oct 2009
    Posts:
    1,694
    Likes Received:
    110
    nvidia locking the tech to their own GPUs - i can understand because other GPUs are direct competitors.

    nvidia gimping performance on CPUs to make their GPUs look better - a dick move to let them spread FUD about CPUs in general.
     
  14. isaac12345

    isaac12345 What's a Dremel?

    Joined:
    20 Jul 2008
    Posts:
    427
    Likes Received:
    3
    I will cite one of the comments from the SemiAccurate article -
    "Ageia, the father of PhysX, was started up in 2002... it's a safe bet that the originating partners were working on PhysX quite some time prior to filing their articles of organization for the corporation. This would put the origination time closer to the 1999 x87-to-SSE changeover, giving a possible explanation as to the use of the older floating-point instructions. (Especially since, as you remember, SSE wasn't exactly welcomed with open arms upon introduction. It took a couple of years for enough SSE-supporting chips to flood the market that it was worth it for developers to start using the new instruction set without fear of alienating a large portion of the non-SSE-supporting user base.)

    I find the use of x87 to be completely plausible from Ageia's standpoint... back then it was much more familiar and less time-consuming to code for than SSE (and time was something Ageia was racing against to get its first add-on cards out the door... I seriously doubt they had the financial or development resources needed to retool their design for SSE even if they wanted to...)

    Now, since nVidia purchased them 2.5 years ago and allegedly ported their technology over to CUDA, I would go as far as to call them lazy for not updating the instructions used for the task. But then again, I am not an electrical engineer, and have no idea which types of instructions "flow" better through the GPU/CPU pipelines... Maybe nVidia's engineers were being rushed by JHH to get PhysX working on the GPU ASAP, so they cut corners and stuck with x87 to simplify things and keep dear leader happy."

    This does seem to raise a good point, but as Bindi said, Nvidia is being dick-ish: morally wrong, even if it makes business sense. Is it possible that an anti-competitive lawsuit could be brought against Nvidia for this move?
     
  15. lp1988

    lp1988 Minimodder

    Joined:
    24 Jun 2008
    Posts:
    1,288
    Likes Received:
    64
    I don't see how. ATI/AMD, Intel and developers are free to make their own drivers and their own systems that do the same.

    At the very least, it doesn't look like any other anti-competitive case I have heard about.
     
  16. Action_Parsnip

    Action_Parsnip What's a Dremel?

    Joined:
    3 Apr 2009
    Posts:
    720
    Likes Received:
    40
    They do do some fantastic articles. But 'they' are just one bloke who does this in his spare time, I believe?
     
  17. Xtrafresh

    Xtrafresh It never hurts to help

    Joined:
    27 Dec 2007
    Posts:
    2,999
    Likes Received:
    100
    I would sooner buy a blended iPhone than anything with PhysX stamped on it :p

    While the feature does have some genuine appeal, it's been run into the ground in true nVidia fashion. They try to make everything a closed platform and erect tollbooths around every little bit of nVidia tech you encounter. This is just not how the PC platform works, and I vote with my wallet by not letting CUDA, SLI compatibility (motherboards), PhysX, or any of the other nVidia-only tech be a factor at all in my purchasing decisions.

    nVidia marketing, if you are reading this... how about actually advertising with your strengths? You have F@H going for you in a big way.
     
  18. Rofl_Waffle

    Rofl_Waffle What's a Dremel?

    Joined:
    24 Mar 2010
    Posts:
    504
    Likes Received:
    12
    I personally don't care, because PhysX isn't actually very good... at all... I mean, like one good game has it, and that's not even multiplayer.

    In fact, no multiplayer game will ever sport it, since half the customer base wouldn't be able to run it, and that hardly helps game developers make money.

    On the other hand, Havok physics is just as good if not better, runs smoother, and is available in all sorts of online games, since you can be confident everyone has a CPU.
     
  19. Guest-16

    Guest-16 Guest

    Yes. He's an exceptional analytical writer.
     
  20. thehippoz

    thehippoz What's a Dremel?

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    think I posted this before.. but back when Ageia owned PhysX, we were really excited

    I remember downloading CellFactor and it ran great on the CPU despite all the rave about the PPU being so much faster.. that's when I knew it was a gimmick, and it did flop for Ageia before the marketing team at Nvidia got hold of it.. all of a sudden it was the holy grail of gaming.. if you spend enough money on spin doctors, you can make eskimos buy ice (or fanboys part with their money).. forget that Havok had been doing it on the CPU for ages already

    I mean Batman, I played it with PhysX with the paper flying around - the stacks on the ground were static though, and it kind of ruined the effect (in case you never tried it). it wasn't anything you had to have, or anything that couldn't be done with Havok for that matter..

    that said, Batman was an awesome game - and I give props to the devs there, but that's really the only game with PhysX worth talking about.. Mirror's Edge wasn't anything special either - it just reminded me how far in the back pocket of Nvidia these devs were

    I'm glad some devs are still using Havok and haven't been pulled in.. hopefully OpenCL is the future and we won't have proprietary code used like this anymore
     
