
News Nvidia responds to AMD’s PhysX criticisms

Discussion in 'Article Discussion' started by Sifter3000, 21 Jan 2010.

  1. mrbens

    mrbens New Member

    Joined:
    15 Aug 2009
    Posts:
    511
    Likes Received:
    4
    NVIDIA sure are getting a bad reputation these days! And it's about time they got some new cards out to give ATi some healthy competition.
     
  2. Baron1234

    Baron1234 New Member

    Joined:
    21 Jan 2010
    Posts:
    3
    Likes Received:
    0
    Well, after the Intel compiler scandal I really don't know; I have to believe what AMD says. I feel sorry for AMD: it's the smaller company, and everyone tries to kill it with dirty tricks.
     
  3. chizow

    chizow New Member

    Joined:
    12 Dec 2008
    Posts:
    24
    Likes Received:
    1
    Yes, the CPU benchmark uses the GPU as a complement, and when it does you can see it accelerates performance, but overall performance is still not acceptable: it goes from single digits to double digits.

    There is no artificial inflation because, again, you can run the advanced PhysX effects meant for the GPU on the CPU by simply disabling PhysX in the NVCP or by running an ATI card. You can do this in Mirror's Edge, Cryostasis, even Batman: AA, and when you do, you can see very clearly that the CPU is not capable of adequately accelerating these effects even with all 4 cores utilized.

    It simply comes down to computational power, and the great irony here, of course, is that AMD can't tell you fast enough about the amazing computational power of their GPUs (we've been hearing 1 TFLOP for how long now?) when it's convenient for them, but when it's not, somehow a CPU with less than 1/10th the floating-point capability is suddenly good enough?
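
    To put rough numbers on that gap, here's a back-of-envelope sketch (every constant is an illustrative assumption, not a measured figure: a ~3 GHz quad-core doing 4-wide single-precision SSE, against the ~1 TFLOP AMD quotes):

        # Back-of-envelope peak-FLOPS comparison. All figures are
        # illustrative assumptions, not benchmarks.
        cpu_cores = 4            # typical high-end quad-core of the era
        cpu_clock_ghz = 3.0      # ~3 GHz
        flops_per_cycle = 8      # assumed: 4-wide SSE, one mul + one add per cycle

        gpu_tflops = 1.0         # the "1 TFLOP" figure AMD quotes

        cpu_gflops = cpu_cores * cpu_clock_ghz * flops_per_cycle   # 96 GFLOPS
        print(f"CPU peak: {cpu_gflops:.0f} GFLOPS")
        print(f"GPU peak: {gpu_tflops * 1000:.0f} GFLOPS")
        print(f"GPU/CPU : {gpu_tflops * 1000 / cpu_gflops:.1f}x")  # ~10x, i.e. the 1/10th above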

    Also, as for why game physics don't make better use of the CPU, it should be obvious: the game engine running on the CPU handles more than just PhysX. You have the main rendering thread along with AI, control input, sound, etc., so obviously you can't put those aside just to dedicate all cycles to PhysX, or the game would run even worse. Given that most gaming PCs are still dual-core, developers have to budget for weaker CPUs, but as you can see with console ports that have more baseline CPU power at their disposal, this is beginning to change.
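
    To make that budgeting argument concrete, a minimal sketch with completely made-up per-subsystem timings (no real engine splits its frame exactly this way):

        # Hypothetical per-frame CPU budget at 60 fps. All timings are
        # invented purely to illustrate the point.
        FRAME_BUDGET_MS = 1000.0 / 60.0   # ~16.7 ms per frame

        spent_ms = {
            "render submission": 6.0,
            "AI":                3.0,
            "gameplay logic":    3.0,
            "audio":             1.0,
            "input":             0.5,
        }

        left_for_physics = FRAME_BUDGET_MS - sum(spent_ms.values())
        print(f"Left for physics: {left_for_physics:.1f} of {FRAME_BUDGET_MS:.1f} ms")
        # ~3 ms spare: heavyweight effects physics can't fit on the CPU
        # without pushing the frame past 16.7 ms and dropping below 60 fps.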
     
  4. chizow

    chizow New Member

    Joined:
    12 Dec 2008
    Posts:
    24
    Likes Received:
    1
    Which is why I strive to be neither; as you can see, I try to inform myself on the topic so as to avoid commenting ignorantly... enjoy your Radeon!
     
  5. thehippoz

    thehippoz New Member

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174

    [embedded video]
     
  6. Pete J

    Pete J Employed scum

    Joined:
    28 Sep 2009
    Posts:
    6,138
    Likes Received:
    763
    Hmm, that video confused me. So ATI can run Havok physics on their cards? Why don't they implement it then to counter Nvidia's PhysX?

    By the way, I don't have ATI cards; they're GTX 260s. :)
     
  7. Initialised

    Initialised New Member

    Joined:
    23 Jul 2009
    Posts:
    73
    Likes Received:
    1
    If you are lucky enough to have an 8800 GTX that hasn't had to be re-cured in the oven or RMA'd, nVidia won't let you use it alongside your shiny new 5970.

    ATi graphics with Ageia/nVidia PhysX can be done, but it won't work using nVidia drivers after 191.07. Since the 'Cake' crack surfaced and AMD first attacked the issue, nVidia have taken further action to prevent their customer base from getting a year or two extra out of the last-gen cards they paid hundreds of pounds for. Presumably the next step is for new TWIMTBP games to require drivers later than this to run PhysX.

    Basically, they are telling their existing customer base that they don't support them any more, in the hope that they will wait for Fermi rather than getting a Radeon.
     
  8. impar

    impar Well-Known Member

    Joined:
    24 Nov 2006
    Posts:
    3,106
    Likes Received:
    41
    Greetings!
    Yep.
     
  9. chizow

    chizow New Member

    Joined:
    12 Dec 2008
    Posts:
    24
    Likes Received:
    1
    And here's the end result of such hacks/workarounds:

    http://www.youtube.com/watch?v=AUOr4cFWY-s#t=1m10s

    It should also be noted that the workaround requires you to turn down the physics fidelity/calculations in order to run even remotely adequately on a CPU, which results in the very distinct collision-detection and simulation problems with the papers/fog.

    You can also run the GPU effects on the CPU in some other titles, like Cryostasis and Mirror's Edge, where it becomes abundantly clear that even today's fastest quad-core CPUs simply cannot adequately accelerate more advanced physics effects. If they could've, they would've.
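
    As a rough illustration of why the fidelity has to come down on a CPU, a toy model (every constant is an assumption; real solvers also pay for collision pairs, which only makes the gap worse):

        # Toy model: how many physics particles fit in a CPU frame slice.
        # All constants are illustrative assumptions.
        physics_budget_ms = 3.0       # hypothetical CPU time left for physics
        cost_per_particle_us = 0.2    # assumed per-particle update cost, one core
        cores = 4                     # assume perfect scaling across four cores

        affordable = int(physics_budget_ms * 1000 * cores / cost_per_particle_us)
        print(f"Particles per frame within budget: {affordable:,}")   # 60,000

        # GPU PhysX scenes push hundreds of thousands of particles, so counts
        # and solver iterations have to drop before the CPU copes at all.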
     
  10. chizow

    chizow New Member

    Joined:
    12 Dec 2008
    Posts:
    24
    Likes Received:
    1
    And that's a great question, maybe one for ATI? Maybe ATI should worry less about what their competition is doing to add value to its own hardware, and instead look to implement, for their own customers, the features they promised and claimed support for nearly a year ago?

    Since you already own GTX 260s you won't have to worry; Nvidia has already stated they're fully open to supporting any and all physics middleware and APIs on your Nvidia hardware. ;)

    http://www.bit-tech.net/news/hardware/2009/10/14/nvidia-doesn-t-care-what-physics-library-de/1
     
  11. l3v1ck

    l3v1ck Fueling the world, one oil well at a time.

    Joined:
    23 Apr 2009
    Posts:
    12,956
    Likes Received:
    17
    +1 until Nvidia explain themselves. If it wasn't true I'd have expected them to deny it by now.
    I imagine the EU will be on them like a tonne of bricks if it's true.
     
  12. Krayzie_B.o.n.e.

    Krayzie_B.o.n.e. New Member

    Joined:
    2 Aug 2009
    Posts:
    427
    Likes Received:
    6
    PhysX has been around for like 10 years and this is all we've got?

    I believe ATi/AMD because, basically, Nvidia hasn't really pushed the technology, and if it is technology worth pushing, they should make it accessible to everyone in the best way.

    I only see PhysX as a way for NVIDIA to sell cards.

    Locking out ATI cards proves the point that NVIDIA is just looking to make a buck.
     
  13. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,575
    Likes Received:
    189
    You know what I find quite funny?

    Even when Ageia first came out with their PhysX cards, I distinctly remember someone posting up ini modifications for making the CPU run it.

    If I remember correctly, the CPU ran it much better than the card did.
     
  14. Saivert

    Saivert Member

    Joined:
    26 Mar 2005
    Posts:
    390
    Likes Received:
    1
    People believe what they want to believe. Facts are irrelevant.
     
  15. Rezident

    Rezident GFWL failed

    Joined:
    23 Sep 2009
    Posts:
    22
    Likes Received:
    0
    Same here. Despite being an Nvidia customer for years because of their usually excellent hardware, their nasty business practices over the last few years have put me right off them. I honestly couldn't support (or even believe) them any more. I'm afraid their dominant market power has corrupted them.
     
  16. Snuffles

    Snuffles Dreamy Mammoth

    Joined:
    24 Feb 2009
    Posts:
    59
    Likes Received:
    2
    The attitudes and business practices of both nVidia and Intel have pretty much sealed the deal on my next computer being AMD/ATI. I'll take the hit on performance over having to deal with the Electronics Mafioso.
     
  17. hrtz_Junkie

    hrtz_Junkie Controversial by Nature

    Joined:
    25 Jul 2009
    Posts:
    30
    Likes Received:
    0
    OMG, let's all jump on the Nvidia-bashing bandwagon?

    I am really starting to hate all this "Nvidia bashing". Nvidia is a company that (just like all the others) is trying to survive in a harsh and cut-throat business.

    Also, Nvidia does an awful lot that we as gamers take for granted! Take, for example, Epic's lack of AA support in the Unreal Engine (obviously a payoff from Intel to make their Xbox 360 graphics still seem relevant). So what do Nvidia do? They code their own AA solution and implement it for free for anyone who owns an Nvidia card.

    Obviously they're not going to do it for ATI's hardware, and why the F$$$ should they? They did the work!

    It's the same with PhysX: they developed/paid for the technology. Why in God's name would they then port it to their competition? That's financial suicide.

    Again, what about GPGPU? Nvidia does all the hard work developing it, then ATI suddenly realises it's missing and bolts on Stream, which is inferior in every way to CUDA, but no one even seems to care.

    If you want conspiracies, try this one on for size...

    Intel is worried about Nvidia at the moment; they do not have anything to compete with Fermi/GT200 in terms of its Tesla "blade" systems. Nvidia are offering to build supercomputers with ten times the performance at a hundredth of the cost (of, say, using i7).

    Seeing as business use comprises the majority of Intel's customers, they are right to be worried, and knowing Intel as we all do, I wouldn't put it past them to pre-emptively attack Nvidia by siding with ATI and pumping out a load of anti-Nvidia propaganda!

    They already support ATI by using their GPUs in the 360; they also know that AMD can't touch them in the CPU department, so they're no big threat at the moment.

    Maybe they've said to ATI: keep to VGA and don't go stepping on our toes in high-performance parallel architecture, and we'll help you combat Nvidia?

    I know this is quite a leap of faith, but it's only an idea. It will be interesting to hear what you all think!

    And to all you Fermi bashers: just wait till it's released, or risk looking foolish! Don't say I didn't warn you!
     
  18. barndoor101

    barndoor101 Bring back the demote thread!

    Joined:
    25 Oct 2009
    Posts:
    1,694
    Likes Received:
    110
    You should get that fanboyism checked out, you know; it might get you in trouble (it's already fooked up your spelling, in any case).

    OK, firstly: why in God's name would Intel make a payoff for the Xbox 360? They don't have ANY part of the Xbox 360. IBM make the CPU, ATI the graphics. At least get your facts right.

    Nvidia have locked PhysX and CUDA to their own hardware, which isn't reprehensible normally, but when they actively don't allow you to run (for example) an ATI card as the renderer with an Nvidia card as a PhysX accelerator, that is anti-competitive.

    Intel don't make the CPU in the Xbox 360, and why the hell would Intel side with ATI? ATI are owned by AMD, who are their rivals in the CPU space.

    You talk about Tesla being a supercomputer: not really. They are very good at certain types of calculation, but you still need CPUs to do the general-purpose stuff.
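
    (A toy sketch of that split, with made-up functions and data, nothing from any real codebase:)

        # GPU-friendly: the same arithmetic applied uniformly to every element,
        # no branching, trivially parallel across thousands of simple cores.
        def saxpy(a, xs, ys):
            return [a * x + y for x, y in zip(xs, ys)]

        # CPU territory: data-dependent control flow and pointer chasing,
        # where each step depends on the previous one.
        def sum_linked_list(node):
            total = 0
            while node is not None:
                total += node["value"]
                node = node["next"]
            return total

        print(saxpy(2.0, [1.0, 2.0], [3.0, 4.0]))                                  # [5.0, 8.0]
        print(sum_linked_list({"value": 1, "next": {"value": 2, "next": None}}))   # 3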

    Come back when you have filled the holes in your theory, and also learn how to spell.
     
    Sloth likes this.