
News: EA and 2K Games license Nvidia PhysX technology

Discussion in 'Article Discussion' started by Tim S, 8 Dec 2008.

  1. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
  2. ChaosDefinesOrder

    ChaosDefinesOrder Vapourmodder

    Joined:
    6 Feb 2008
    Posts:
    712
    Likes Received:
    8
    This would be why I bought me a GTX 280 instead of a 4870X2 recently for my new PC. AMD have a loooooong way to go to catch up with nVidia's CUDA
     
  3. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,994
    Likes Received:
    714
    and nVidia's marketing ;)
     
  4. frontline

    frontline Punish Your Machine

    Joined:
    24 Jun 2007
    Posts:
    825
    Likes Received:
    12
    Yay, more games with fancy physics effects and no gameplay.....
     
  5. Cobalt

    Cobalt What's a Dremel?

    Joined:
    24 Feb 2006
    Posts:
    309
    Likes Received:
    2
    It's like as soon as people begin to catch on to the fact that shiny graphics are covering up bad games, publishers rush to find a new bone to throw to the masses in place of quality.

    EDIT: complete sentences help sometimes
     
  6. wharrad

    wharrad Minimodder

    Joined:
    26 Jul 2003
    Posts:
    870
    Likes Received:
    0
    I was under the impression that DX11 did it all 'out of the box' on DX11 cards.

    I know it's very early to talk about this, but if that's the case, CUDA and PhysX will die... I doubt a developer, given the choice, would pick just Nvidia when DX11 allows physics calculations to be done on both (or should I say all!).
     
  7. -EVRE-

    -EVRE- What's a Dremel?

    Joined:
    12 May 2004
    Posts:
    372
    Likes Received:
    1
    DX11 and whatever AMD's current processor is at the time is what's going to get my money... If what Wharrad said is true, I have a very good reason for waiting.
     
  8. TomH

    TomH BELTALOWDA!

    Joined:
    28 Nov 2002
    Posts:
    837
    Likes Received:
    45
    Interestingly enough, you all forget that Bit themselves reported on Nvidia helping the community hackers with their attempts to run PhysX on ATI cards. :)

    However, I do agree that between DirectX and OpenCL, we aren't going to see much support for CUDA in the future. ATI have made their bed, and Nvidia will need to support DX11 anyway. As nice as CUDA is, there will need to be hard decisions made on the use of precious transistors at some point in the next few years. I just hope Nvidia see it before it's too late... Because when Larrabee hits, we'll probably end up with ATI and Intel on one side and Nvidia fighting a losing war.

    Just keeps it interesting though. :naughty:
     
  9. Lepermessiah

    Lepermessiah What's a Dremel?

    Joined:
    1 Feb 2008
    Posts:
    566
    Likes Received:
    1
    LOL, there's what, one game that supports this? By the time PhysX catches on (if it does at all) we won't be using these cards.
     
  10. Mentai

    Mentai What's a Dremel?

    Joined:
    11 Nov 2007
    Posts:
    758
    Likes Received:
    1
    Mirror's Edge is out in Jan for PC; I assume he'll still be using that card then, and for the other EA/2K games that come out in '09...
    Even as an Nvidia user, I'm really unhappy that adoption of PhysX in games seems to be ramping up. With AMD leaking billions every quarter they need all the advantages they can get, and I assume there will be hefty licensing fees if AMD ever does choose to use PhysX :(
     
  11. Lepermessiah

    Lepermessiah What's a Dremel?

    Joined:
    1 Feb 2008
    Posts:
    566
    Likes Received:
    1
    What games in '09? Not one is announced. It will take a long time before any more than a select few games support this, and by then any serious gamer would have moved on.
     
  12. dyzophoria

    dyzophoria Minimodder

    Joined:
    3 May 2004
    Posts:
    393
    Likes Received:
    1

    +1

    I think AMD will license PhysX if they see the need for it. Even though these two companies have licensed PhysX tech, we can't assume that all their upcoming games will use it; I wouldn't be surprised if games only begin to use this technology in the next year or two.
     
  13. Kúsař

    Kúsař regular bit-tech reader

    Joined:
    23 Apr 2008
    Posts:
    317
    Likes Received:
    4
    Clever move from nVidia - impress publishers instead of developers. They'll force it into their games no matter how useless it might be.
     
  14. p3n

    p3n What's a Dremel?

    Joined:
    31 Jan 2002
    Posts:
    778
    Likes Received:
    1
    Go buy Garry's Mod off Steam and stop being a whiny little *beep*
     
  15. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    CUDA isn't going anywhere any time soon - I think you guys will be surprised how similar DX11 Compute and OpenCL are to CUDA. All Nvidia hardware since G80 already supports OpenCL and I've got good reason to believe that at least the current generation also supports DX11 Compute.

    Although Nvidia skirts around the question, I believe PhysX will be ported to either DX11 Compute or OpenCL. It needs to if Nvidia wants to continue controlling the game physics middleware market (why else would it have bought Ageia?) because with compute APIs coming thick and fast, a developer can create their own 'accelerated' physics engine using the Compute Shader in DX11.
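    To make that concrete, here's a rough sketch (in CUDA, purely illustrative - the Particle struct and integrate kernel aren't from any real engine) of the kind of hand-rolled 'accelerated' physics step a developer could write; a DX11 Compute Shader version would be structured almost identically:

        #include <cuda_runtime.h>

        struct Particle { float3 pos; float3 vel; };

        // Naive explicit Euler integration, one thread per particle.
        __global__ void integrate(Particle *p, int n, float dt)
        {
            int i = blockIdx.x * blockDim.x + threadIdx.x;
            if (i >= n) return;

            p[i].vel.y -= 9.81f * dt;       // apply gravity to the velocity
            p[i].pos.x += p[i].vel.x * dt;  // step the position forward
            p[i].pos.y += p[i].vel.y * dt;
            p[i].pos.z += p[i].vel.z * dt;
        }

        int main(void)
        {
            const int n = 65536;
            Particle *d_p;
            cudaMalloc((void **)&d_p, n * sizeof(Particle));
            cudaMemset(d_p, 0, n * sizeof(Particle));

            // One simulation step at 60fps.
            integrate<<<(n + 255) / 256, 256>>>(d_p, n, 1.0f / 60.0f);
            cudaThreadSynchronize();
            cudaFree(d_p);
            return 0;
        }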
     
  16. [USRF]Obiwan

    [USRF]Obiwan What's a Dremel?

    Joined:
    9 Apr 2003
    Posts:
    1,721
    Likes Received:
    5
    So CUDA isn't going anywhere? What about this then? Looks like it's going everywhere. And don't forget some video editing, CAD/CAM and renderer developers are working on CUDA-powered versions. Adding the PhysX engine on top of that, Nvidia has a nice deck of cards.

    I think it is wrong to think CUDA is not going anywhere. Take, for example, TMPGEnc 4.0 Xpress. With CUDA it goes 300 times faster! This saves an enormous amount of time. I use it to convert AVCHD camcorder footage to a more editable format for my video editor. Now all I need is for the video editor to go the CUDA way and I'm happy as a bird, because it saves me from buying an expensive quad-core.
     
    Last edited: 9 Dec 2008
  17. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    CUDA isn't going anywhere as in it is not going to disappear. It's still going from strength to strength and developer adoption is increasing. Apologies for any confusion - I was merely answering the critics saying that CUDA would die because of OpenCL/DirectX 11 Compute. It won't.

    However, I wouldn't be surprised if PhysX moved to DirectX 11 Compute, because I'm pretty certain that's where Havok will go once Larrabee has launched. At that point, PhysX would become an also-ran, because one runs on all DX11-compliant hardware while the other only runs on GeForce. Publishers care about installed bases, so which would you choose as a publisher?
     
  18. Saivert

    Saivert Minimodder

    Joined:
    26 Mar 2005
    Posts:
    390
    Likes Received:
    1
    The big question is whether OpenCL and DX11 Compute can offer the same features as direct CUDA. CUDA has, after all, been developed by NVIDIA for use with their own line of GPUs. No way a GENERIC API can offer the same features; it has to support a common subset only.
     
  19. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    GT200 supports newer features in CUDA than G80 (think Double Precision), so yes, current and future APIs could in fact support many of CUDA's features... the thing with CUDA is that Nvidia will just release a compiler specific to API X, Y, Z, which translates the new code into something its GPUs understand.
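    For anyone curious, that feature split is visible straight from the CUDA runtime. A quick sketch (a standard device query, nothing exotic) that checks whether a card is GT200-class - compute capability 1.3, the first revision with double precision:

        #include <stdio.h>
        #include <cuda_runtime.h>

        int main(void)
        {
            cudaDeviceProp prop;
            cudaGetDeviceProperties(&prop, 0);  // query the first CUDA device

            // G80/G92 report compute capability 1.0/1.1; GT200 reports 1.3,
            // the first revision with hardware double precision.
            int has_double = (prop.major > 1) ||
                             (prop.major == 1 && prop.minor >= 3);
            printf("%s: compute capability %d.%d, double precision: %s\n",
                   prop.name, prop.major, prop.minor,
                   has_double ? "yes" : "no");
            return 0;
        }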
     
  20. Bladestorm

    Bladestorm What's a Dremel?

    Joined:
    14 Dec 2005
    Posts:
    698
    Likes Received:
    0
    When Nvidia had a couple of big companies demoing impressive gains using PhysX/CUDA (not ones that most consumers would notice, granted, but impressive nonetheless), and all AMD had on how theirs would work was "no comment, come back later", I kinda figured Nvidia had it in the bag.
     