News AMD says PhysX will die

Discussion in 'Article Discussion' started by Tim S, 11 Dec 2008.

  1. Tim S

    Tim S Well-Known Member

    Joined:
    8 Nov 2001
    Posts:
    18,879
    Likes Received:
    76
  2. pimlicosound

    pimlicosound New Member

    Joined:
    7 Sep 2007
    Posts:
    242
    Likes Received:
    8
    On the one hand, it's sort of his job not to congratulate nVidia for the work they've done with PhysX, so one could look at it cynically, but he does have a good point - the implementations of PhysX we've seen so far aren't game-changing. And I can't see the gaming world being revolutionised by such a closed, proprietary technology that only some developers, hardware manufacturers and gamers will have access to.
     
  3. Mentai

    Mentai New Member

    Joined:
    11 Nov 2007
    Posts:
    758
    Likes Received:
    1
    Yes he has a point, and yes it is only eye candy at this stage, but when you are in the market of creating eye candy, surely it holds some importance. To me, the cloth and glass effects looked really good in the new Mirror's Edge trailer, and I'm glad I'll be able to see them, whereas my friends with newer, more powerful (ATI) rigs aren't too happy that I can achieve higher graphical settings because of my hardware vendor.
    I myself am pleased to see something being done until DX11 gets here. Physics like this has been a long time coming; the fact that the green team is getting a head start is just how business goes. They definitely made a very good move by getting publishers on their side, even if it may be a temporary investment.
     
  4. Tim S

    Tim S Well-Known Member

    Joined:
    8 Nov 2001
    Posts:
    18,879
    Likes Received:
    76
    Mentai, I agree that the effects look cool in Mirror's Edge - there's no denying that - but there's also no denying that the effects don't actually add much to the experience. I still look back at Cell Factor (yes, a rubbish game, I know) as the holy grail for game physics - where the physics is actually integral to the gameplay and makes things more realistic... not just prettier!

    Nvidia is heading in the right direction with PhysX (studio adoption), but without widespread support across other vendors' GPUs/parallel processors - yes, it runs on the CPU, but very slowly - it's not going to take off in a big way.
     
  5. sui_winbolo

    sui_winbolo Giraffe_City

    Joined:
    25 Sep 2004
    Posts:
    1,539
    Likes Received:
    23
    I completely agree that it will die if it remains closed.

    What game developer wants to make games that only work with Nvidia and not ATI?

    I can pretty much guarantee that no company developing a game would do this for the PC platform. Even if you split the market, say, 50/50 between Nvidia and ATI owners, that's 50% of the PC population unable to play a game with awesome, amazing physics effects because they own ATI.

    I know my statement doesn't really hold water because there are so many factors I'm not considering, but my point is that a developer is less likely to make a game that only works with PhysX because its target market would be rather small. Sure, their game might support it, but it won't be to the grand extent it could be. It'll be cool glass and wind effects, not a game built completely on it.

    However, what if ATI and Nvidia both had PhysX? What if that became a standard for PC gaming? That would be pretty damn cool to see the games churned out for the PC.
     
  6. Mankz

    Mankz 5318008

    Joined:
    15 Jan 2006
    Posts:
    14,494
    Likes Received:
    472
    I thought that PhysX was dead before it even began?
     
  7. tejas

    tejas New Member

    Joined:
    30 Sep 2008
    Posts:
    101
    Likes Received:
    0
    AMD will die as a company and go under before PhysX ever dies... I give AMD two years before they are liquidated...

    Jen-Hsun Huang will get the last laugh... Sad but true ;)
     
  8. teamtd11

    teamtd11 *Custom User Title*

    Joined:
    31 Aug 2005
    Posts:
    2,268
    Likes Received:
    30
    I would have thought that as CPUs get more and more powerful, this will easily be done with spare CPU time.
     
  9. n3mo

    n3mo New Member

    Joined:
    15 Oct 2007
    Posts:
    184
    Likes Received:
    1
    nVidia likes to think that everyone loves them and is willing to do as they wish (it's like a smaller version of Intel), but sadly it is not like that. I also think that if it remains a technology available only on nVidia hardware, it is doomed to fail.

    Ideally we would see both Havok and PhysX available on nV and ATI GPUs; this would be the best option for everyone.

    What? AMD won't die in your lifetime. Yeah, they might not be so popular nowadays after the not-so-good Phenoms, but Phenom II might change that... and remember that AMD is far more popular where high performance computing is concerned - seven out of the ten top supercomputers run on AMD chips, and ATI gave them a huge financial boost. And anyway, pray that AMD does well, because the day after AMD dies you will pay 1,000 pounds for a Celeron, and for an Intel quad-core you will have to sell your children.

    Now, having a multicore GPU with PhysX running on one or two cores would be sweet.
     
    Last edited: 11 Dec 2008
  10. bowman

    bowman New Member

    Joined:
    7 Apr 2008
    Posts:
    362
    Likes Received:
    6
    Now that OpenCL and/or DX11 compute shaders will be ubiquitous, no one cares what API runs the physics as long as it's there.
     
  11. drakanious

    drakanious New Member

    Joined:
    12 Apr 2007
    Posts:
    24
    Likes Received:
    0
    I agree with this, mostly, but there is no implementation of physics [or anything at the moment] on OpenCL or DX11 compute shaders. What I want to see is physics APIs [or even better, entire game engines] coded to take advantage of OpenCL and DX11, if present.
     
  12. UrbanMarine

    UrbanMarine Government Prostitute

    Joined:
    7 Aug 2008
    Posts:
    1,135
    Likes Received:
    19
    Don't you have to buy a separate card to use PhysX anyway? I never really looked into it because, like Mankz thought, it died before it began.
     
  13. B1GBUD

    B1GBUD More Biddy Bang Bang than Sean Paul

    Joined:
    29 May 2008
    Posts:
    2,998
    Likes Received:
    289
    You can still run PhysX on an ATI-shod PC with an Ageia PPU PCI-e / PCI card; nVidia bought Ageia (or the license to develop) and released GPU acceleration support in drivers (180.xx I believe).

    But I agree with most that it won't get many votes if it stays closed.
     
  14. wuyanxu

    wuyanxu still wants Homeworld 3

    Joined:
    15 Aug 2007
    Posts:
    10,536
    Likes Received:
    223
    Bring us working Havok physics on your ATI cards before predicting a competitor's doom.

    This is just bad corporate practice.
     
  15. ChrisRay

    ChrisRay SLIZONE Administrator

    Joined:
    11 Dec 2008
    Posts:
    9
    Likes Received:
    0
    Hey Tim. This is Chris. (We know each other from editors' day and have shared a drink or two.) Anyway, I have to admit I am very dubious of this interview. A couple of things I'd like to point out.

    If AMD was serious about supporting PhysX and had approached Nvidia about it, then by all accounts I've heard, Nvidia would have worked with them to get it supported. The exact same thing can be said about CUDA. It should be pointed out that OpenCL is basically based off CUDA, and DX11 is following a very similar path. AMD would like you to believe that Nvidia is holding them back from support for things like that. Nvidia is focused on the GPU, while AMD is still a CPU company, riding the tail of Intel and supporting their far more dangerous competitor.

    When Larrabee actually ships and Intel is doing GPU computing and GPU physics too, we will see how things shake out.


    This is just a sidestep. Of course PhysX won't be on every single title that comes from Electronic Arts, but now EA has the base tools to implement it as they see fit. As we are both aware, Nvidia's TWIMTBP campaign has been very successful at getting features and technology implemented. The fact that Nvidia is promoting this shouldn't be a surprise to anyone.

    What does that even mean?

    What do you buy a GPU or hardware for if not for more eye candy and gameplay? We could be playing Crysis with the Quake 3 engine and get the exact same gameplay experience, couldn't we? Physics, as you know, is just a way to enhance the visual and gaming experience. Visuals alone don't make a game, but they certainly do help - especially in this industry.


    Tim, I know you didn't make these responses yourself, but I honestly think you should at least contact Nvidia and get their counter-marketing arguments to these claims, to keep the article balanced.

    Cheers
    Chris

    SLIZONE Administrator
    Nvidia User Group Member
     
    Last edited: 11 Dec 2008
  16. HourBeforeDawn

    HourBeforeDawn a.k.a KazeModz

    Joined:
    26 Oct 2006
    Posts:
    2,637
    Likes Received:
    6
    I have to agree as well; PhysX, on the route it's taking, will die. Besides, it's not even that great a platform to begin with compared to Havok and other built-in physics engines like those seen on Steam.
     
  17. Tim S

    Tim S Well-Known Member

    Joined:
    8 Nov 2001
    Posts:
    18,879
    Likes Received:
    76
    Hey Chris,

    I am merely reporting AMD's response to the announcement, as I said I would during my initial piece on the PhysX news from EA and 2K. I asked Nvidia to comment, but that offer was declined.

    I am as dubious as the next person about AMD's plans to support Havok on the GPU. It was promised for the end of this year but it hasn't happened - I believe Intel is holding it back for Larrabee, as stated in the article. Only then will we know if AMD was true to its word about supporting Havok on the GPU. Right now there is nothing at all, and we're just hearing buzzwords like OpenCL and DirectX 11 Compute touted as the saviours for AMD - that's exactly what I said at the end of this article.

    It is a sidestep - it was a question that popped up following Godfrey's decision to not-comment-but-comment on Nvidia's push with PhysX. But I am merely reporting what he said.

    IMO, it's clear that they're not ready to talk and they're waiting for Intel to release its milestone (i.e. Larrabee). Subtext: AMD is not sure how to react to this yet. I do know that there has been at least one face-to-face meeting between AMD and Electronic Arts since the announcement, but I do not know the outcome of said meeting.

    The holy grail of physics, and the way I want to see it implemented in games, is what we saw in Cell Factor... where the physics is actually part of the gameplay. However, out at Nvision, I sat down in several very interesting discussions with a number of top developers, physics middleware companies (many of whom are supported by TWIMTBP) and engineers (including several members of Nvidia's DevRel team), and they all agreed with me: until whatever physics engine is used is supported by every major hardware vendor, we will see nothing more than effects physics.

    The reason? It's because there has to be a lowest common denominator in every system that the game is being played on. That is, sadly, not an Nvidia GPU. It is the CPU. And the developer can't break gameplay on systems that don't have an Nvidia GPU - it has to work when one is not present - falling back onto the CPU. Unfortunately for PhysX, while it runs on the CPU thanks to some great work by Nvidia's engineers who ported CUDA to the CPU, it is too slow for it to be usable in a game from the developer's perspective. Many of these developers are part of TWIMTBP, but they spoke freely and openly with me about the problems they face with GPU accelerated physics - if it is only available through CUDA, PhysX will not take off in the way it deserves to.
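
    That lowest-common-denominator problem can be sketched in a few lines. This is purely illustrative Python - every function name here is hypothetical, not the real PhysX SDK API:

```python
# Sketch of the lowest-common-denominator fallback a developer needs:
# effects physics can scale up when GPU acceleration is present, but the
# game must still run when it isn't. All names here are hypothetical.

def gpu_physics_available():
    """Stand-in capability check; a real engine would query the driver."""
    return False  # assume no CUDA-capable GPU in this system

def simulate_on_cpu(particles, dt):
    """Slow path: a minimal Euler integration step on the CPU."""
    return [(x + vx * dt, y + vy * dt, vx, vy)
            for (x, y, vx, vy) in particles]

def simulate_effects(particles, dt):
    """Run the full effect on the GPU, or a cut-down version on the CPU."""
    if gpu_physics_available():
        return simulate_on_gpu(particles, dt)  # hypothetical fast path
    # The CPU port exists but is too slow at full scale, so the effect is
    # reduced (here: cap the particle count) rather than removed outright.
    return simulate_on_cpu(particles[:100], dt)
```

    Eye candy can be scaled down like this without breaking anything; gameplay-critical physics can't, which is exactly why it stays on the CPU and stays simple.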

    Please don't get me wrong, I am not slamming CUDA here - I think it has done great things for the GPU computing industry, and without it we wouldn't be on the verge of a revolution. I write about GPUs almost exclusively these days, so having more stuff to write about is fantastic - I have Nvidia to thank for that following the introduction of CUDA. Both OpenCL and DirectX 11 Compute are very similar to CUDA in many ways, and there's good reason for that - it's because CUDA did the right thing. CUDA's problem is not the technology, which is fantastic; it's that the software industry relies on cross-platform compatibility.

    Tim
     
  18. ChrisRay

    ChrisRay SLIZONE Administrator

    Joined:
    11 Dec 2008
    Posts:
    9
    Likes Received:
    0
    Tim, first: please don't feel I am selectively quoting. I'm just highlighting topics I find interesting.

    Interesting. I'll ask them about it.


    I agree with this. The problem I see for AMD is they typically play follow-the-leader in this industry, and they tend to make a lot of claims without actual substance. I'll be interested to see when Intel releases Larrabee - I do strongly believe their GPU physics will be around the corner - and if AMD isn't on board, they're going to look foolish for comments like this.


    In this particular respect, I do not believe it's Nvidia's fault. CUDA, for instance, is not as "Nvidia only" as it seems - it's more that it's simply compatible with Nvidia's architecture. I think it's fairly clear that the current AMD architecture is not suited for CUDA. But that doesn't mean AMD couldn't make it work, or that Nvidia would go out of their way to stop them.


    I also look forward to DX11 and OpenCL, and I do think they are going to be truly revealing about who has taken the "big steps" with GPU computing in the first place. But other than Havok, there's no real physics API out there besides PhysX.

    I just have a hard time believing AMD was sideswiped and surprised by this announcement. They have known Nvidia is supporting and promoting PhysX in a big way.


    I guess this is just a difference of opinion between us, and I don't know if this is the way Nvidia feels, but graphic effects, and the way they interact with the scene, are immersion to me. When the GTX 280 first released, I wasn't nearly as impressed with it as I was with PhysX. Seeing glass shattering and spacemen flying off the platform was all "cool" to me, and much more interesting than the actual GT200 hardware was at the time. The one thing that really is important to me is that PhysX is basically free to any Nvidia user with DX10+ hardware. There's no detriment to Nvidia users in having support for it and I only see positives, even if it's just shiny little effects like those seen in Mirror's Edge. The other option is simply to disable them.

    In the long run this is, to me, just the same as turning off high-quality shaders or other quality enhancements. The better the game looks and feels, the more fun it can be to play. And I do agree that Nvidia's CPU CUDA compiler is relatively weak at the moment, but their GPU one is soaring, and in a year they have made some tremendous strides. Ten months after the purchase of Ageia, the progress made has been amazing in my eyes.

    I somewhat understand what you are saying about physics and how it actually affects gameplay, but I think we're a long way away from that currently. And Havok hasn't really been delivering it either.

    Regards
    Chris

    SLIZONE Administrator
    Nvidia User Group Member


    *Edit* I made a change to the timing of when they bought Ageia. For some reason I thought it was back in September 2007, so it's even faster than I remembered.
     
    Last edited: 12 Dec 2008
  19. B3CK

    B3CK New Member

    Joined:
    14 Jun 2004
    Posts:
    402
    Likes Received:
    3
    I never read up on DX(version) very in-depth. I was under the impression it was there to help MS (Windows) provide a foundation that all visual/audio hardware could use, so that the disparity between hardware vendors wouldn't cripple games because of one's installed hardware choices.
    I think AMD/ATI are either trying to hold back product announcements until their version is at a more finished state with a launch date in sight - although this is quite questionable, because if they were sure it would be done in less than a year, they wouldn't reply with a marketing jab, more a counter of "look what we've got".
    Or AMD/ATI is having milestone issues with getting vendors on board.

    My opinion is that PhysX isn't dead, and that even realistic cloth or glass is just another feature that in four years will be a de-facto standard in games - like how much lighting and shadows affect our experience in-game. Try playing a game that has great shadows or realistic lighting effects for an hour, then replay it without those effects on... it makes a huge difference in bringing people into the game.
    Actual bullet travel in FPS games that can be predicted by watching grass/foliage direction is a great way of bringing this into game dynamics when long-range 'sniper' shots are being taken, or bullet trajectory drop over distance.
    Or take explosions: when things blow up in games, it seems to me that each object thrown or moved by the explosion is coded to that particular explosion by distance and per object. With PhysX you should be able to apply size and weight to an object and let the physics take control of how far and at what angle those objects get pushed, instead of coding each object through tweaking and experimenting. That should allow the devs to code many more objects in the same amount of time as testing and tweaking each object individually.
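
    That mass-driven idea can be sketched generically. This is illustrative Python, not the actual PhysX API - the object fields and the inverse-square falloff rule are assumptions:

```python
import math

# One explosion rule applied to every object: give each object a mass and
# a position, and let the solver work out how far and how fast it flies,
# instead of hand-tuning each object per explosion.

def apply_explosion(objects, origin, strength):
    """Push objects away from origin; closer and lighter objects fly faster."""
    pushed = []
    for obj in objects:
        dx = obj["x"] - origin[0]
        dy = obj["y"] - origin[1]
        dist = math.hypot(dx, dy) or 1e-6   # avoid division by zero at the centre
        impulse = strength / (dist * dist)  # inverse-square falloff with distance
        dv = impulse / obj["mass"]          # v = J/m: heavy objects barely move
        pushed.append({**obj, "vx": dv * dx / dist, "vy": dv * dy / dist})
    return pushed
```

    A ten-kilogram crate and a one-kilogram can near the same blast then behave differently for free, with no per-object tweaking.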

    As I haven't had time to read up on all these technologies, my experience/knowledge of this subject may be bent by marketing, so go easy on me when reading this.
     
  20. Tim S

    Tim S Well-Known Member

    Joined:
    8 Nov 2001
    Posts:
    18,879
    Likes Received:
    76
    Chris, don't worry about selective quotes - it's fine. :)

    Nobody is delivering gameplay physics at the moment and they won't for a while. Part of that is Nvidia's fault for keeping PhysX exclusive to Nvidia hardware (I understand why it is kept exclusive in the short term, but not in the long term, and I doubt it will remain exclusive to CUDA in the long term) and some is AMD's for not adopting it. Stream is actually very similar to - but not the same as - CUDA on RV770, and it wouldn't take that much effort to get it running IMO, but it's a matter of principle. The problem for developers, though, is that Stream and CUDA aren't the same and so they find themselves having to write two pieces of code to solve one problem. We're back in the days of SM2.0 and SM3.0 all over again, and I thought we'd just fixed all that malarkey with DirectX 10.
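
    The "two pieces of code" problem usually gets hidden behind a thin abstraction layer - roughly like this (a hypothetical sketch; the backend names are made up, and each real vendor path would need its own separately-written kernels):

```python
# Sketch of abstracting vendor compute APIs behind one interface, so game
# code doesn't care whether CUDA, Stream, or the CPU runs the physics.

class PhysicsBackend:
    def step(self, bodies, dt):
        raise NotImplementedError

class CpuBackend(PhysicsBackend):
    """Lowest common denominator: always available, but slow."""
    def step(self, bodies, dt):
        # bodies are (position, velocity) pairs; simple Euler update
        return [(x + vx * dt, vx) for (x, vx) in bodies]

def pick_backend(vendor):
    # In a real engine there would also be a CudaBackend and a
    # StreamBackend, each wrapping vendor-specific kernels - writing and
    # maintaining both is the duplicated effort developers object to.
    return CpuBackend()
```

    The abstraction itself is easy; maintaining two optimised GPU implementations of the same solver is what developers balk at.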

    With all of that said, it doesn't mean gameplay physics shouldn't be a holy grail for the industry to aspire to, and it's something I want to see because I think, if done properly, it could introduce some new gameplay ideas to a slew of bland titles. At the same time, though, it could become a gimmick - and I think the same could happen to effects physics, because if every developer uses exactly the same effects in their games, the effects are no longer cool - they just become bland and uninspired in my opinion. Really, AMD and Nvidia need to bang their heads together and think this through, because the list of top PC games this year isn't as strong as it should be. I think that's related in part to the uncertainty surrounding things like this, and of course the uncertainty around piracy is another issue that needs to be resolved (I didn't want to bring it up in this thread, but it's a big factor in publishers' decision making).

    Getting back on topic, though: Cell Factor wasn't a good game, but it was a good demo because it showed what was possible with a relatively simple piece of silicon like the PhysX PPU - it should therefore be possible on the GPU if the industry can sort itself out and align behind one physics API. AI event branching is something else I really want to see in games, but that doesn't mean it's going to happen yet. Unlike physics, though, that needs truckloads more computational power. The good thing is that the industry appears to be thinking about more than just pretty pixels right now. I think that's positive because finally we're going to go beyond photorealism and actually think about other parts of the environment, which will enable the industry to move towards those 'cinematic' gaming experiences it has been promising for a long time. :)
     