
News Nvidia PhysX pack due shortly

Discussion in 'Article Discussion' started by CardJoe, 29 Jul 2008.

  1. CardJoe

    CardJoe Freelance Journalist

    Joined:
    3 Apr 2007
    Posts:
    11,343
    Likes Received:
    292
  2. Timmy_the_tortoise

    Timmy_the_tortoise International Man of Awesome

    Joined:
    28 Feb 2008
    Posts:
    1,039
    Likes Received:
    7
    Let me know when I can get PhysX on my 4850... Then I'll be interested.
     
  3. [USRF]Obiwan

    [USRF]Obiwan New Member

    Joined:
    9 Apr 2003
    Posts:
    1,721
    Likes Received:
    5
    That's an interesting feature: let your old card do the physics without needing SLI. So when I upgrade to a new card, my old 8800GT can act as a physics processor. That's cool.
     
  4. Andune

    Andune New Member

    Joined:
    9 Jan 2007
    Posts:
    30
    Likes Received:
    0
    "unlike other companies *cough*Apple*cough* that would charge for it."
    I have never paid for a Leopard update...

    So far, though, I'm not terribly excited by PhysX....
     
  5. Guest-16

    Guest-16 Guest

    You paid for Leopard though :p Last time I checked, I'd never paid for a Windows service pack ;)
     
  6. Romirez

    Romirez New Member

    Joined:
    31 Jul 2004
    Posts:
    39
    Likes Received:
    0
    So.... My 8800 will do my physics and my graphics in some games? Are you guys doing any benchmarks for the new Nvidia PhysX, by any chance? Might break out Advanced Warfare (shouldn't that be Warfighter, btw? :p) again to see if I can spot a difference.
     
  7. zabe

    zabe Perfect in my imperfection

    Joined:
    2 Jul 2007
    Posts:
    197
    Likes Received:
    0
    I think it's great as an advancement for the overall industry. Ageia had their chance and didn't execute their strategy in the most brilliant way, so they went down. Nvidia saw the opportunity in their parallel processors (aka GPUs) and bought them.

    Now, the interesting bit is that, with Nvidia so concentrated on getting PhysX to work on all their new GPUs with stream processors, they weren't able to deliver as much performance from the new hardware as ATI (I refuse to call them AMD; they'll always be ATI to me), who had plenty of time to refine their new hardware (although after a very long struggle) and now bring better performance at a more attractive price.

    So now Nvidia has (or will shortly have) great (hopefully) physics and good GPUs, while ATI has great GPUs with no physics. Once Nvidia overcomes the problems that will surely arise from the new drivers mixing graphics and physics functionality, they'll re-focus on graphics performance. Meanwhile, ATI will have to focus on introducing physics to their hardware, either via Havok or some other way, before Intel comes onto the scene with Larrabee.

    So yup, the market is at an interesting point where, more than being in battle, it seems we're gathering new ammo for the upcoming graphics/physics war!! In the meanwhile, I'm just excited to see what the humble GeForce 8600M GT in my laptop can do... I'm not expecting anything big, as if I had a 9800 GX2, but nevertheless it'll be interesting to see what my Core 2 Duo can process once the physics calculations are offloaded to my 8600...

    In any case, it's an exciting moment for all of us gamers!! Graphics + physics = BETTER graphics!! Woohoo!!
     
  8. Timmy_the_tortoise

    Timmy_the_tortoise International Man of Awesome

    Joined:
    28 Feb 2008
    Posts:
    1,039
    Likes Received:
    7
    Whatever happened to Cell Factor?
     
  9. Mentai

    Mentai New Member

    Joined:
    11 Nov 2007
    Posts:
    758
    Likes Received:
    1
    So I could use my 9600GT as a physics card if I bought a 4870? Hmmm
     
  10. zabe

    zabe Perfect in my imperfection

    Joined:
    2 Jul 2007
    Posts:
    197
    Likes Received:
    0
    You can still download the full game here if you want to try it when the new NV drivers are out. Another cool game utilizing Ageia PhysX is Warmonger, which is also a free game with some levels like CellFactor, and you can also download the full game here.

    They both run like crap (25fps for CF and 5-7fps for Warmonger, cos it makes heavy use of PhysX and without the appropriate accelerator the CPU just can't keep up...) on my 2.2GHz Core 2 Duo, but I'm hoping to see an improvement next week with the new NV drivers for my 8600... We'll see!!
     
  11. Timmy_the_tortoise

    Timmy_the_tortoise International Man of Awesome

    Joined:
    28 Feb 2008
    Posts:
    1,039
    Likes Received:
    7
    I'll have to wait for ATI drivers supporting PhysX... unless I can buy a nice cheap and cheerful 8600GT and stick it in my spare x4 slot... Would that work? Could I use an Nvidia card for PhysX with an ATI card for graphics?... Nah... that wouldn't work, would it...
     
  12. Guest-16

    Guest-16 Guest

    We can try - it's a good question. I foresee that it'll be a graphics driver conflict, though.
     
  13. zabe

    zabe Perfect in my imperfection

    Joined:
    2 Jul 2007
    Posts:
    197
    Likes Received:
    0
    Actually, that should be totally doable. If you buy an Nvidia card, you can use it for a mix of graphics/physics work, for graphics only with physics calculations deactivated, or leave it just for physics. Now, I'm just assuming, but if you can do that with multiple NV cards (even though, as the bit-tech news piece says, that functionality will be possible in the short-term future, but not yet in this first release of the driver), it would be absolutely logical to assume you'd be able to configure your Nvidia card for physics while you use your ATI card for graphics.

    In the end you would have bought Nvidia hardware anyway, so I doubt they care what you use it for... Then again, exactly as Bindibadgi said, they could be a real pain and allow that functionality only when two NV cards are present, one for graphics and the other for physics, while if there's only one NV GPU they could force you to use it only for graphics... Even though the cards would be perfectly capable of an NV-physics/ATI-graphics combination, it wouldn't be the first time Nvidia limited the functionality of some cards/configurations through their drivers...
     
  14. ComputerKing

    ComputerKing

    Joined:
    8 Sep 2006
    Posts:
    4,200
    Likes Received:
    36
    This is nice, but ATI doesn't support this. So I was asking too: can I buy an 8800GT and make it act as a PhysX GPU while I use my ATI card normally?

    I mean (4870 main + 8800GT for PhysX) = will this work or not?
     
  15. DriftCarl

    DriftCarl Member

    Joined:
    2 Nov 2004
    Posts:
    601
    Likes Received:
    12
    I'm not sold on these physics processors yet. I have seen a few comparison vids and they didn't really get me more into the game. With vanilla UT3 I still get hooked into the gameplay and start to sweat a lot while running round shooting stuff.
     
  16. sotu1

    sotu1 Ex-Modder

    Joined:
    24 Aug 2007
    Posts:
    2,877
    Likes Received:
    26
    Great news and a great development, and a good stepping stone too! Now let's get some games that are actually worth using this tech on :)
     
  17. Timmy_the_tortoise

    Timmy_the_tortoise International Man of Awesome

    Joined:
    28 Feb 2008
    Posts:
    1,039
    Likes Received:
    7
    I propose that, as usual, Bit-tech do all the hard work, so we don't have to!

    Looking forward to the results of your tests, Bindi.. Well volunteered.
     
  18. zabe

    zabe Perfect in my imperfection

    Joined:
    2 Jul 2007
    Posts:
    197
    Likes Received:
    0

    I don't think you need ATI to support it. You wouldn't be configuring anything on the ATI card. You'd use your ATI card for graphics, which is what it was made for. Then you buy the Nvidia card and configure it for physics instead of graphics (if they let you, but the card as such is perfectly capable). It's two different devices: one for graphics, as ATI intended, the other as a physics processor, as Nvidia now gives you the option. In fact, if you select the ATI card as the video adapter and configure the NV one for PhysX, there should be no problem with each card doing its thing, cos it's just two separate pieces of hardware running tasks independently of each other.

    Then again, I remind you of my fear: controlling what users do via DRIVERS. Nvidia has done it before, and could very well do it again... Hopefully not, fingers crossed!!

    Think about the following idea: you're playing UT3 against me (also love it!!) and we're in a team deathmatch or something. I've just shot the hell outta ya and your health's down to 10, but you're good at avoiding my missiles and you're running for your life. You find this wall and you bomb it so that it breaks and you can get inside, where it's gonna be way more difficult for me to shoot you. Or, once inside, if you have time you could bomb the rest of the walls and leave all the debris so that I can't catch you, or at least slow my chase and end up saved after picking up some health. Wouldn't that be cool?

    That's the kind of new gameplay tactic I expect from physics, which is why I'm so excited about it. Apart from the more realistic feel of the whole thing, I think it'll change the way we play!!
     
    Last edited: 29 Jul 2008
  19. Omnituens

    Omnituens New Member

    Joined:
    5 Apr 2006
    Posts:
    954
    Likes Received:
    11
    For those with tri-SLI boards: could you have an SLI setup, then a third card to do physics?
     
  20. zabe

    zabe Perfect in my imperfection

    Joined:
    2 Jul 2007
    Posts:
    197
    Likes Received:
    0
    The bit-tech article said that, for now, the driver will treat all X cards as ONE, so with the first release you'll get a 50-50 split of graphics/physics processing. With later releases you'll be able to specify which card does graphics and which does physics.
     