
News: Mix ATI & Nvidia? Lose PhysX

Discussion in 'Article Discussion' started by CardJoe, 28 Sep 2009.

  1. CardJoe

    CardJoe Freelance Journalist

    Joined:
    3 Apr 2007
    Posts:
    11,343
    Likes Received:
    292
  2. yuusou

    yuusou Well-Known Member

    Joined:
    5 Nov 2006
    Posts:
    1,585
    Likes Received:
    121
    You wrote that AMD bought Ageia; you mean nVidia, right?

    This is sort of expected, though I never really heard of anyone doing this. Anyhow, just get one of the first nVidia cards with integrated PhysX and use an older driver... I guess.
     
  3. Bauul

    Bauul Sir Bongaminge

    Joined:
    7 Apr 2007
    Posts:
    2,173
    Likes Received:
    38
    It was always going to be a long shot, but it's still a shame.
     
  4. Hugo

    Hugo Ex-TrustedReviews Staff

    Joined:
    25 Dec 2006
    Posts:
    1,384
    Likes Received:
    19
    I'm more surprised to learn that support was there in the first place than that it's being taken away.
     
  5. mi1ez

    mi1ez Active Member

    Joined:
    11 Jun 2009
    Posts:
    1,395
    Likes Received:
    13
    Grrr!
     
  6. Rkiver

    Rkiver Cybernetic Spine

    Joined:
    23 Apr 2009
    Posts:
    930
    Likes Received:
    42
    I remember asking this a few weeks ago on here, and no one could say for sure. I emailed Nvidia, and they said "Won't work" but wouldn't say why.

    Well, now I know. Cheeky buggers.
     
  7. Jack_Pepsi

    Jack_Pepsi Clan BeeR Founder

    Joined:
    24 Apr 2006
    Posts:
    646
    Likes Received:
    11
    Sons of bitches!

    My friend and I were discussing this only the other day, as I want to bung an 8400GS into my system to use as a PhysX card. He told me that nVIDIA are stopping that from happening, but part of me didn't want to believe him. Hopefully Lucid will release an add-in card based on their Hydra 200 or something, as I doubt we'll see a 775 board with the chip on board.

    I hate you, nVIDIA, I hate you!
     
  8. Mentai

    Mentai New Member

    Joined:
    11 Nov 2007
    Posts:
    758
    Likes Received:
    1
    I was going to do this when I upgraded from my 9600GT to my 4870 1GB, but the 9600GT was such a significant upgrade for a mate of mine that I ended up just giving it to him. I figured getting the drivers to play nicely together would be too fiddly anyway, but it's still a shame now that a few decent titles are getting the PhysX treatment (just played through Batman).
     
  9. Rkiver

    Rkiver Cybernetic Spine

    Joined:
    23 Apr 2009
    Posts:
    930
    Likes Received:
    42
    Well, I've an Ageia card en route to me; wonder if that'll do it?
     
  10. perplekks45

    perplekks45 LIKE AN ANIMAL!

    Joined:
    9 May 2004
    Posts:
    5,204
    Likes Received:
    143
    Why are you angry? It's normal business!
    McDonald's wouldn't start selling Whoppers just because some customers don't want to go to both "restaurants".
     
  11. Rkiver

    Rkiver Cybernetic Spine

    Joined:
    23 Apr 2009
    Posts:
    930
    Likes Received:
    42
    Analogy fail. Seriously, how is it even remotely the same?

    You have two graphics cards, one nVidia, one ATI. nVidia decide to purposefully disable a feature on their card if it detects an ATI one. That is not the same as McD and BK not selling each other's items.
     
  12. ano

    ano 8-bit Bandit

    Joined:
    11 Sep 2009
    Posts:
    30
    Likes Received:
    0
    That's a poor analogy, perplekks45. It's more like you go to McD's to get a Big Mac, then go round the corner to BK to get your fries, but if you eat them together you get diarrhea.
     
  13. Gareth Halfacree

    Gareth Halfacree WIIGII! Staff Administrator Super Moderator Moderator

    Joined:
    4 Dec 2007
    Posts:
    10,442
    Likes Received:
    754
    Ooopsie - that's exactly what I meant. Cheers for pointing that out.
     
  14. tejas

    tejas New Member

    Joined:
    30 Sep 2008
    Posts:
    101
    Likes Received:
    0
    You are all getting angry over nothing. Nvidia own PhysX and they can do what they like with it. Same as Intel with Havok: AMD have not had a single GPU-accelerated Havok game, because why should Intel help their competitor?

    Just get over it with all your constant moaning.

    +1 for perplekks45
     
  15. kylew

    kylew New Member

    Joined:
    28 Apr 2007
    Posts:
    214
    Likes Received:
    2
    That's not the same thing at all.

    It's like McDonald's trying to tell you what you're allowed to do with your food once you've bought it: "No eating our burgers with KFC chips."

    As if they could even begin to think of saying that. Once you've bought your food and left the shop, it's none of their business.

    Same for nVidia: it's none of their business what card you're using for rendering games.

    Imagine if M-Audio suddenly decided that their audio cards had to be disabled if they detected your MIDI keyboard was from Behringer.

    It's about time nVidia got slapped with an anti-competitive lawsuit. They've been getting away with far too much lately.
     
  16. perplekks45

    perplekks45 LIKE AN ANIMAL!

    Joined:
    9 May 2004
    Posts:
    5,204
    Likes Received:
    143
    Okay, maybe my analogy was weak. (I'm at work, dammit! I don't have much time to write this stuff up :p) Still, I don't see why nVidia should act any differently.
    They own the rights to PhysX, so it's their IP and they're allowed to protect it. If they don't think one of their cards should be used as a dedicated PhysX card, then they can stop supporting it in their drivers. People who really want it can just use older drivers, or grab an old Ageia card, and stop whining.

    After all, this decision affects less than 1% of the graphics market. Big deal.
     
  17. kylew

    kylew New Member

    Joined:
    28 Apr 2007
    Posts:
    214
    Likes Received:
    2
    Remind me, who HAS had a GPU-accelerated Havok game? :rolleyes:

    ATi GPUs (AMD) were the first to demonstrate GPU-accelerated Havok physics, which is meant to run via OpenCL and DirectCompute on ANY capable GPU.

    There are only video demos of GPU Havok physics.

    They can do what they wish with their software or property, within reason. What if they suddenly decided to issue a driver that severely gimped their older cards to make their newer ones appear faster in comparison?

    Perfectly fine, according to you.

    When you buy something, you should be able to do whatever you like with it.

    Taking something away when a competitor's hardware is detected is completely wrong.

    It's none of nVidia's business what other hardware is in your PC.

    I don't see iPods needing Macs to charge or interface with.

    Of course, if it were an issue that affected you, I'd bet your stance on this would suddenly become the opposite. :rolleyes:
     
  18. Greentrident

    Greentrident Member

    Joined:
    10 Jul 2009
    Posts:
    508
    Likes Received:
    15
    The logic here is that people are more likely to buy additional Nvidia cards if the feature is disabled alongside ATI ones. If this were a mainstream feature that might work, but for something used by only a handful of people, who will probably simply remove their old Nvidia card when they upgrade to ATI, it won't affect sales in the slightest. So it seems like it's just a way to annoy a few people and illustrate a level of anti-competitiveness the EU shouldn't tolerate!
     
  19. Comet

    Comet New Member

    Joined:
    21 Jan 2009
    Posts:
    27
    Likes Received:
    0
    If you ask me, this is one hell of a stupid move by NVIDIA.
    They lose customer satisfaction and sales with this move. Supporting PhysX is already hard when you take into account that you need an NVIDIA card to use it, even more so if you're stuck with NVIDIA cards forever, and with DirectCompute becoming the norm, more so still.
    Even if they won't support PhysX running alongside any other graphics card, they didn't need to disable it when non-NVIDIA cards are present. A simple warranty disclaimer would be sufficient to cover them on the legal side.

    But there is another thing: I think NVIDIA isn't quite ready for the next gen. Why? Because of the letter they sent to the press saying there was little to no interest in DirectX 11 right now and that it didn't make sense to buy one of the new ATI cards.
    NVIDIA has just seen what ATI's next-gen card is, and sends out this kind of statement? I think they did it because they compared ATI's offering with what they're building internally and realised that ATI's is better. So they're trying to push the idea that a next-gen graphics card isn't worth it.

    I've got two NVIDIA GTX 260-216s in SLI. I'm planning to upgrade to a DirectX 11 graphics card once both companies have their offerings on the table and a decently priced DirectX 11 card is able to surpass my current system. I'm planning to use one of the GTX 260s in a media centre PC I have, and I would have used the other for PhysX with a future DX11 card. But at this rate I think I'm going to sell it.
    There's no point in keeping a second graphics card just for PhysX if you have so many limitations.
     
  20. kylew

    kylew New Member

    Joined:
    28 Apr 2007
    Posts:
    214
    Likes Received:
    2

    But that's the point: when you have a GeForce card as the main renderer, it's fine, and you can use a secondary GeForce card for PhysX.

    Loads of people use their old 8800GTs/9600GTs for PhysX, with the latest cards on the rendering side.

    It's something nVidia actively push.

    They just want to give the impression that GeForce means PhysX.

    This is why proprietary standards fail quickly: only part of the market can make use of them.

    Until there is an open physics API that runs on all GPUs regardless of manufacturer, PhysX will never be more than a gimmick.
     