Discussion in 'Article Discussion' started by Sifter3000, 21 Jan 2010.
Is it me, or is everyone misspelling AGEIA? It's not AEGIA. Don't even the Nvidia guys know that? Or am I missing some name change?
PhysX does what it says on the tin, regardless
Ha, good spot!
To be honest, after the fiasco with Nvidia disabling PhysX whenever an ATI GPU is present, I'm more inclined to believe what AMD/ATI have to say on the matter.
Exactly my thoughts, Pete.
His answer fails to address the Batman: Arkham Asylum lockout issue. There'll be more in the future, for sure.
Never mind all this squabbling, bring on Fermi.
Maybe this is a good subject for an unbiased test?
I don't think AMD or Nvidia would be willing to show how they "know", but it should be possible to dig into it. Seeing that Nvidia has blocked standard DirectX features from running on ATI cards (I forget the game), maybe there is something to this.
It might "support" multicore, but there is no doubt they are not giving PhysX-on-CPU users as much optimising and accessibility as those who have bought they're own brand of card.
Which is, as it is their brand and software, their right. But just because something is your right doesn't mean people are going to like you for doing it.
The only realistic solution was already provided in an answer by Huddy in the original interview:
Of course, non-committal on their obligation to support features for their own products.
In that case, you don't have anything to worry about. According to Chris Hook (AMD's something-something of lies, BS, misinformation etc.), your ATI cards have supported physics since at least the 4890...
Because physics is important to them... really.
lol *as he inserts the multi-core code back into the driver* - look, there it is!
It's been in the SDK since day 1; if you run Vantage, it's the part that, y'know, pegs all available physical AND logical cores to 100%... y'know, the portion that runs at 6-15 FPS because modern CPUs simply aren't fast enough to adequately accelerate physics simulations.
If it were possible to accelerate advanced physics effects on the CPU, don't you think Intel would've rolled it out with Havok by now?
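For what it's worth, here's a minimal sketch of how an application opts in to multi-threaded CPU physics in the later PhysX 3.x/4.x SDKs (which postdate this thread); the four-thread count and scene setup below are illustrative assumptions, not taken from Vantage or any shipping game:

#include <PxPhysicsAPI.h>

using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    PxFoundation* foundation =
        PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics =
        PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // The worker-thread count is an explicit parameter: this one line decides
    // whether CPU physics pegs a single core or several.
    PxDefaultCpuDispatcher* dispatcher = PxDefaultCpuDispatcherCreate(4);

    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = dispatcher;
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // Step the simulation once, as a game loop would each frame.
    scene->simulate(1.0f / 60.0f);
    scene->fetchResults(true);

    scene->release();
    dispatcher->release();
    physics->release();
    foundation->release();
    return 0;
}

At least in that API, multi-core use isn't automatic: the game (or benchmark) has to create a dispatcher with more than one worker thread, which is how the same SDK can peg every core in Vantage and still sit on one core in a title that never asks for more.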
Personally, I think Valve had the right idea using Havok to introduce some basic CPU-powered physics into the Source engine games, which enhances the gameplay rather than dominating it.
All we need is a copy of the source code and we will definitely find out.
(I think chizow is an Nvidia fanboy!)
Knock yourself out, it may only cost ya $50K, but even if you got it, would you know what to do with it to "definitely find out"?
Perhaps, but at least I wouldn't be an ignorant ATI fanboy!
Yeah, was being sarcastic, chiz... and the code for 3DMark was always a bunch of marketing - I mean, it artificially inflated scores for Nvidia by rewriting code... it's like hax - what's the point in a bench then? Like messing with the LOD bias.
The CPU bench used the GPU as well... so it doesn't really matter if they claim it uses all cores. PhysX has always been a marketing tool for them - pretty much nothing that can't be done with Havok on today's CPUs... think that's what the guy at ATI was getting at.
I'd rather be an ignorant ATI fanboy than an ignorant Nvidia fanboy... I've had my share of GeForces, happy with my Radeon.