Who thinks PhysX will take off and work well in conjunction with video cards, out of curiosity... oh, and what benefit exactly do they provide?
Ah, you mean the add-on card/chip for video cards that completely takes over the physics? I think it would be a pretty good idea to include the chip on higher-priced video cards until it becomes cheap enough to be included as standard. Unfortunately, I don't think it will be too successful if it is kept as an add-on card. Game developers might not want to waste their time coding for PhysX if it is not widely used.
It's a bit of a catch-22 situation: developers won't waste time coding for them (as divine points out) if people don't buy them, but people won't buy them unless developers add support for them! I suppose that's one of the inherent dangers of creating a new product like this. Since the release of dual-core CPUs, I would have thought developers might have taken the opportunity to create a system where one core handles geometry and memory handling, whilst the other deals with the hardcore physics.
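The dual-core split suggested above can be sketched in a few lines. This is a hypothetical toy example, not code from any real engine or from the PhysX SDK: a physics thread runs fixed-timestep Euler integration on its own, the way a second core (or a dedicated PPU) would take that work off the main/render thread, which only reads snapshots. All class and function names here are made up for illustration.

```python
import threading

class PhysicsWorld:
    """Toy 1-D physics state updated on a dedicated thread."""

    def __init__(self, positions, velocities, gravity=-9.81):
        self.positions = list(positions)    # heights, in metres
        self.velocities = list(velocities)
        self.gravity = gravity
        self.lock = threading.Lock()        # guards state shared with the render thread

    def step(self, dt):
        # Fixed-timestep Euler integration: the per-object grunt work
        # that a second core or PPU would absorb.
        with self.lock:
            for i in range(len(self.positions)):
                self.velocities[i] += self.gravity * dt
                self.positions[i] += self.velocities[i] * dt

    def snapshot(self):
        # The render thread calls this; it never touches physics math itself.
        with self.lock:
            return list(self.positions)

    def run(self, dt, steps):
        for _ in range(steps):
            self.step(dt)

# Two falling objects, simulated for 1 simulated second (100 steps of 10 ms)
# on a separate thread while the main thread stays free for rendering.
world = PhysicsWorld(positions=[100.0, 50.0], velocities=[0.0, 0.0])
physics_thread = threading.Thread(target=world.run, args=(0.01, 100))
physics_thread.start()
physics_thread.join()   # a real game loop would poll snapshot() instead of joining
print(world.snapshot())
```

In a real engine the physics thread would run continuously at its own rate and the renderer would interpolate between snapshots, but the division of labour is the same: integration on one core, drawing on another.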
Well, that is why ATI and NVIDIA need to add the chips straight to their high-end models; this would allow broader acceptance of the card. I have seen what the PhysX can do, and it really is amazing. It would take a huge load off both the CPU and the GPU if ATI and NVIDIA could get it on their cards.
True, but wouldn't the added load on the card be bad? Or I guess 3.0GB/s of bandwidth could handle it. Cooling, though: we must think about cooling too.
If it were integrated in the card, it'd be extra silicon, so it wouldn't put additional load on the card. It would put extra load on the PCIe bus, but that's definitely not a problem. Making it part of the actual GPU chip would reduce power draw and cooling requirements too, as nVidia and ATi probably use a much better fab process than the demo physics cards we've seen. I think inclusion in high-end graphics cards is the way to get it going, tbh. ch424
I think if they can sell them around the $150 price point they'll do okay. To me it's kind of like the X-Fi Xtreme Music: as a fraction of the total system cost, the X-Fi is pretty cheap, and allegedly it makes a big difference (can't tell on my cheap speakers). If they can keep the price around 150 and if it will fit in my PCI-X slot (yes, I mean PCI-X, not PCIe) then I'll probably get one. <Minor Rant> I really wish PCI card manufacturers would read and use the whole standard for cards. There should be two notches in the bottom edge of a PCI card so that it will fit in a PCI-X slot. Many manufacturers forget about that and just put in one notch, which means I can't use their cards. </Minor Rant>
IIRC there's support for it in 3DMark06, and I seem to remember that Valve are planning to support it. If it's not too expensive by the time I build my next system, then I'll probably pick one up. I reckon I'd spend up to £70 on one (if I was already spending £1k on a new system).
Not forgetting support for it in UT2K7, which will probably be the first major game to utilise the technology. And I agree that integration with the graphics subsystem is probably the way to go in the future, but for now I think they have made the right choice to put it on a reasonably priced card. If they bundle some nice games with it, I think they might be on to a winner.
The code is being fully supported by most games companies... the physics card will be standard within the next 5 years. http://www.ageia.com/products/physx.html Clear synergy.
I don't know about the others they list, but Cryptic's implementation for City of Villains (and supposedly Heroes, unless they really are considering that "finished" now) was that the card would be used to handle the load of extra eye-candy bouncing around the scene; it wouldn't actually offload anything you need to play, and wouldn't make frame rates any better. I think the only way they can mess it up and cause it not to take off eventually is to make it hideously expensive, though.
At some point, isn't this the same justification as for the top-of-the-line graphics card? Sure, most games will play on older hardware, but people upgrade so they can get more eye-candy. On that basis alone, I think the PhysX cards will do well. I think the biggest challenge for the company will be answering the question "What's next?". The gamer market, while large, is finite, and unless they can continue to come up with bigger and better products, their future is necessarily limited. Nonetheless, I'll be ordering mine when they hit.
And the entire Unreal 3 engine. UT2K7 won't be the only game to use it. I'm with everyone else: building it into GPUs is probably the way it'll go forward, unless apps like CAD can be coded to make use of it. Of course, people that work with really intensive CAD tend to need the best GPUs around anyway. While I'd love to get one, I'd need to see what the difference is, whether it's framerate or eye candy or whatever. If they want me to spend $100+, it'll need to make a noticeable difference. To me this could also translate to spare CPU cycles, so I can be running even more in the background while gaming (as chances are most games that are coded for PhysX support will also be multi-threaded). I definitely love the idea; any way that makes my games more realistic is OK with me. Or however it works out: with GPUs, we all know what they do, but we'll need to withhold judgement on PPUs for now. Or indeed a bundle. If it's spending $200 and getting three compatible games, I'd be okay with that; it translates to the card being about $40, assuming you'd have bought the games anyway.
I wonder if the PhysX card will be as revolutionary for computer gaming as the first GeForce graphics card?
If implemented properly, then it could make a world of difference. However, given that it's a bit of a catch-22 situation, developers might not put as much time into developing sections of code for it to process as they would if it were already out and everyone had one. Given this, it'll probably take a few cycles of games trickling out with support, people slowly buying it, more games making use of it, more people buying it, etc., before enough people actually have one for it to make sense for the devs to spend a lot of time and money making the most of it. (Probably the same with multithreading to some extent, especially with it requiring quite a shift in programming technique and way of thinking.)
If it has any chance of winning a high-end gamer's heart, it will have to take enough load off the card/CPU to gain at least 20-50fps.
I think it should turn out to be a success, and personally, I'd like a dedicated physics processing unit. But then again, we shall just have to wait and see.