Discussion in 'Article Discussion' started by Tim S, 3 Oct 2009.
Half-Life 2 was an ATi game if I remember correctly... hence the coupons, because the game wasn't ready.
They are supporting driver-forced MSAA. The option in game just enables the driver workaround that Nvidia implemented for all UE3 games. It was announced back at the G92 launch and it's the same capability in Batman.
I think you're getting confused.
PhysX is proprietary, because it only works with Nvidia hardware (I don't like the fact that the game physics are limited to one vendor... and neither do you guys!), but the work they do on TWIMTBP goes well beyond just adding PhysX (or support for AA). Quite a lot of it is fixing the game (Nvidia's developer tools are pretty popular, and things like NVPerfHUD work on ATI hardware), getting the developer to implement a proper PC interface, and adding extra content so that the port from the consoles feels less like a port.
Developers often have a minuscule budget to spend on the PC version of their game because that's not where the money is, and the publishers are all about ROI. Because of that, they often have very little time left in the development schedule after the console versions, and it's why some console ports are exactly that - they've had no TLC, and some even have the console button mapping on screen (press A to start the game, etc.).
So nVidia invests the money, ATI can't be half-arsed to do the same, and then they complain?
AMD = Advanced Marketing Drivel.
I think ATI forget why a lot of people buy powerful graphics cards...for games! The drivers and the optimisations are every bit as important as the hardware.
ATI invests money in making better hardware for the consumer instead of giving money to developers to lock out any features that might give ATI a greater edge in performance.
It's not quite as clear cut as that. Nvidia spends a lot of money with a lot of developers, but actual hard cash changes hands "very rarely" according to Tony Tamasi, who runs the group. Most of the time it is spent on developer support with engineers, developer tools, the Game Test Labs in Moscow for debugging code and much more.
AMD also spends money on content, but it's less widespread than Nvidia. Here are a couple of recent games that AMD has spent money on: Codemasters for bundling Dirt 2 (the first DX11 game... I know Battleforge has DX11 content via a patch, but I'm talking about the first game to ship with DX11 in the box); and Techland on co-marketing and bundling of Call of Juarez (the first real DX10 game... yes, there were Lost Planet and CoH, but they were again patched or had a novel DX10 implementation).
AMD has worked with other developers, but it's not clear whether they're entering into financial agreements where money changes hands.
One thing that's worth thinking about is something an Intel exec said to me a few years ago (this isn't a direct quote, but the meaning is all there): "without great software, the hardware is nothing, no matter how great the hardware is". Nvidia's largest engineering group is its software team... there are over 1,100 driver engineers, 200 engineers working on content/developer relations (across HPC/CUDA, mobile phones, PC gaming, etc). I know Intel also employs more software engineers than hardware engineers (not including process technology) and I presume the same is true for AMD - they unfortunately don't talk a lot about it though.
Exactly, so while ATI may get a bit of a hardware edge from time to time, its performance is hampered by dreadful drivers and games that are released without the driver optimisations needed to make them shine, whereas Nvidia and their labs work with the developers to get the most out of the hardware. You pay more for an Nvidia card, but at least you know the software is going to work well with it, because the vendor has made the effort.
I wouldn't go as far as saying current ATI drivers are dreadful. Batman: AA aside we've actually seen very few problems with ATI drivers in the lab with the 4xxx series cards (except the HD 4870X2). Shoddy drivers are more of an excuse with the HD 3800 series and earlier. In fact we've actually seen more anomalies with Nvidia drivers in our benchmarks with GTX 275s outperforming GTX 285s and stuttering in games like Fallout 3 where ATI cards breeze through them.
Absolutely, we wouldn't say the 5870/5850 were the cards to buy at the moment if the drivers sucked the big one!
Yeah, I apologise - I was a bit confused. At any rate, I'm only waiting for the GT300 to see if their image quality in AA/AF has improved. Once that happens, I'll be pumping frames in Oblivion with max filters.
Reading the EVGA forums... as you know, EVGA is an Nvidia reseller, and guys on the forum are getting new members to buy the GTX 295 over the ATI cards there =\ I mean, there's fanboy... and then there's those guys.
I'll wait for the GT300-series cards, and then I'll be in the market for a replacement for my 8800 GTS/640. Until then I'm impressed by the HD 5xxx series, especially the power draw [idle is w0000t!?], but I won't make a move.
PhysX in this title is also "The Way It's Meant to be Gimped". Beyond3D have a workaround here http://forum.beyond3d.com/showthread.php?p=1332461 that allows the PhysX load to be spread across all cores instead of being limited to just one. So if you have a decent quad, or even better an HT-enabled quad, you should be able to get decent performance with PhysX running on the CPU. For AA, just force it in CCC, but honestly, the last game I had to do that on was TCOR: Butcher Bay.
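The general idea behind that workaround - letting per-object physics updates run on all cores instead of pinning everything to one thread - can be sketched roughly like this. This is a hedged illustration of the concept only (the names `step_body`/`step_world` and the toy gravity integration are my own), not what the actual Beyond3D patch does:

```python
# Illustration only: distributing per-body physics updates across CPU cores,
# rather than running every calculation on a single thread.
from concurrent.futures import ThreadPoolExecutor
import os

def step_body(body, dt=1.0 / 60.0):
    """Integrate one rigid body (position, velocity) forward one fixed step."""
    x, v = body
    v += -9.81 * dt          # apply gravity
    x += v * dt
    return (x, v)

def step_world(bodies, workers=None):
    """Update all bodies in parallel across the available cores."""
    workers = workers or os.cpu_count()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(step_body, bodies))

# Eight bodies dropped from height 10, at rest:
bodies = step_world([(10.0, 0.0)] * 8)
```

In a real engine the hard part is that bodies interact (collisions, joints), so the work can't be split this naively; that's part of why a CPU PhysX path needs real engineering effort rather than a simple thread-count switch.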
As I see it, this is a way for nVidia to feed or revive the (now outdated) assumption that you need to wait for drivers and patches for games to work well on ATi hardware. TWIMTBP is a good thing in general, but it loses its value when it is used to promote nVidia hardware (which is now a generation behind) by limiting the features available on alternative hardware, whether that's PhysX on the CPU or AA on the GPU.
NVIDIA: The Way it's meant to be Gimped.
Basically, ATI says that it is unfair for NVIDIA to invest money on its end to make games play better on its hardware, and believes that NVIDIA should also make the games work on ATI hardware. Seeing as how ATI owns NVIDIA, this is clearly logical, and therefore NVIDIA is failing to support its parent company. But let's not forget that AMD also owns ATI, and therefore NVIDIA, and so NVIDIA should make it work better on AMD CPUs; and because Intel owns all of them, NVIDIA needs to add special i7 support. But most importantly, because all of these companies use silicon to make their chips, I think it is logical that sand also be supported in the games by NVIDIA, and because parrotfish make sand in tropical reefs, clearly NVIDIA needs to focus on the interests of the parrotfish above all else. In fact, all game development from all companies should be delayed until the needs and demands of the parrotfish are discovered and provided for.
Only in this way can everything be fair! Thank you, ATI, for stating the clear and true logic above. How could we have thought of this without you?
It's more like nVidia really had no reason to test it on ATI. Why in the world would you do your competitor a favor by fixing stuff to work with their hardware when your job is to implement it to make sure your hardware works?
I see it as a case of butthurt because AMD can't get their sh*t together.
Like how Opera complains that IE ships with Windows. Get the f*ck over yourself and do a better job of marketing your crap then, FFS; don't sit there and cry like a little b*tch because you're not doing anything to improve the experience on your hardware.
AMD seem to expect things to fall into their lap.
It was the PhysX blocking in Batman where they went too far, IMO.
I know folders love Nvidia because it's just better for that, and programs like Badaboom and vReveal (I use these) are very nice for speeding things up - something ATI has to get on with for DX11. I'm positive Nvidia could have had something out to match this card if they weren't milking it so much... Just be happy ATI is around to keep the pressure up; without them, Nvidia would be full of overpriced fail, waiting on Larrabee to mature. Glad ATI has finally come out with a good one this time.
I mean, these arguments like on EVGA... I saw a guy just the other day buy a GTX 295 over the 5870 on a mass forum recommendation - he was asking about the two. That, to me, is straight fanboyism: you're buying a DX10 part that relies on SLI, and somehow that makes sense to these guys.
And look, the 5870 caused them to turn around and attempt a 3-billion-transistor part =] But then, listening to them say it will be out by the end of the year... you've got to start wearing boots.
You're kidding, right? I can remember problems with ATI's X1900 drivers, especially for CrossFire. The performance was pathetic, and I questioned whether a dual-card setup was right. I stuck in a couple of 8800 GTXs, and Nvidia have proved they can get their drivers working a damn sight better than ATI ever could.
I'm not a fanboy - I'm prepared to give either Nvidia or ATI another crack of the whip... but the way it's going, I think I'll stick with Nvidia for my next purchase. PhysX isn't the show-stopper it was promised to be, but it would be handy to have support for it anyway!
But isn't this news bit all about shoddy drivers? While you guys have done a better job than most news outlets of dismissing their laughable claims and allegations against TWIMTBP, it's amazing how much play this story has gotten, as if somehow Nvidia has done something wrong. They've simply done what any successful company would: invested resources to support and improve their own products. Somehow AMD and their supporters think those benefits should automatically transfer to their own hardware, as if Nvidia has some obligation to support them too. It makes no sense whatsoever.
In any case, it should be obvious this guy McNaughton has no credibility. To claim they couldn't get early enough access to RE:5 is not only a joke but an insult to the reader's intelligence. I guess AMD is going to claim this one just popped up on their radar at the last minute, given it's in the top five for console sales this year, has been complete for months and already had a PC benchmark released months ago. Same for NFS: Shift, another highly anticipated title from a major publisher, EA, so there really shouldn't have been any surprises there. Lastly there's Batman: AA, which has generated a lot of buzz for at least a year and of course is garnering deserved GOTY buzz.
For AMD to claim they didn't have time to work on these titles, but instead spent their time and resources focusing on garbage features for garbage titles like BattleForge, Stormrise, HAWX and STALKER Pripyat under their own Get in the Game dev-rel label, is a slap in the face to their customers, plain and simple. They have poured money into Dirt 2 for DX11 support, which is a promising title, but I think it's obvious Nvidia is better managing its resources with its selection of TWIMTBP titles, as demonstrated by the resulting additional features and product.
FYI, this workaround simply reduces the processing load by decreasing the number of calculations used for PhysX, most noticeably collision detection. This leads to some unexpected and undesirable results...
But to better illustrate how poorly the CPU handles advanced physics calculations, you can see below:
Looks like Batman stepped in some gum... then some toilet paper... hell, I guess he just figured he might as well tar and feather himself for the fun of it. Also notice the horrendous frame rates. That's what you get with these hack-job workarounds instead of proper vendor support. You can run PhysX at full fidelity on the CPU by turning off GPU acceleration, you just get unplayable 5-10 FPS frame rates, even with the fastest CPUs on the planet.
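Why does cutting the calculation count make cloth and debris misbehave like that? Physics engines typically satisfy collision and cloth constraints with an iterative solver, so fewer solver passes means the constraints are left visibly unsatisfied. A minimal sketch of that effect (the chain-of-points setup and function names here are my own toy example, not PhysX code):

```python
# Illustration only: an iterative constraint solver converges gradually,
# so cutting the iteration budget leaves larger constraint error
# (stretched cloth, interpenetrating objects, etc.).
def satisfy_chain(points, rest_len, iterations):
    """Relax distance constraints along a chain of 1-D points."""
    pts = list(points)
    for _ in range(iterations):
        for i in range(len(pts) - 1):
            err = (pts[i + 1] - pts[i]) - rest_len
            pts[i] += err / 2      # pull both endpoints toward
            pts[i + 1] -= err / 2  # the rest length
    return pts

def max_error(pts, rest_len):
    """Worst remaining constraint violation along the chain."""
    return max(abs((pts[i + 1] - pts[i]) - rest_len)
               for i in range(len(pts) - 1))

chain = [0.0, 0.5, 1.0, 1.5, 2.0]        # segments should be 1.0 apart
cheap = satisfy_chain(chain, 1.0, 2)     # "workaround" iteration budget
full = satisfy_chain(chain, 1.0, 50)     # full-fidelity iteration budget
# cheap leaves a much larger residual error than full
```

With only a couple of passes the chain stays visibly stretched; with a full budget it settles. That's the trade-off the workaround makes: fewer calculations, playable frame rates, glitchy results.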
Also, the common claim that Nvidia's TWIMTBP somehow hurts all consumers is clearly false: it benefits all of Nvidia's customers, which by any metric is the vast majority (2:1) of gamers and PC gaming hardware purchasers. As for the AA issue, it's more of the same with UE 3.0: if you own a UE 3.0 game, having to force MSAA via the driver shouldn't be anything new, especially given that UE 3.0 has always required compatibility flags that aren't normally exposed, for both AA and SLI/CF. For Nvidia users that means relying on nHancer until a driver update flags those bits. For AMD users, it means renaming your game .exe to UT3 or Bioshock until they get around to flagging the proper bits in their (hidden) game-specific profile.
rep++ for that! I lol'd.
You do realize we're in 2009, right? And it's pretty close to the end as well...
AMD's [oh, yeah... in case you didn't know: they bought ATi a while ago] drivers are a lot better than in the days when you could still buy new X1xxx cards, whereas nVidia seems to have gone from near-perfect driver support to pretty awful in some cases and a bit better than average in the vast majority. Resting on their laurels, it seems.