Okay, I'm curious... which PCI-E card (by either nVidia or ATI) offers almost identical performance to the old GeForce Ti 4200? E.g. if they went head to head in identical systems, gaming would be much the same experience on either. The reason I ask is that I may upgrade this summer, but I can't be sure of my budget (still got to get a job, and I don't know how many weeks I'll be able to work for, it all depends on Oxford's term dates), so a graphics upgrade may have to come further down the line. However, as I'll be moving over to AMD64 I'll need a cheap new graphics card to replace my Ti 4200.
You can't really match anything up to it, to be honest. You're going from a DirectX 8.1 card to a DirectX 9.0 card, so the graphical demands placed on the video card are much higher: the API is far more advanced, with longer shaders, more complex geometry, etc. I guess a card that would perform similarly would be something like the X600 Pro, but it's a fairly overpriced part. You could pick up a GeForce 6600 128MB for under £80, and possibly close to £70 I would suspect.
So... if I want a comparable frame rate in MOHAA or NWN with a DX9 card, I need a faster card. Hmmm, so X6/700 level or nVidia's 6600. The X600 XT is about £105-110, that's about what I paid for my Ti 4200 back in the day. I was hoping performance-wise things would have come down in price, ho hum... Ohhh, just found an X700 Pro for £94.73, that's looking a bit better, guess I'll have to shop around a bit. Is the X700 Pro comparable to nVidia's 6600 GT, or is it less powerful?
Depends on the title, but the 6600 GT is a very strong performer, in essence. It will also look much prettier in general too, not just deliver the same frame rate. The GeForce 4 Ti 4200 runs an older Shader Model 1.1 path (DX8), whereas the Radeon X600, X700, GeForce 6600, etc. use Shader Model 2.0 (DX9.0) or SM2.0b/SM3.0 (DX9.0c)... the image quality improves quite considerably between SM1.1 and SM2.0, but the difference between SM2.0 and SM2.0b/3.0 is fairly small because most of the SM3.0 features are too much of a performance drain on today's hardware, if ya get me. For example, the HDR rendering in Far Cry that could be enabled on GeForce 6 series cards is a massive performance hit on a GeForce 6800 Ultra, never mind on something like a standard GeForce 6600. You can read about HDR here for some clarification of the performance hit with SM3.0's features: http://bit-tech.net/article/145/5
Thanks for the shader 1.1-2.0 info, I hadn't found that. I've been avidly following your reviews like a starving man looking for scraps, so I knew about the insane tech demands of Shader 3.0 functions.

Reading reviews at Bit has made me very critical of other sites. They either don't turn on the eye candy (no AA or AF), choose resolutions like 1024x768 which don't give a true reflection of what the cards are capable of (it's the higher resolutions that actually tax and separate the cards), or choose a selection of games which favours their favourite brand, either ATI or nVidia. Many sites quote a frame rate but don't state whether it's the highest, the average, or the minimum frame rate. It's just a nightmare. There are just too many fanboys to know who to trust. A good indication is to look at which cards a site has reviewed in the past and what verdicts they got: if they favour ATI cards or review nothing but them, you've stepped into fanboy territory, so leave your trust at the door please and bring salt in with you.

Bit's reviews give us a good indication of performance at all levels, and you can buy a card knowing what to expect from it... would be nice if there were more, maybe a roundup, but we all know you guys are stretched as it is. We get spoilt here.