
NVIDIA's GeForce 6600GT on AGP

Discussion in 'Article Discussion' started by Tim S, 22 Nov 2004.

  1. Kameleon

    Kameleon is watching you...

    Joined:
    29 Apr 2003
    Posts:
    3,500
    Likes Received:
    8
    Also, at what point does bigz say that 12fps is acceptable? :confused: I'd say that you'd get a far better frame rate if you turned the AA and AF down; they're absolutely killing your card. If you want to run high resolutions at high image quality, get some newer hardware.

    Oh, and please don't make me load any more uncompressed bitmaps :waah:
     
  2. Almightyrastus

    Almightyrastus On the jazz.

    Joined:
    21 Mar 2002
    Posts:
    6,637
    Likes Received:
    1,260
    Thanks for that, bigz and Etacovda - that's better than I was expecting on the pricing. Dropped a Post-it on my monitor as a gentle hint to see how generous people are, hehehe
     
  3. barry_n

    barry_n What's a Dremel?

    Joined:
    24 Aug 2004
    Posts:
    338
    Likes Received:
    0
    Why wasn't HL2 one of the games used as a test? No need to answer that, actually - I just want to know what the performance differences are.
     
  4. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    There is an HL2 article in progress - I wanted to finish the game before providing a good evaluation of HL2 gameplay on these video cards. :thumb:
     
  5. Etacovda

    Etacovda What's a Dremel?

    Joined:
    4 Nov 2003
    Posts:
    305
    Likes Received:
    0
    Well, it had better run HL2 well, because I'm buying one whether it does or not :p

    Nevertheless, the joys of a monitor that only does 60Hz at 1280 mean I'll be stuck at 1024, where the performance should be great ;)
     
  6. chunky_monkey

    chunky_monkey What's a Dremel?

    Joined:
    7 Nov 2003
    Posts:
    45
    Likes Received:
    0
    I am using... don't laugh... a GeForce Ti 4200. I can play Far Cry, Half-Life 2, Flatout and Unreal Tournament 2004 without any problems, and HL2 looks amazing even on my card. NFS Underground 2, however, seems to really hate my graphics card and runs like a dog no matter what I do - I even tried reducing nearly all the settings to minimum and running at 800x600. I know it has lots of whizzy effects (when I turn the settings up I can get a really pretty slide show going on), but it is taking the pee.
    I haven't been this annoyed trying to play a game since Flatout blue screened me when I tried to run it. Yes, blue screen, full on death lock up! Designed for Nvidia, the way it's meant to be played, just not for your nvidia card mate, it's too old, get an upgrade! :rolleyes: Turning off sun flares sorted that problem out.

    Did you find that even the cards tested had a harder time running NFS Underground 2 compared to the other games? I'm curious whether it has a poorly optimised game engine or whether my card just can't handle the effects.


    AMD Athlon XP 2000+
    512MB DDR400 RAM
    Geforce 4 Ti 4200
     
  7. TMM

    TMM Modder

    Joined:
    12 Jan 2004
    Posts:
    3,227
    Likes Received:
    2
    You're not the only one... I run at about 30-40fps at 1024x768 on a 9800 Pro!
     
  8. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    The demo has blue screened on every NVIDIA card I have tried.

    It's really poor; it runs slowly on an X800 XT Platinum Edition and a 6800 Ultra too, but seeing as I couldn't get Flatout running properly on NVIDIA cards, I wasn't left with many driving game choices. :(

    I'll have to do some more troubleshooting on that title at some point in the future.
     
  9. chunky_monkey

    chunky_monkey What's a Dremel?

    Joined:
    7 Nov 2003
    Posts:
    45
    Likes Received:
    0
    I'm not sure if it works for all Nvidia cards with Flatout, but if you set up your profile in the game, then exit and look in /Flatout/SaveGame, there is a file called options.cfg. Edit it in WordPad and change the entry
    Settings.Visual.SunFlare = TRUE
    to
    Settings.Visual.SunFlare = FALSE

    This sorted it out completely for me.
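    For anyone who would rather script the fix than edit the file by hand, here is a minimal sketch. The `disable_sun_flare` helper name is made up for illustration; the file location and key come from the post above, so adjust the path for your own install:

    ```python
    # Sketch: flip the SunFlare flag in Flatout's options.cfg, as described
    # in the post above. The helper name is illustrative, not from the game.
    from pathlib import Path

    def disable_sun_flare(cfg_path):
        cfg = Path(cfg_path)
        text = cfg.read_text()
        patched = text.replace("Settings.Visual.SunFlare = TRUE",
                               "Settings.Visual.SunFlare = FALSE")
        cfg.write_text(patched)  # overwrite the config in place
        return patched
    ```

    Run it against your own save directory, e.g. `disable_sun_flare("C:/Games/Flatout/SaveGame/options.cfg")` (path hypothetical).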
     
  10. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    I'll have a look, cheers for the heads up :thumb:
     
  11. Darkedge

    Darkedge Minimodder

    Joined:
    26 Nov 2004
    Posts:
    363
    Likes Received:
    0
    Why the hell did you decide to use the stupid apples-and-oranges scale for the review? You cannot compare non-empirical data like that - it makes this article almost useless.

    Having two different settings on the same scale like that - even though you point out that there are two different settings - is confusing. If you do that, have the bars separate from each other to imply that they shouldn't be directly compared. I do understand the 'usable' settings of the games, but it still skews the data. When we are looking at two cards and how they perform compared to each other, they should be using the same settings so you can easily see which card performs better.

    Shame Bit-tech is now throwing away good standard scientific practice/analysis and actually giving out results that end up looking biased. :wallbash:
     
    Last edited: 26 Nov 2004
  12. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    We've been doing this for a while, and many people appreciate it for what it is. It's not to find which is fastest; it is to show the reader and consumer what they can expect to achieve when they actually sit down and play games - those things that you use video cards for.

    Why? You can see which card performs better, because it is capable of delivering a playable frame rate at a higher resolution and detail setting. Surely finding a playable frame rate is what gaming is all about, unless I'm very much mistaken. Also, we choose to focus on image quality, because a card can look very good from a numbers perspective, but lo and behold the image quality is absolutely awful. For example, right now I would prefer to run a Radeon 9800 Pro in NFS: Underground 2, because it offers far superior image quality to the GeForce 6600GT, and consequently delivers a better gaming experience.

    What's wrong with moving away from the norm? The current "pissing" contest that seems to dominate the graphics wars at the moment is getting rather silly, because once you get above a certain frame rate, anything extra is pretty pointless - it's 100% playable, it can't get any better than that, really speaking... or is there such a thing as 110% playable?

    Why are the results biased? The GeForce 6600GT AGP trounces the Radeon 9800 Pro at a very similar price - where is the problem in that? I don't see a problem anywhere, to be honest. If there are issues with either video card, it is my duty to report them; it's something that many reviewers miss out on - if there is poor image quality, it often doesn't get mentioned. Sorry for mentioning and dissing products that don't deliver everything that they should. :blah:

    You're the first person to state that things aren't the way they should be... We're not moving away from the way we do things now, but would it make sense to add an apples-to-apples page to reviews in order to show which card wins the "pissing" contest? I personally think that it would confuse matters, but I'm open to feedback and suggestions on this.

    The way that video cards render things nowadays is far from apples to apples, which is why we have moved right away from apples to apples. For example, in Splinter Cell: Pandora Tomorrow (not used in this review), NVIDIA boards use a completely different shadowing technique known as buffered shadows, whereas ATI use a method known as projector shadows, which can often return quite different results.

    Far Cry is another classic example: the GeForce 6600GT makes use of Shader Model 3.0, which consequently improves performance, and thus image quality (in the way that we do things), as we can typically increase resolution/detail settings to allow the video card to deliver a better gaming experience. The Radeon 9800 Pro uses the ageing Shader Model 2.0... are you telling me that I should cripple the GeForce 6600GT and use Shader Model 2.0 on it to represent an apples-to-apples comparison? In my opinion, that is bias, because you are not making use of a greatly improved way of handling shader instructions. The bottom line is that Shader Model 3.0 does not have a degrading effect on image quality, which is our number one concern. Optimisations, or other such tweaks, which cause a drop in image quality that is noticeable to the gamer are something we do not agree with - optimisation is something we do agree with, providing that it does not force a degradation in image quality.

    I could go on all day as to why our method is best, but I will leave it at that for now.
     
    Last edited: 27 Nov 2004
  13. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    Last edited: 29 Nov 2004
  14. play_boy_2000

    play_boy_2000 ^It was funny when I was 12

    Joined:
    25 Mar 2004
    Posts:
    1,618
    Likes Received:
    146
    I'm probably treading on thin ice here, but if you have the time, bigZ, perhaps you could pick a standard resolution (say 1280x1024) and just run the game without any AA or AF to satisfy some of the nitpicks. The idea of best playable settings is great, but some level ground to compare things is also nice. I'm also interested in how often the FPS dips to the minimum you supply. I personally don't mind dips as low as about 15 FPS, as long as it's not popping up all over the place. In HL2 I find that I drop to 5 FPS until about 5 seconds after a loading screen, and occasionally on the hoverboat I'd drop to about 10 FPS when going decently fast and rounding a few corners, and it didn't bother me in the least. I'm not saying that giving the lowest FPS is a bad idea (in fact I think it's great), but I don't think it should be taken into consideration when it drops to 20 FPS for a few seconds and not again. Also, does the bridge chip on the Nvidia card play any part in anything (i.e. produce lots of heat, limit OC'ing etc.)?
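    As a rough illustration of the point about brief dips (the numbers and the `fps_stats` helper are made up for the example, not taken from the review), a single stutter drags a bare minimum-FPS figure down far more than it affects the average or a "worst 1% of frames" figure:

    ```python
    # Illustration only: why a lone hitch makes "minimum FPS" look scary.
    def fps_stats(frame_times_ms):
        fps = sorted(1000.0 / t for t in frame_times_ms)  # per-frame fps, ascending
        avg = sum(fps) / len(fps)
        worst = fps[: max(1, len(fps) // 100)]            # worst 1% of frames
        return avg, sum(worst) / len(worst)

    # 99 smooth frames at ~16.7 ms (about 60fps) plus one 200 ms stutter (5fps):
    avg, one_pct_low = fps_stats([16.7] * 99 + [200.0])
    ```

    Here the average barely moves from ~60fps, while the 1%-low (and the raw minimum) sits at 5fps - which is why a dip that lasts a few seconds and never returns can make a minimum-FPS number look worse than the game actually feels.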

    All in all, a great review bigZ, keep em coming! :thumb:

    EDIT... also: the link you provided for one in stock - is that a price we can generally expect for these cards? I just had a pop over to Newegg and most of the 9800 Pros are quite a bit cheaper than the 6600GT. Granted, I used a US-based store to compare, but I'd be interested to see a price/performance ratio once more 6600GT cards start to appear and a 'good price' for one becomes apparent.

    Oh, and the only 6600GT AGP on Newegg was $240 USD, whilst a 9800 Pro with a 256-bit memory bus was $185. The Nvidia one might have had some better games included, though.
     
    Last edited: 30 Nov 2004
  15. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    I don't mind you asking and with sufficient feedback, I think it will be something worth adding in - at least for a couple of the more popular titles.

    The frame rates do dip quite often during the testing, and they would dip to these levels during very graphically intense scenes, which happen quite often in some of the sections I've chosen. You could quite possibly increase my settings slightly, but you would experience a bit of lag every now and again. I'm trying to focus on gaining the smoothest, best-looking frame rate possible. :)

    I'm considering implementing detailed frame rate graphs, but the problem is that the scenes we use can get very complex, so finding the happy medium between ultra detailed graphs and simple to read graphs is a tough one.

    I didn't see it playing any part in poor overclockability, the video card overclocked quite well. The bridge chip heatsink doesn't get any hotter than slightly warm to the touch.

    Thanks.

    I've seen mixed prices; Sparkle's cards tend to be very cheap and good value, but there tend to be no game titles included. The XFX 6600GT on AGP is priced at around £165 on Scan.co.uk, so I would say anywhere between £150 and £170 would be a reasonable price for the 6600GT over here.

    Over in the US, I think we're looking at around $210-$240 for the 6600GT, depending on the bundle and who the board partner is, of course. The 9800 Pro can be priced at up to $200, at least the last time I had a look at the egg. There were cheaper ones, but it depends whether you want a bundle or not. :)
     
  16. chunky_monkey

    chunky_monkey What's a Dremel?

    Joined:
    7 Nov 2003
    Posts:
    45
    Likes Received:
    0
    Glad to help :D Now you just have to deal with the fact that the game is very good but VERY hard when you get to silver level! I am close to giving up :waah:
     
  17. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    Yes, Silver is hard - I got through the Bronze level in a little over 2 hours :D
     
  18. Darkedge

    Darkedge Minimodder

    Joined:
    26 Nov 2004
    Posts:
    363
    Likes Received:
    0
    Ahh right, I see your point about the pissing contest and playability settings, but I have to say a lot of playability settings are personal opinion. It is a good way to draw a line in the sand, though - just not 100% to my taste.

    I still feel a one-graph direct comparison would highlight the issues as well as your current graphs do - also pointing out more clearly that the other graphs aren't strictly on the same scales.
    I'd also like some spacing - even a couple of pixels - between a Radeon on 0xAA and a GeForce on 4xAA. If people are just glancing at the review, they will notice more readily that they are on different settings.

    Just my 2c. Anyway - keep up the good work! :thumb:
     
  19. pr0xZen

    pr0xZen What's a Dremel?

    Joined:
    5 Nov 2004
    Posts:
    145
    Likes Received:
    0
    I think I see and understand what most people are saying here. I understand and generally respect others' opinions and reasons, but that doesn't mean I accept them. But seriously - there is a point to this. It's a somewhat new path to take, giving people results that show performance at the levels at which it will mostly be used.

    But then again, I understand very well what some here are saying: by doing so, the line of performance difference is getting kind of blurry. It would not be all too dumb an idea to show what these products do at approximately the same settings and levels. It's of course understood that doing so is limited by technology. If a product, like the one here with Shader Model 3.0, can make use of technology advantages, it should do so. It is, after all, a part of the product.

    Guess it all boils down to this: must there be one, or is there time and capability to give us both?

    Let's hope the XFX can show some muscle - it should be here any time soon, I hope. Finally it's time to lay the GF4 Ti4400 to rest.
     
    Last edited: 13 Dec 2004
  20. El_JimBob

    El_JimBob Minimodder

    Joined:
    28 Dec 2003
    Posts:
    471
    Likes Received:
    1
    Got my XFX from Scan last week and so far it's been a mix of marvels and frustrations....

    The card is certainly potent, but very fussy about which drivers it uses. Going back to the Flatout issue mentioned previously, the demo would constantly blue screen on me as I approached the first bend of the first lap. I tried every single driver release (from at least October) and had no luck. The latest Omega optimised drivers (66.81) finally allowed me to play through the demo.

    Doom 3 runs perfectly at 1024x768 with 2xAA and 2xAF.
    The new NFS: Underground 2 runs like a dream at 1024x768 (with ALL the eye candy turned on) and looks simply stunning.
    Half-Life 2 performance is erratic - not sure if it's a graphics driver issue or an unfixed bug, but the game stutters like a good 'un on my machine. I get several blue screens every time I play (as with Flatout, the Omegas stopped the blue screens, but the stuttering remains), and the stuttering renders some parts of the game completely unplayable.
    And before anyone tells me about the patch that was released :D I did update, and it actually made performance worse (I'm using a console code at the moment to 'roll it back' before the fix).
    On the rare occasion when it runs smoothly, I'm getting 60+fps at 1024x768 with all details on full plus 2xAA and 2xAF - it looks cracking...

    Excellent card so far, but I'm deducting a few points until I know more about the HL2 issues...
     