Discussion in 'Article Discussion' started by Tim S, 7 Nov 2005.
Looking good... it seems like nVidia are starting to pay attention to the sections of the market where they are really losing to ATi. This might be a good card for a system I am going to be building for someone soon.
P.S: There's a typo on the DoD page, it says "of real world game play on the dod_anizo map" when the map name is "dod_anzio". Too much writing about anisotropic filtering, I guess
cheers for the typo - I've always read it as anizo
Too much AF, as you say.
updated with some places where it is in stock in the USA.
still forgetting those of us with LCDs bigger than 15"?
performance at 1280x1024 anybody?
case in point:
my sister has this hardware and is looking to upgrade.
and stuff just looks UGLY at 1024x768
so in FEAR... what will be the best settings at 1280x1024? What is the performance difference by going with a 6800GT or faster?
I think having both APPLES TO APPLES and Best Playable for all games at more resolutions is most useful.
A2A is a hard fight, though, Evre. Look at FEAR for example. The AI in the game makes it almost impossible to have the same fight twice, regardless of how carefully you attempt to go through the map.
You can clearly see by the BPS graphs that 1280x1024 ain't on the list. That means that this card probably does not fit that market well with much turned on, and you need to look at going a size up. Bigz can't test every resolution with every bit of eye candy turned on or off...it would take forever. After all, one could sacrifice textures, lighting, shadows, etc. and go to minimum detail on everything and get the game to run at 1600x1200 on a 6600gt. But it will look like crap, and who would play that way?
The beauty of BPS is that 1024x768 is the highest he could get the game to run with most of the eye candy turned on. If you have a 1280x1024 and you don't want to run at 1024x768, these graphs clearly illustrate that you may need to choose from a higher bracket of graphics cards to run FEAR.
The performance difference between the two cards (6800GT and 6800GS) is negligible in real world play, unless somehow your eyes can detect a difference of 1 or 2 fps... that's the point of saying Best Playable. If something is 37fps or 39, it doesn't matter much - they'll still both be at 1024x768 with X amount of eye candy on, and those 2 frames don't make a better card.
So essentially, your answers are all there, you just have to look at what the graphs are saying. And there are even a couple of A2A comparisons at the back, based on the only games that have a repeatable-enough section - just in case you didn't pull the information from the other graphs for whatever reason.
You're looking at Low to Very Low detail in FEAR on either a 6800 GS or 6800 GT. I don't think FEAR looks particularly ugly at 1024x768, in all honesty... I think it'll actually look WORSE at 1280x1024 when you turn off things like shadows and lighting and ruin a lot of the creepiness in the game.
If you want to play everything at 1280x1024, you're going to need a 7800 GT - anything less won't play FEAR and look good at the same time.
If you want apples to apples in every game, along with best playable, I'd be happy if you could find me another 12 hours in the day - I could really do with 36 hour days.
Just to give you an idea of how long these reviews take - I spend anywhere between 6 and 10 hours with each video card (depending on how much tweaking is required). I got this card at 3PM Friday afternoon and I was working til 4am Saturday morning, up again at 9am, working til 3am Sunday morning, up again at 10am and then went to bed at 3am last night to start off again at 9:30am this morning and finished the review by 5PM. I would say there was around 30 hours of testing for this review and then another 10-15 hours of writing, creating tables, doing photography, etc.
I agree - FEAR does look better on high @ 1024 than on low @ 1280 on my Samsung 17" LCD. I tried both of these configurations before nVidia fixed the performance bug with the game in their latest drivers, allowing me to have the best of both worlds.
bigz: Everyone here appreciates the work that you and the Bit team put in; I find that the "best playable" style of review is extremely useful - the information that I need to know as a gamer is presented much more succinctly than with a large list of resolution/FPS/settings. Keep up the good work
Looks like this is the card for me! good blend of value and more power than the plain 6800 i was going to have to settle for. I just hope it clears what it has to in my x-qpack...
Sorry Bigz. After reading that again, I can see that it comes across in a very negative tone.
I do appreciate all the work that goes into doing a review.
and when I said 1024x768 looks bad, I meant on an LCD when it's not at its native resolution.
I originally thought that when BT did a review of a card, they were compiling a database of figures on that card. But with new drivers every month or so, I guess BT needs to re-benchmark every card to get current, accurate results for a review.
It's no problem. I did actually delete some of the more harsh comments I made - I've got a very short temper at the moment with not being 100%.
It's really really hard to compile a database of 'best settings' for a video card in any particular game, because they can change quite dramatically with a driver update (or even a game patch). I would love to be able to do that, and I'd love to be able to put even more in to these reviews to make them even more useful than what I understand them to already be based on the feedback that we get.
Often there just isn't enough time to get done what I want to do anyway, which means I end up cutting things short. I hate doing it, I really do, because I'd like them to contain everything that everyone would ever want. I'd love to include gameplay evals on 'balanced' systems and a cost-conscious system too, but that's doubling the time taken to do a review without even factoring in that a slower system is going to require more tweaking than a faster system.
It sucks. I try and get as much done as I can, but what I will do in future is at least make reference to what is possible at 1280x1024 in the text. I am really pleased that EA/DICE have added 1280x1024 to BF2 - I know it isn't the correct aspect for my CRT, but it is a resolution that a hell of a lot of people use. I'd rather be relevant to those TFT users as much as is possible, but some games would look so poor at 1280x1024 that I'd argue that finding a sweet spot at a slightly lower resolution is the best option.
I thought it was a well-written and thought-out review, so nice job on that.
I just don't like the new 6800GS, because I was planning on getting another 6800GT for now, and it seems that unless I get one fairly soon I will be out of luck.
Thanks for your kind comments.
I think the best place for you to look would either be the for sale forums or ebay. SLI can be used with two different vendors and clock speeds now. In fact, I must test whether a GS will work with a GT.... I'll test that tomorrow if I remember!
the articles always rock! i enjoy the technical content and the fact that the benchmarks are actual games and utilities rather than obscure numbers and homebrew statistics.
Half-Life 2 has been taken out of the test loop. Still a good test IMO, even if it's not as up to date.
Don't see what the point of releasing more midrange cards is, besides building hype. The 6200 seems to have the OEM range covered, 6600GT/6800LE has the budget enthusiast covered, 6800GT has the midrange enthusiast, and 7 series has the high end. Regardless, good review
I tell you what gave me a laugh: some other sites have used 3DMark as part of their testing for this card, yet the score is virtually identical with the ATI competition which the in-game testing proves are far slower.
Shows how pointless 3DMark is as a guide - you'd think the GS was the same as an XL, GTO or 1600XT based on the 3DMark scores, yet it's 25-50% faster in many games using Real World testing...
Half-Life 2 has been replaced by Day of Defeat: Source - a 6800 GT can run the game at 1600x1200, as can an XL, and I'm sure a GS can too. I don't find it a great deal of use these days, in all honesty.
It is to reduce costs of production and offer something at a new pricepoint. It's already below $200, and that is a ruck load of performance for $200.
I think what we're increasingly seeing with manufacturers rolling out these mid-range cards is companies filling out their product portfolios. Nvidia in particular is showing how it can take a viable platform like the 6800 (and, I am sure, the 7800 soon) and in short order create a card to fill a void, or sneak into territory that its existing range or price points don't cover effectively. ATi has been hurting recently: their cards are good, yes, but they are only now implementing support for technologies that Nvidia has had cornered for months. What concerns me even more is that their offerings, while still "good", are not earth-shattering, and the delays and supply shortages show that they are scrambling not only to get a viable product to market but to actually have a product available when it is supposed to be. The second part is interesting because ATi is an OEM as well as just a chip supplier.
This card is already $50 under MSRP in the states: http://www.clubit.com/product_detail.cfm?itemno=A9602793
That is a ruck load of performance for $200.
Prices still the same in the UK - nobody has really made a move on it yet.
Great review on the card. I wish I had waited a bit instead of buying a GTO card, but if I did that I would need a new motherboard. I will wait and see what I can get after Christmas. Maybe an Opteron 165 (939 dual-core) with a 6800GS - the 165 can be had for about $290 right now and is reported to overclock very well, with people reaching 2.6GHz on stock air cooling. Paired with a 6800GS, or maybe two of them, it would be sweet.