Discussion in 'Article Discussion' started by CardJoe, 5 Dec 2010.
So what happened to the 6850?
DX9 is now crap and it's letting console/PC studios get away with lazy ports; the sooner it's dumped, the better. DX11 gives studios no excuses.
I agree. I have a 9800GT and I often wonder how much extra performance I'd get from a newer card. I'm not that bothered about DX11, so I might see if I can get a GF280 for a bargain.
I can also see where you're coming from, but please reread my post. I think it's a stupid thing to buy (graphics) hardware for the "game du jour". I never said people were stupid. There's a subtle difference.
Also, nowhere did I imply that everyone should go out and buy high-end hardware. If anything, I was making a case for buying "generically good" PC hardware: hardware that will let you run most things in a tolerable way (but not necessarily perfectly).
Yes, I am...
My 5770 Vapor X plays this mighty fine at 1920x1080 with 2xAA and 16xAF at max details. Only experienced slowdown on a couple of occasions. These games on old engines can keep coming as long as they can get them to use quality textures.
LOL, I got the CoD Black Ops Edition of EVGA's GTX 580, with zero interest in actually playing the game. I just liked the cooler sticker better than the normal Superclocked Edition's cooler sticker. Same card otherwise, same clocks even.
No three-display/Eyefinity/Surround gaming numbers?
Agreed. They make it sound like AMD drops by a lot, but when you look at the numbers it's not that far behind, and if you didn't have an FPS counter running in the corner, I bet a person couldn't even see or notice a difference.
Also, I didn't notice: which AMD drivers were you running, and did you have the latest gaming profiles installed?
Are you serious? An iPhone can run this.
Black Ops lacks DX10 and DX11 features, so what are we proving here again? This is nothing more than a console port, so why bother, unless Activision or Nvidia paid you guys, which I suspect. An integrated HD 3300 can run this at over 30 FPS with 8x AA and 16x AF.
You should have done this with Metro 2033 with DOF and tessellation on, instead of this trash of a game with its antique engine. What a waste of print.
I think one of the bit-tech podcasts said it best: CoD will run on a toaster. Clearly BLOPS is no different.
I have similar feelings about your post.
BLOPS is the hot-ticket, fast-selling game right now, and we wanted to demonstrate that it's not a hardware monster you need to upgrade for.
They're very specialist setups in the vast minority. Sadly, we don't have time to test everything. The same goes for 3D performance. We might look at these setups again in the future, though, in their own article.
Cat 10.11, IIRC. The article was meant as a lighter read, so we didn't want to include a test setup page. We tried for a good while to get the HD 5970 working properly, but it just wouldn't - the card was using both GPUs, but the results were even worse than those listed. In the end we disabled Catalyst AI, which produced the superior results in the graphs. As usual, though, driver tinkering will improve this, but that's standard for multi-GPU cards. Again, time constraints, and the fact we can't test everything or wait forever for driver profiles and beta drivers.
Being rude will get you nowhere, and isn't appreciated, especially mud-slinging about our much-defended impartiality. There seems to have been a slight mistake in missing the HD 6850 numbers - they'll be added tomorrow once I'm back in the office.
The title of the article is..... Again, it was meant as a light article to show that "hey, you probably don't need to upgrade your GPU to play this game."
Bit of an Nvidia infomercial!
Meanwhile, back in the real world, anything over 60 fps makes no difference... you can't see it. Given a 5770 will give you 50 fps max and 28 fps min at 1900/1200 with everything on, this article should have concluded that if you're buying a card for this game only, you'd be mad to spend your money on anything else!
However, this article seems to think that a 460 1GB is the right choice WHILE NOT INCLUDING THE 6850 IN THE TEST... AT ALL!
Piss poor, Bit-tech!
I'm so freaking tired of reading this stupid nonsense urban legend...
So the GTX 580 is the fastest one out there - very much expected.
The 5870 beating the confusingly named, crappy 6870 - also very much expected.
A CoD article full of kids thinking they've got a point - another expected comments thread.
(I'll never cut the 6870 any break; it has made me lose a lot of resale value on my old 5870. Admittedly it's a good midrange card, but naming it to sound like the 5870's upgrade is just f'ed up.)
I think he was too rude about it, but I do share his opinion. As a long-time reader of the site, I know you guys are unbiased. However, advising a GTX 460 1GB while omitting the 6850 from the article (benchmarks AND conclusion) is just not right.
On top of that, you aren't releasing a hardware buyer's guide this month because of all the shiny new releases next month, but you are happy to advise people on buying a certain GPU for one game, while that particular game benefits more from the CPU than the GPU. It's all terribly inconsistent.
Now, I don't really find this too interesting, as I don't play BLOPS and I know which GPU to get, so this article is not for me. However, I do mourn the decline in quality of hardware articles on my (still) favourite site. I'm frankly a bit underwhelmed by your reaction.
You can add the 6850 numbers to the article, but that does not fix the real flaw: this article is in direct contradiction to your November buying guide, where you picked the 6850 over the 460. There can be reasons to pick the other card this time around, but without an explanation, the article is flawed.
Nvidia's last top card was the 8800 GTX; since then they have been playing catch-up with ATI. Next question.
Yours in "Buy the Best, Ignore the Hype", Plasma
Yay! My 8800 GTS works just fine
So CoD:BLOPS will work on pretty much any graphics card from the last three generations. Great, but what about consoles?
Seriously, I've tried everything to get CoD:Balls running on mine. Blowing in the slot, trimming down the strangely shaped and oddly thin cartridge, but to no avail. Getting good at all is taking AGES.
I was sure my 8800GT would melt if I bought this game, I might have to buy it now.
Was the test rig the one John_T listed? My GPU might be up to the job, but I am afraid my trusty olde E4500 won't run fast enough.
I would say that an article on CPU benchmarks for the game would be more useful, since this game is a poorly optimised CPU hog, mainly because it's the by-product of console-focussed development. The GPU benchmarks yielded what I expected for this game, as it appears to run just fine on most hardware.