Discussion in 'Article Discussion' started by arcticstoat, 8 Nov 2011.
Am I the only one who feels like this massive cooler is completely wasted on a 560 Ti?
Tech Report just had an article pulled, presumably for breaking an NDA...
'New 560 Ti launches 29th November... 1280MB VRAM and 448 CUDA cores.
...so it may well be worth waiting until the end of this month if anyone is thinking about buying a 560, as the new specs will make it roughly the same speed as a 570, and supposedly the launch price will be the same as the regular 560.
Why would you not get a 6950 DirectCU II? It's even a pound cheaper, and the performance is WAY WAY better. (Price from overclock.net, not overclockers.co.uk)
2GB of RAM in a mid-range card designed to run games at 1680x1050 and maybe 1080p is already pretty stupid, let alone the price. At 220 pounds, you should really just add 30 pounds for the KPA2 GTX 570...
If you are looking for quiet, just get a better case ~_~"
fixed that for you
Or you could put the results right here to back up your original statement....
BTW, I'm not saying you're lying, but a statement like that should be backed up with evidence, or at the very least a direct link to it.
Bit-tech, could you improve your games tested to incorporate more modern, and therefore more demanding, ones please? The new CoD is perhaps not much of a need, since it will probably be quite similar graphically, but many, many people want to know how a card will fare in BF3, and many other sites post results that disagree, or measure by average framerate (pointless).
I've heard Shogun Total War is painful on systems, especially memory-wise
Need more Gainward availability in Canada.
I stopped reading bit-tech for their GFX card reviews simply because they take AGES to come up with a new group of games to test.
Not to mention, only a goofball would test a 2GB card capped with 1920x1080 resolution. 2GB+ is meant for using at 1920x1200+ (try gaming on a 2560x1440 Dell U2711 at native with only 1GB... blech!). Higher resolutions are key here.
Too bad they stopped listening to their users a long time ago...
But would a 560Ti have enough grunt to play modern games at 2560x1600 anyway?
I think the point Bit-Tech were making is that it's odd to have 2GB of VRAM on a card that struggles to get good performance past 1920x1200 anyway.
Me. I like quiet. I have an R3 and would still pick a quiet cooler anyday over an extra ~4fps or whatever a factory OC would give.
I can't believe this wasn't tested at 2560x1440. The only real reason for 2GB is very high resolutions, and it would be interesting to see if the 560 Ti has the grunt (when equipped with 2GB of VRAM) to keep up with more expensive cards at that res.
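For what it's worth, a rough back-of-envelope on how memory pressure scales with resolution (assumed 32-bit pixels and triple buffering, purely illustrative figures, not from the review; the framebuffer alone is small, it's textures and render targets that eat the rest):

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Approximate memory for a triple-buffered 32-bit framebuffer, in MB."""
    return width * height * bytes_per_pixel * buffers / (1024 ** 2)

# Compare the resolutions argued over in this thread.
for w, h in [(1680, 1050), (1920, 1080), (2560, 1440), (2560, 1600)]:
    print(f"{w}x{h}: ~{framebuffer_mb(w, h):.0f} MB")
```

The raw framebuffer only roughly doubles going from 1920x1080 to 2560x1440, but texture and geometry budgets scale up with it, which is why the 2GB-only-matters-at-high-res argument keeps coming up.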
Tbh I don't see why the standard test doesn't include a 27" or 30" monitor, many people have them and it is a useful way of distinguishing cards.
There's little point testing a mid-range card at super-high resolutions because in games such as Bad Company 2 and ArmA II, it's simply not going to cut the mustard, however much RAM it has. Even a GTX 580 can barely achieve playable framerates here; a GTX 590 is what you should be looking at, and even then the minimum drops to just over 40fps with double the RAM of the Phantom. What we conducted was a real-world test, pitting the card against the tests we'd expect it to face. Finding that the extra RAM made a difference, but the minimum frame rate was still unplayable, would be pointless.
We're holding off updating our games until we've found a suitable set of benchmarks and the next gen cards have landed. At which time we'll be updating our drivers too - interesting hardware in the graphics market is understandably scarce at the moment and it takes a heck of a lot of work to retest everything. It's not something we can do every time a new game drops.
2GB is only needed for three-way screen setups; a 560 only needs 1GB, as it won't be able to run at a screen res high enough to use all of the VRAM anyway.
Wait a second here: when did a MINIMUM of 40fps at maximum detail at 1920x1080 suddenly become 'not good enough'?? What are you guys on over there?? A 40fps minimum is MORE than enough. For good gameplay you generally want an average somewhere around 30-40fps. It'd be great to have 60fps in everything, but it isn't necessary to still nail the headshot. If a card manages a range of 20-60fps, that should be PLENTY to at least enjoy it. It may not be enough for professional competition, but in those cases they're dialing down the fluffy graphics anyway.
I take any ArmA test with a VERY large grain of salt. That game has such a poorly optimized graphics engine that even though it's already three years old and tops out at DirectX 9.0 graphics, it manages to bring a GTX 580 down to 50fps?? In my line of work, we'd never be able to release code that bad.
Stick with the optimized and heavily used engines: Unreal, Gamebryo, HeroEngine, id Tech, Frostbite, etc.
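The fps thresholds being argued over in this thread map onto per-frame time budgets like this (a quick sketch; the cutoffs are common rules of thumb, not figures from the review):

```python
def frame_time_ms(fps):
    """Milliseconds the GPU has to render each frame at a steady frame rate."""
    return 1000.0 / fps

# The review's 25fps playability floor vs the 40/60fps figures above.
for fps in (25, 30, 40, 60):
    print(f"{fps} fps -> {frame_time_ms(fps):.1f} ms per frame")
```

This is also why minimum framerate matters more than average: one 40ms frame in a run of 16ms frames reads as a stutter even when the average still says "60fps".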
Agreed. Everyone is predominantly sitting on their hands waiting for the next releases, but that doesn't mean you can't move forward with the games list. You don't have to retest EVERYTHING, just add a new one or two to the list and drop some others that are old DX9 engines that no one really concerns themselves with anymore...
Sorry, I think you misunderstood me there - 40fps is what the GTX 590 3GB manages at those resolutions. That is all. The minimum we consider playable is 25fps, but it seems highly unlikely that the Phantom, which costs less than a third as much, will manage playable framerates.
ArmA isn't a tough game to run just because it's not as optimised as more mainstream games - the game worlds are enormous and very detailed, the AI is very hardcore - a whole load of stuff really. It also scales with more powerful hardware which makes it a good benchmark - other tough-to-run games such as Crysis and FSX don't scale nearly as well.
Unfortunately, adding a single game to our tests means re-testing a considerable number of graphics cards - some of our tests are manual too, meaning we have to play through them scores of times, with several runs to make sure the result is consistent. At the moment there are over 30 individual test results for graphics cards at every resolution (90 tests in total if we do all resolutions) - dropping a new game in would mean testing all of these at around 10 mins apiece, plus reinstalling Windows when we switch from AMD to Nvidia testing. This doesn't include using the game to find a consistent manual benchmark to start with, either. In short, it's not something that can quickly be done every time a game is released, and especially not when there are new generations of cards just around the corner.
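Just to put those numbers together (using the figures quoted in the post above; the reinstall overhead is left out since no time was given for it):

```python
# Approximate cost of adding one game to the benchmark suite,
# based on the figures quoted in the post: ~30 results per
# resolution, three resolutions, ~10 minutes per run.
results_per_resolution = 30
resolutions = 3
minutes_per_test = 10

total_tests = results_per_resolution * resolutions
bench_hours = total_tests * minutes_per_test / 60
print(f"{total_tests} runs, ~{bench_hours:.0f} hours of pure benchmarking")
# ...before counting Windows reinstalls between AMD and Nvidia
# testing, or the time spent finding a consistent manual benchmark.
```

That's roughly two full working days of uninterrupted benchmarking per added game, which explains why the games list only changes between card generations.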
Fair enough, thanks for answering. I still think that a 27" or 30" screen should be included in the tests; I know it creates more work, but it's another way of distinguishing between cards, and very useful for people who have, or are considering, a large screen.
now this is interesting..........
"Disappointingly, the Phantom posted similar results to other stock-speed cards we've tested."
Disappointingly? No, *expected* -- it's a stock speed card, you said that at the beginning. To expect it to perform better is a bit ridiculous. What I want to know is why more concentration wasn't given to the overclock performance. With a cooler like that (7C idle and 37C under load -- very nice) this card's niche is in overclocking -- perhaps not in getting the best OC, but keeping the chips cool during that OC. How about checking noise, thermals, and power draw on the card once the OC has been set? Then show a comparison of performance against its stock settings, and relate that to the cooling differences with other cards at OC settings.
To have done all that testing of the card at stock speeds was both foolish and a waste of time -- it was set to reference specs when you got it, you should have known it was going to give you reference performance.
I hope there's a follow-up article coming ...
All the benchmarks are a complete waste of time: every card manages 30+ frames and most of them manage 60+. 30fps is already smooth, and more than 60fps is only noticed by a small number of placebo-prone people.
Only a handful of games get low(er) scores, like BF: BC2 and ArmA II - and ArmA II is an old, very badly written DX9 game.
I can play BF3 at 1920 res with everything on 'ultra' smoothly, and that's on a quad-core AMD with a GTX 460. Maybe the 16GB of RAM helps a lot, who knows?