Discussion in 'Article Discussion' started by Tim S, 8 Jan 2009.
LMAO, sometimes things are far more rewarding if you've had to shout at people to get them. :) I was very diplomatic at all times, but you know what they say: the squeaky wheel gets the oil! I really wanted the 285s just because they will run cooler, and two in SLI will probably be better for me. I game at an uber-geek 5040x1050 resolution across three widescreen LCDs using a TripleHead2Go; my computer just recognizes it as one really big 65" widescreen LCD. Who knows, when the prices on the 285s drop with the 212 introduction in a few months, I might have justification to try tri-SLI!
Not B2 or B3?
Power consumption on these new cards is getting crazy. Arguably we don't need this much power; better coding, good engine scaling and cleverer architectures would be a much more welcome change for me.
And I still say they look like VHS tapes.
I still find the dual-PCB design inferior to the 4870 X2's single-PCB design, and then there's the price tag of course, so I would call this more of a tie than a retaking of the title.
Your fanboyism was pretty obvious in your first and second sentences.
Look at the benchmark results for every game, especially Crysis.
Look at the 2560 resolution results. Notice how the 4870 X2 rigs are faster? That's because the 448-bit buses are NOT fast enough and do NOT keep up with the 256-bit GDDR5 combo.
Nice job ignoring the test results, though.
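Back-of-the-envelope, using the commonly quoted memory specs (so treat these as approximate): per-GPU bandwidth = (bus width / 8) x effective memory clock. GTX 295: 448/8 x ~2,000MHz effective GDDR3 ≈ 112GB/s per GPU. 4870 X2: 256/8 x ~3,600MHz effective GDDR5 ≈ 115GB/s per GPU. So the narrower GDDR5 bus actually ends up with slightly more bandwidth per GPU.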
Disregard that reply; I totally misread.
"Far Cry 2 is a mixed bag for the GeForce GTX 295, demonstrating performance which is, for the most part, identical to an SLI GTX 260-216 configuration. While performance without anti aliasing at 1,680 x 1,050 is somewhat disappointing and clearly slower than a Radeon HD 4870 X2, turning on anti aliasing sees the GTX 295 overtake the 4870 X2 until we reach 2,560 x 1,600, when the 4870 X2 has the advantage at 0xAA, the GTX 295 the advantage at 4xAA and both perform very similarly at 2xAA.
While this might look like a hit for the GTX 295 though, it's nothing of the sort - who would want to play modern games on such expensive hardware and leave anti-aliasing turned off?"
Bit arse-about-face, isn't it? Or is this a job application for The Inquirer?
I have just seen the prices of these on OCUK and was going to comment on how overpriced they are, right up until I checked the prices of the HD 4870 X2. The card I paid £360 for a month ago (when cable, the USD/GBP spot rate, was in the same ballpark as it is now at ~1.50, so that can't be used as an excuse) is now £410. That is pure markup!
So at this price range (£430-ish for the GTX 295 and £410-ish for the HD 4870 X2) it does make the decision harder.
I spent quite a bit on an 8800GTS in March last year. Now I'm gutted. I knew things were going to change and get faster, but this is ridiculous!
Oh well, as long as I can play at 1680 then I'll be happy.
It's $450 now... schweet.
700W for just two GTX 295 cards is interesting. Three 9800 GX2 cards pull around 700W from the wall, so how much power would three of these things pull?
Folding results would be nice to include, as CustomPC do now.
Sigh, this sounds so much like a 9800 GX2. Anyone else wondering where a 4850 X2 would fit in once the drivers come out?
If the red bars indicate average FPS, what do the green and blue bars indicate?
Do the HD4870 X2 and GTX 295 really hold practical appeal? I wondered this about the 8800 GTX/GTX+/Ultra, too. Powerhouse cards are just Concorde, aren't they - achievements for the sake of technological curiosity and enthusiasm. Anyone who spends £800 on their graphics solution alone needs a night-time visit from angry ninjas with sporks. Honestly. You can buy a car for that!
Word is ATI will NEVER enable SidePort; it's a fantasy at this point. Check out aat and you'll see they have talked to their engineers. It's a big fat zero, a no-go.
Better to stick with proven technology like CUDA and PhysX - and PhysX is REALLY SUPER COOL - try the new free game Warmonger.
The power consumption numbers are very misleading. When reporting power consumption for a loaded video card you should do just that: load the video card, not the entire system. Starting a game pushes both the CPU and the GPU towards their maximums, so the end results vary with whatever CPU was chosen. The reported results are thus useless.
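To illustrate with made-up numbers: say a rig draws 180W idle and 420W in-game. The naive "GPU load" figure is 420W - 180W = 240W, but a decent chunk of that delta (easily 40-60W on a quad-core being hammered) is really the CPU ramping up, so the very same card would "measure" differently in a testbed with a faster or slower processor.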
The reason 700W is quoted as the minimum for one card is that there are so many junk power supplies out there, not because the card actually needs that kind of power.
Not very good reporting, not good at all.