
Hardware AMD Radeon HD 6990 4GB Review

Discussion in 'Article Discussion' started by arcticstoat, 8 Mar 2011.

  1. enciem

    enciem Minimodder

    Joined:
    23 Sep 2009
    Posts:
    144
    Likes Received:
    3
    Because it shows how well the card scales against its single-GPU brethren and the fastest cards the competition have to offer. It also highlights that the performance only really comes into play if you're running high resolutions or multiple monitors.
     
  2. murraynt

    murraynt Modder

    Joined:
    6 Jun 2009
    Posts:
    4,234
    Likes Received:
    128
    I'm sorry Lizard, but I have to disagree.
    Metro 2033 was one of the best games I played last year.

    You say it's dull, and yet Bit-Tech tests Black Ops.
    The CoD series has been dull since CoD 4 and has brought nothing new to the table at all.
    At least the developers of Metro got off their lazy arses to actually think of something that wasn't another run-of-the-mill game.
    I certainly think it has paid off.
    But I'm not here to argue about what our definition of a good game is.

    Who buys a 6990/GTX 580 to play Black Ops, a DX9 game that could be played on a sub-£100 5770/GTX 280?
    The answer: not too many people.
    You aren't seeing many of the DX11 features that this card was built with in mind, like tessellation, for example.

    People who buy these cards don't want to play games that are dumbed down for consoles with DX9; they want to see ARMA II with huge draw distances, Metro 2033, Crysis Warhead, DiRT 2, Battlefield: Bad Company, STALKER.

    Show us what these cards really shine at.
     
  3. frontline

    frontline Punish Your Machine

    Joined:
    24 Jun 2007
    Posts:
    825
    Likes Received:
    12
    Interesting review; it seems to scale a lot better than the 5970 did at launch, although with insane power draw under load. At least the idle power consumption figures are pretty impressive compared to the single-GPU high-end cards.

    I'm not sure which I'd go for between this and a CrossFire/SLI setup. Yes, you would probably get a cheaper setup with two cards; however, two cards in a confined space can cause their own problems (I tried running two 4870s with a tiny gap between them a while back; the second card was getting all the heat from the first dumped on it, and there was insufficient airflow to cool it).

    Great option for Eyefinity setups, although I guess the only point of picking one up for lower resolutions is to gain improved minimum framerates with all the eye candy turned on.

    I would have also liked to see some Metro 2033 figures, although I do get the impression that it is a poorly optimised mess at present.

    It will be interesting to see if Nvidia can get similar or better performance with two GPUs without pushing the power/thermals even higher.
     
  4. PingCrosby

    PingCrosby What's a Dremel?

    Joined:
    16 Jan 2010
    Posts:
    392
    Likes Received:
    7
    My pants have turned blue.
     
  5. Bakes

    Bakes What's a Dremel?

    Joined:
    4 Jun 2010
    Posts:
    886
    Likes Received:
    17
    Here's one for you: how many of each card do you think Bit-Tech gets? Especially with the top-end models, there are often very few review samples available, especially in countries other than the USA. With Fermi, for example, Bit-Tech got one, and they broke it (lol). So no possible SLI testing there. Sure, with other cards there are plenty lying around the labs, but if you can't give your flagship reviews SLI testing, why give the others the pleasure?

    The point is that Bit-Tech reviews stuff - i.e. suggests stuff to buy. What the game benchmarks effectively say is that graphics card X can play a given game. It's pretty universally acknowledged that Metro 2033 is a fairly bad game - it might stress the computer to hell and back, but for what it's worth, no one plays it. If you're looking for a number, great - but then you're doing no better than 3DMark. Hardware enthusiasts, not hardware oglers.

    To use your car analogy, relying on Crysis and Metro 2033 is like relying on car performance figures up a 1:1 incline. Sure, you'll get a result at the end of it, but it won't be representative of the majority of future games.

    Reviewers started using Crysis because:
    a) It stresses the system like hell.
    b) It was the first widely used DX10 game, and was seen as the state of things to come.

    Three years later, very few games are as demanding as Crysis. It's clear that the status quo is moving forward - but apart from the exception of Crysis (which has effectively proven to game developers that it is NOT wise to make games that are graphics-card-bottlenecked on all but the most expensive systems), there have been no games that have required a graphics card upgrade. Sure - there might be one around the corner - but it's pretty unlikely.
     
  6. Raven Yun

    Raven Yun What's a Dremel?

    Joined:
    8 Mar 2011
    Posts:
    1
    Likes Received:
    0
    Cue the CS:S benchmarks for the 590 then, by that kind of thinking.
     
  7. frontline

    frontline Punish Your Machine

    Joined:
    24 Jun 2007
    Posts:
    825
    Likes Received:
    12
    Hurry up Battlefield 3, then everyone will forget about Crysis :\
     
  8. kaz1989

    kaz1989 What's a Dremel?

    Joined:
    3 Mar 2011
    Posts:
    20
    Likes Received:
    0
    For £500 with that power consumption? I don't think so, mate. Also, why are ATI cards so ugly? Is there little scope for changing the layout of the board?
     
  9. play_boy_2000

    play_boy_2000 It was funny when I was 12

    Joined:
    25 Mar 2004
    Posts:
    1,528
    Likes Received:
    82
    Unless you're running a 3+ screen Eyefinity setup, this card is pretty much pointless, though I think the same of most SLI/CrossFire setups.

    The GTX 580 draws around 60W more than the 6970, so I fail to see how the rumoured 590 has any hope of coming in under 500W unless they downclock the crap out of it.
     
    Last edited: 8 Mar 2011
  10. Farfalho

    Farfalho Minimodder

    Joined:
    27 Nov 2009
    Posts:
    427
    Likes Received:
    2
    Since I don't use a multiple-monitor setup (although I'm willing to try), this card is totally out of reach for me. Buying an HD 6950 2GB and flashing the BIOS to the HD 6970 2GB is better value for money to me.

    I'm proud that ATi/AMD managed to make such a monster, but I'm truly disappointed about the Antilles switch voiding the warranty. Come on, it's like saying your car has 150BHP, but if you flip this switch you'll get 300BHP on the fly - yet you'll void your five-year warranty. It's just plain stupid.

    I know what everybody is thinking: if it's essentially built from two underclocked 6970s, could we get the same performance from two standard 6970s and squeeze them in? If anyone has the pocket to do it, please do, because I'm also curious to see how much ATi changed the card's layout to cope with the amount of power on a single PCB.

    I'm an ATi supporter, and in the single-GPU market I reckon Nvidia is better but pricier. I hope that when I have to change my "old" 4870 1GB OC, the ATi range will have more value. It's kind of broken my relationship with Nvidia.
     
  11. Penfolduk01

    Penfolduk01 What's a Dremel?

    Joined:
    21 Jan 2011
    Posts:
    19
    Likes Received:
    0
    To be fair to AMD, there are several high-performance cars out there that have such facilities in one form or another. And their manufacturers tell you "don't push the red button" as well.

    Given the quite ludicrous power-draw of this card even on standard settings, only someone who really, really knows what they are doing is likely to be able to build a PC that can use the Antilles switch without the whole lot just crashing and burning. And half the things they will have to do to do that will probably void any warranties anyway. If not the card's, then that of the motherboard or PSU.
     
  12. Ficky Pucker

    Ficky Pucker I

    Joined:
    9 Jul 2009
    Posts:
    1,599
    Likes Received:
    113
    Wow, so the card that is meant to be used with Eyefinity setups (see the number of connectors on the card and the amount of RAM, lol) doesn't get tested in a multiple-screen setup.

    Is this a joke?

    Other than that, "great" review.
     
  13. Toploaded

    Toploaded What's a Dremel?

    Joined:
    28 Mar 2010
    Posts:
    371
    Likes Received:
    6
    I second this. I waited for a sale to pick it up, but if I'd paid full price I would still have been happy with the purchase. I only know three people who have it (but I don't have that many peeps on my Steam account; I tend to keep it for long-time friends), and all three played it through to the end and enjoyed the game. I do love this site, but I do question their assumptions when it comes to gamers on both console and PC lately (at least, the ones likely to frequent this site).

    They say that games like CoD and Battlefield get the most clicks, but if they have never added Metro 2033, how can they assume people would be disinterested? (Unless they did use it for some benchmarks and I missed that, in which case fair enough.)
     
  14. will_123

    will_123 Small childs brain in a big body

    Joined:
    2 Feb 2011
    Posts:
    1,060
    Likes Received:
    15
    Sorry Andrew, but we don't test with Metro 2033 because it's a dull game that very few people play, so it has very little relevance to the gaming market.

    It's not a dull game; it's one of the most underrated games out there. It also gives the card a good workout and therefore should be in there, just like ARMA 2, which beasts the card as well.
     
  15. memeroot

    memeroot aged and experianced

    Joined:
    31 Oct 2009
    Posts:
    1,215
    Likes Received:
    19
    I still think comparing this card only to single-card solutions is silly.

    Two GTX 570s or two 6970s would be cheaper, quieter, etc., so those are what it should be compared to (my SLI issues aside).

    And yes, it should not be put up against Black Ops - my old 9800 GTX was fine with that in 3D....
     
  16. Waynio

    Waynio Relaxing

    Joined:
    20 Aug 2009
    Posts:
    5,712
    Likes Received:
    211
    Well, at least I know my new case will take this crazy-long GPU, but I think I'll stick with my single GPU and have no stuttering or dual-GPU problems. After the crappy time I had with the 4870X2, I am not interested in buying another dual-GPU card.

    AMD need to release a single GPU that really outperforms the GTX 580 to get me interested. I'm not going to do SLI or CrossFire again either; actually, I've only ever done CrossFire twice, and both times I was disappointed. Just one super-duper single GPU will do me fine, and the 580 is all I was hoping for to last a good while. It might not be as good a keeper as the 8800 GTX was - or then again it might, since no new consoles are coming any time soon. I'm hoping that because of the stretched-out delay of new consoles, more developers will look at the PC as a first base for awesome games, but that's probably wishful thinking :hehe:.
     
  17. leveller

    leveller Yeti Sports 2 - 2011 Champion!

    Joined:
    1 Dec 2009
    Posts:
    1,107
    Likes Received:
    24
    Never had any issues with mine. It's the best GFX card I've had so far. Sooooooo, either you had a duff card or something else was causing your issues?
     
  18. Paradigm Shifter

    Paradigm Shifter de nihilo nihil fit

    Joined:
    10 May 2006
    Posts:
    2,296
    Likes Received:
    81
    I'm afraid that 2560x1440 isn't actually a high enough resolution to display the issue properly. You're talking about just shy of 3.7 million pixels. I tested at 7.3 million. Doubling the pixel count has a massive impact on the effect.

    The most VRAM usage I saw out of GTA4 was 1495MB at 6064x1200 4xAA, so that would involve minimal swapping to system RAM with the 1536MB of VRAM that the GTX 580 has. So in that scenario, the GTX 580 wouldn't have run out of VRAM to the same degree as a GTX 560 or GTX 570. Quite frankly, 40fps is still quite healthy; I'd be happy with that in a Surround game. You did not push settings high enough (you aren't asking it to render 7.3 million pixels) to cause framerates to drop enough. Bear in mind that for Surround you've got two GPUs to do the work, which lessens (a little) the effect of loss of GPU power, but magnifies the effect of lack of VRAM in proportion. :)

    Crysis is, and always will be, a bit of a law unto itself. With 2GB GTX460s, I was seeing 35fps (avg) in Crysis @ 1920x1200 with no AA, and 1400MB of VRAM usage. At 5760x1200 it was hitting 1900MB, and 6064x1200 (bezel corrected) it was actually butting up against the VRAM limit imposed by 2GB cards. Crysis on Very High is seriously VRAM hungry in Surround. At 6064x1200, I was usually seeing around ~5fps in Crysis at Very High with no AA. 5760x1200 was a little better - 15fps. Now, the 5760 15fps was lack of GPU power; the ~5fps was a combination of lack of GPU power and running out of VRAM.

    DiRT2 is an interesting one. On 1GB cards, 6064x1200 4xAA results in a 1fps slideshow. On the 2GB cards, the same settings manage 46fps (avg) 37fps (min) and a VRAM usage total of over 1800MB.

    The problem is, every game reacts differently. They're all different engines. Some run out of VRAM and choke and die. Others take a ~50% fps hit. A couple of games I couldn't get to care about VRAM limitations; they ran great regardless (Devil May Cry 4, Half-Life 2). But to get the issue to exhibit, you need to be running extraordinarily high resolutions. You need to be looking at multi-screen - single screens just don't do it. :)

    Agreed; however, tripling the resolution doesn't result in triple the VRAM usage. But it is game-dependent. Some are efficient... others... less so.

    That's a brilliant example; thanks. :D
     
  19. Claave

    Claave You Rebel scum

    Joined:
    29 Nov 2008
    Posts:
    691
    Likes Received:
    12
    I'd like to point out that our low opinion of Metro 2033 isn't an assumption: we gave it 6/10 when we reviewed it, and none of us have wanted to look at it since the review was posted:
    http://www.bit-tech.net/gaming/pc/2010/03/26/metro-2033-review/3
     
  20. Farfalho

    Farfalho Minimodder

    Joined:
    27 Nov 2009
    Posts:
    427
    Likes Received:
    2
    I do know about that, and the Bugatti Veyron has one that does such a thing, but the catch there is "just don't use it because you can crash badly", not "it voids the warranty on the whole car".

    For that demographic, warranty is something that really doesn't matter, but think about someone who likes to have the top of the range but doesn't fiddle with it (not wanting to, or not knowing how) - that would be a big no-no. I had a friend who bought a 5970 and doesn't know a thing about overclocking; his PC is at default settings. He later exchanged the card because of the scaling problems.

    I just want to add: if board partners start making bundles with the 6990, please include a personal power plant with it, OK? Ta!
     