
Hardware Performance Update - ATI CrossFire

Discussion in 'Article Discussion' started by Sifter3000, 24 Nov 2009.

  1. Panos

    Panos Minimodder

    Joined:
    18 Oct 2006
    Posts:
    288
    Likes Received:
    6
Maybe you should compare the items and their costs too. I mean, you can cheaply build tri-SLI with 5770s and it's still worth as much as a 5870 or GTX 295! What are the speeds then? Maybe on an overclocked Q6600 at 3.2GHz?
     
  2. Matticus

    Matticus ...

    Joined:
    23 Feb 2008
    Posts:
    3,347
    Likes Received:
    117
Because people do game at these resolutions. And unfortunately not everyone realises that to make the most out of an amazing card or pair of cards you need to pump up the resolution. So really they are doing the less educated among us a service by letting them know that if you do game at 640x480, then SLI or CrossFire should be the lowest down on your upgrade list.

    Also it serves as a good comparison against the higher resolutions.
     
  3. Phil Rhodes

    Phil Rhodes Hypernobber

    Joined:
    27 Jul 2006
    Posts:
    1,415
    Likes Received:
    10
    I don't understand the issue here.

Hoary old gamers will remember that SLI originally stood for scanline interleave, which meant that each board was rendering alternate lines. Now, this does mean that you have to maintain the entire geometry for the scene on both boards and you can't do any sort of frustum-based optimisation, but then I don't see how you could do that with the sort of variable-split approaches that are currently being used. Two adjacent rows of pixels are clearly likely to have very similar complexity.

    I presume there is some implementation problem with doing this of which I'm not aware, because otherwise it makes perfect sense.
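The scanline-interleave scheme Phil describes can be sketched in a few lines. This is purely an illustrative toy (the names `render_row`, `render_on_gpu` and `interleave` are mine, not anything from a real driver), but it shows both the split and the catch he mentions: each board only produces half the rows, yet both need the full scene to do it.

```python
# Toy sketch of scanline interleave: each "GPU" renders every other row,
# and the final frame interleaves the two halves. Not real driver logic.

WIDTH, HEIGHT = 8, 6

def render_row(y):
    # Stand-in for a real rasteriser: produce one row of pixel values.
    return [(x, y) for x in range(WIDTH)]

def render_on_gpu(gpu_id, num_gpus=2):
    # Each board renders only the rows congruent to its id modulo the GPU
    # count - but note each board still needs the *whole* scene geometry,
    # which is the overhead Phil points out.
    return {y: render_row(y) for y in range(gpu_id, HEIGHT, num_gpus)}

def interleave(parts):
    # Merge the per-GPU row dictionaries back into one frame, in order.
    frame = [None] * HEIGHT
    for rows in parts:
        for y, row in rows.items():
            frame[y] = row
    return frame

frame = interleave([render_on_gpu(0), render_on_gpu(1)])
```

Because adjacent rows rarely differ much in complexity, this split load-balances almost for free, which is exactly why the question of why it fell out of use is a fair one.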
     
  4. Unknownsock

    Unknownsock What's a Dremel?

    Joined:
    13 Jul 2009
    Posts:
    444
    Likes Received:
    1
The new 5870s shine at higher res, so it seems a bit dull that you don't review this area. I know most people won't use this type of res, but it does still show how well the cards will perform.
Do benchmarks at 5040x1050 and 5760x1200 and then you will see how well these cards perform.
     
  5. javaman

    javaman May irritate Eyes

    Joined:
    10 May 2009
    Posts:
    3,673
    Likes Received:
    104
+1 for 5040x1050 and 5760x1200. ATI are pushing insane resolutions; it would be nice to see if a single card can play these and whether their CrossFire setup will provide any benefit. I know that the only reason I'll upgrade from my HD 4870 is to play at these resolutions. I'm already building towards it by buying a second monitor.
     
  6. Kúsař

    Kúsař regular bit-tech reader

    Joined:
    23 Apr 2008
    Posts:
    317
    Likes Received:
    4
Anything above 50% is a pretty impressive improvement (even though it doesn't sound like much). Reflections, all sorts of shaders, dynamic lights (shadows) etc. - there'll always be a certain amount of processed data required by both GPUs to render "their" part of the screen. It's definitely not as easy as splitting the screen into halves or lines. That's why I think a graphics API designed with multi-GPU rendering in mind might reduce unnecessary re-calculations or even dynamically divide screen regions between the GPUs.

I doubt we will see a 100% improvement like in the case of the old 3dfx cards. The Quake age is the past, just like static lights and flat surfaces...
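The "dynamically divide screen regions" idea mentioned above can be illustrated with a trivial feedback loop: move the split line each frame so both GPUs took roughly equal time on the previous one. This is my own simplified sketch (a fixed proportional step, not AMD's or NVIDIA's actual load-balancing heuristic):

```python
# Toy dynamic split-frame rendering: nudge the horizontal split towards
# whichever half finished faster last frame. Assumption: a fixed 5% step,
# purely for illustration.

def rebalance(split, t_top, t_bottom, height, step=0.05):
    # If the top half took longer, shrink it; if the bottom did, grow it.
    if t_top > t_bottom:
        split -= step * height
    elif t_bottom > t_top:
        split += step * height
    # Keep at least one row on each GPU.
    return max(1, min(height - 1, round(split)))

# Start with an even split of a 1080-row frame; the top half was slower.
split = rebalance(540, t_top=9.0, t_bottom=7.0, height=1080)
```

Real drivers have to make this decision with stale timing data and without knowing next frame's content, which is part of why scaling is rarely the ideal 100%.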
     
  7. Baz

    Baz I work for Corsair

    Joined:
    13 Jan 2005
    Posts:
    1,810
    Likes Received:
    92
We only do 1280x1024 in STALKER as we've never hit the CPU cap before - obviously a pair of HD 5870s saw to that soon enough.

1680x1050 is by far the most popular widescreen desktop resolution, hence we include it. Remember that in games like STALKER and Crysis, performance is still capped by the GPU for the most part, so it's still relevant.
     
  8. hrtz_Junkie

    hrtz_Junkie Controversial by Nature

    Joined:
    25 Jul 2009
    Posts:
    30
    Likes Received:
    0
Multi-GPU takes another bashing!!

I have to say that although I do agree that SLI/CrossFire still needs improving, there does seem to be a lot of unnecessary negativity.

I use SLI (dual GTX 280s) and I have to say that this setup allows me to game at 1920x1200 with all settings on max (except for AA) in any game I have tried so far...

including Crysis, Warhead, Wars, STALKER Clear Sky, DoW 2, Dragon Age, Mass Effect, CoD MW and WaW, X3, MS Flight Simulator X: Acceleration, Sacred 2, etc.

Now, I know SLI doesn't represent value for money, but let's face it, neither does spending £300+ on a high-end GPU, or £1000 for a multiplier-unlocked CPU.

These are "high-end products" for people who want the best and are not so focused on cost.

There are also other benefits to SLI (32x CSAA), which makes any jaggies disappear completely in older, less demanding games!

I for one love my SLI setup and will probably continue until financial problems (such as children, for example) force me into the mid range!!
     
  9. rollo

    rollo Modder

    Joined:
    16 May 2008
    Posts:
    7,844
    Likes Received:
    126
Got a 5850 recently to game at 1680x1050.

Was impressed by the performance.

SLI is for people with too much cash.
     
  10. javaman

    javaman May irritate Eyes

    Joined:
    10 May 2009
    Posts:
    3,673
    Likes Received:
    104
To be fair, they did say why they wouldn't recommend it at this point:

1. It ran a good 30°C hotter in the Antec 1200, which won an award for its excellent cooling.
2. There are CPU limitations.
3. There are driver issues with two major titles, and it's well known that new releases have driver issues, leaving you with one working card.

On top of that, they recommended HD 5770 CrossFire over a single HD 5870 due to availability, performance and cost.
The summary at the end was very balanced, weighing up both pros and cons, and you conclude that they're multi-GPU bashing?? They also noted that the GTX 285 scales very, very well and would be recommended if they were not so hard to come by.

After typing this out, I'm concluding that you didn't even read past the first negative comment or the full look into why the results didn't scale as well. You're nothing more than a fanboy for multi-GPU.
     
  11. eek

    eek CAMRA ***.

    Joined:
    23 Jan 2002
    Posts:
    1,600
    Likes Received:
    14
Indeed, the only reason I can think of is that maybe 3dfx (or another company) holds some sort of patent stopping others from using this approach? Or maybe the overheads of cutting it so fine don't work so well with the high resolutions of today's monitors? But then you could just make each card render a couple of lines in one go - as you say, the image complexity is unlikely to vary much between adjacent rows of pixels.

It'd be interesting to see a technical article on the approaches used by nVidia/AMD so we can see why they use the approach they do (whatever that may be).
     
  12. Phil Rhodes

    Phil Rhodes Hypernobber

    Joined:
    27 Jul 2006
    Posts:
    1,415
    Likes Received:
    10
To be fair, there are issues with doing very small or single-pixel slices with multi-core rendering (certain professional graphics and video apps of my close acquaintance suffer from this). Consider what happens when you want to blur or bloom (which is the same thing, really), or do anything that involves spatially diverse pixels: suddenly you need information from the other card to do the work, although I wonder if you could do those sorts of processes on the half-height images and then interlace them without terribly bad artifacts. I mean, bloom channels are often visibly aliased in any case.
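The point about spatial filters breaking naive slicing can be shown with a one-dimensional toy. In this sketch (my own illustration, not any driver's real behaviour), blurring a row needs its neighbours above and below, so the row at a slice boundary has to read data "owned" by the other GPU - the halo transfer that makes fine-grained splits expensive:

```python
# Toy 1D box blur over rows split between two "GPUs". Blurring row y needs
# rows y-1 and y+1, so boundary rows require data from the other card.

HEIGHT = 8
image = list(range(HEIGHT))  # one value per row, for simplicity

def blur_rows(rows, full_image):
    out = {}
    for y in rows:
        lo, hi = max(0, y - 1), min(HEIGHT - 1, y + 1)
        # Neighbouring rows may live on the other card - that fetch is
        # exactly the cross-GPU traffic that hurts tiny slices.
        out[y] = sum(full_image[lo:hi + 1]) / (hi - lo + 1)
    return out

top = blur_rows(range(0, HEIGHT // 2), image)          # "GPU 0" slice
bottom = blur_rows(range(HEIGHT // 2, HEIGHT), image)  # "GPU 1" slice
# Row 3, the last row of GPU 0's slice, needed row 4 from GPU 1's half.
```

With single-pixel interleaving, *every* row is a boundary row, which is one plausible reason the old scanline scheme doesn't suit shader-era post-processing.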
     
  13. xaser04

    xaser04 Ba Ba Ba BANANA!

    Joined:
    27 Jun 2008
    Posts:
    2,266
    Likes Received:
    224
    How much was your HD5850?

I bought my two GTX 260-216 EXO cards for £245 all in. Performance is similar to (if not greater than, thanks to the clocks) a GTX 295 (so roughly equal to an HD 5870, depending on the game), yet they cost significantly less.

Granted, your point stands if you look at certain setups, yet such a blanket statement is ridiculous.
     
  14. Henk

    Henk Uninformed Opinionist

    Joined:
    16 Jul 2006
    Posts:
    79
    Likes Received:
    0
I'm also all for this, as I've ordered a 5870 that should hopefully be arriving next week, along with a Dell 2209WA. If the Dell is to my liking, I'm ordering two more and an active DisplayPort adapter...
     
  15. isaac12345

    isaac12345 What's a Dremel?

    Joined:
    20 Jul 2008
    Posts:
    427
    Likes Received:
    3
I'm wondering, why are you guys still including CoD: WaW in your benchmarks, apart from checking whether the ATI guys have sorted out the game's multi-GPU issues? It's quite an old engine, and a newer engine like the one used in NFS: Shift would give a better idea of the kind of improvements a newer design/card brings. Also, wasn't Crysis quite unoptimised (not well coded) compared to Crysis: Warhead? If so, wouldn't Crysis: Warhead be a better or additional benchmark?
     
  16. MorpheusUK

    MorpheusUK a Noob that knows something

    Joined:
    24 Sep 2009
    Posts:
    111
    Likes Received:
    3
I like what I see, great report, but as a current single GTX 260 user it would have been nice to see some GTX 260 SLI action in there. I realise that it would be close to the GTX 295's performance.
     
  17. frontline

    frontline Punish Your Machine

    Joined:
    24 Jun 2007
    Posts:
    825
    Likes Received:
    12
Nice article; I think it sums up the pros and cons pretty well.

I did find that a bit strange, as CrossFire seemed to work pretty well on the 4800-series cards in this game. Plus, CoD 4 was notorious for actually offering a crazy amount of scaling with two cards in CrossFire compared to one.
     
  18. Elton

    Elton Officially a Whisky Nerd

    Joined:
    23 Jan 2009
    Posts:
    8,575
    Likes Received:
    189
I'd like to see some benchmarks with only 4x AF, 8x AF and maximum AF in STALKER: Clear Sky.
That would make my day.
     
  19. Jasio

    Jasio Made in Canada

    Joined:
    27 Jun 2008
    Posts:
    810
    Likes Received:
    13
    What? No QuadFire love? Where's the 5970 QuadFire benchmark for those who just want to get silly?
     
  20. bogie170

    bogie170 What's a Dremel?

    Joined:
    11 Aug 2008
    Posts:
    340
    Likes Received:
    5
Please, please, please, bit-tech:

Any chance you can moan to ATI about the poor minimum framerates in CoD: WaW?

I have posted on their website loads of times and never get a reply or any info.

Maybe bit-tech's standing will get them to look at this issue?
     