
Hardware R680: AMD ATI Radeon HD 3870 X2

Discussion in 'Article Discussion' started by Tim S, 28 Jan 2008.

  1. Jipa

    Jipa Avoiding the "I guess.." since 2004

    Joined:
    5 Feb 2004
    Posts:
    6,367
    Likes Received:
    127
    Whoa, a bizarro card with two GPUs sorta... kinda... beats the ~year-old Nvidia card. Merry Christmas to ATI fanboys. Most likely, with proper drivers (are those an urban legend, btw?), it will prevail in an even larger number of benchmarks.
     
  2. menemsis

    menemsis What's a Dremel?

    Joined:
    30 Jan 2008
    Posts:
    4
    Likes Received:
    0
    Stupid & useless GPU.

    Waiting for the 9800 GX2.
     
  3. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,987
    Likes Received:
    706
    It must be boring benching each game manually, one by one, but thanks so much for providing reliable data.

    Seems like the gameplay results are the same as HardOCP's, where the more-than-a-year-old G80 core is still better. This dual-GPU-on-one-PCB card is good for benchmarking (as other reviews showed) but bad for gameplay, where it delivers a worse minimum FPS than even the GTX most of the time.

    I'm now not sure the 9800 GX2 is a good idea: just two G92s stuck together. It may have more shader processing power, but will the drivers keep up? Will nVidia's drivers offer better minimum FPS figures? (IMHO, minimum FPS is what we should be looking at, since it represents the stutter moments; anything better than 20 FPS is fine.)
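
    To illustrate the point (just a sketch with made-up frame times, nothing from the review's tooling): an average can look healthy while one long frame is a visible stutter, which only the minimum reveals.

    # Minimal sketch: why minimum FPS exposes stutter an average hides.
    # The frame times (in ms) are hypothetical.
    frame_times_ms = [16, 17, 16, 120, 16, 18, 15, 16]

    fps_per_frame = [1000.0 / t for t in frame_times_ms]
    avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
    min_fps = min(fps_per_frame)

    print(f"average FPS: {avg_fps:.1f}")  # ~34.2 - looks playable
    print(f"minimum FPS: {min_fps:.1f}")  # ~8.3 - the 120 ms frame is a visible stutter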
     
  4. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    No problem - it's more mindless than boring... but it's not as concentration-intensive as the best-playable testing used to be. When we changed back to the apples-to-apples tests, we didn't want to drop the real-world aspect (indeed, everything we present, even if it's automated, is representative of gameplay performance) - it was almost just changing the way the results were presented. :thumb:
     
  5. Bladestorm

    Bladestorm What's a Dremel?

    Joined:
    14 Dec 2005
    Posts:
    698
    Likes Received:
    0
    I'm certainly not seeing a good reason to abandon my plan of upgrading to an 8800 GT for my 1680x1050 gaming, and given this I suspect Nvidia won't be feeling too much pressure to rush out a new generation to beat it either (though some pressure is better than the none of a few months ago!).
     
  6. bustamove

    bustamove A Dremel is a wotsit.

    Joined:
    11 Jan 2008
    Posts:
    11
    Likes Received:
    0
    I thought you guys might be interested: I got one of these yesterday, slotted it in, ran 3DMark06 and got this...

    [image: 3DMark06 result]

    Not bad for stock settings with immature drivers, I'd say. From a first-person visual perspective, the images on the whole, and in game, are nice and crisp.

    QX6700 @ 3.2GHz
    HIS 3870 X2, all stock
    2GB Crucial Ballistix 6400
    Asus Commando

    My last card was a BFG 8800 GTX OC2 version. I have to say I already prefer the 3870 X2, though saying that, my experience with the GTX was not a happy one.
     
    Last edited: 31 Jan 2008
  7. Amon

    Amon inch-perfect

    Joined:
    1 Jun 2007
    Posts:
    2,467
    Likes Received:
    2
    ^ Good to hear a second opinion on the reviewed product from a customer. Thanks.
     
  8. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,987
    Likes Received:
    706
    May I ask why?

    Run more games - I'm interested in whether it has compatibility issues with games. After all, it runs on the principle of CrossFire (i.e. software managing the GPUs' power, unlike a single 8800 Ultra's pure power).
     
  9. bustamove

    bustamove A Dremel is a wotsit.

    Joined:
    11 Jan 2008
    Posts:
    11
    Likes Received:
    0
    Hi, if you're asking why: my experience with a GTX was not a happy one. The card was RMA'd six times in five months. This is just my experience - I know many people who have had great experiences with GTXs.
    My theory is that the retailer stocking my particular GTX got a faulty batch [chipset, maybe]. The problem was that if I left my PC on 24/7 the card was great, but if I powered off overnight
    the card would not POST at all from a cold boot. Like I said, this happened six times in total, and at the end of it all the GTX cost me close to £400.
    Fortunately for me, the retailer refunded me, even though on the last one I was outside their refund period.

    So I was a bit nervous about going Nvidia again. [Btw, the issues described above were not related to my PSU or any other factor - the cards were dead.]

    Back on topic with this 3870: I've played Crysis and CoD4. Crysis I played on default High settings, and I ran the Crysis benchmark tool. In game I didn't experience any of the glitching reported in many reviews, though in saying that, I didn't play it for very long.

    Here are some results from the benchtest with the hotfixed 8.1 Catalyst drivers...

    Completed All Tests

    <><><><><><><><><><><><><>>--SUMMARY--<<><><><><><><><><><><><><>

    1/30/2008 5:13:45 PM - XP

    Run #1- DX9 800x600 AA=No AA, 32 bit test, Quality: High ~~ Overall Average FPS: 57.08
    Run #2- DX9 1024x768 AA=No AA, 32 bit test, Quality: High ~~ Overall Average FPS: 53.345
    Run #3- DX9 1024x768 AA=2x, 32 bit test, Quality: High ~~ Overall Average FPS: 37.56

    The resolutions are low because I'm running on a 17" monitor; I'll hook it up to a 24" later and get a better idea.

    I'm not certain, but I think those averages are not bad considering the drivers are basically beta drivers.
    I'm running the 8.1 drivers with a hotfix add-on.
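
    If anyone wants to tabulate those summary lines rather than eyeball them, here's a quick sketch (the regex just matches the "Run #" lines exactly as the tool printed them above; nothing official):

    # Pull resolution, AA setting and average FPS out of the benchmark
    # tool's summary text quoted above.
    import re

    summary = """
    Run #1- DX9 800x600 AA=No AA, 32 bit test, Quality: High ~~ Overall Average FPS: 57.08
    Run #2- DX9 1024x768 AA=No AA, 32 bit test, Quality: High ~~ Overall Average FPS: 53.345
    Run #3- DX9 1024x768 AA=2x, 32 bit test, Quality: High ~~ Overall Average FPS: 37.56
    """

    for res, aa, fps in re.findall(
            r"DX9 (\S+) AA=([^,]+),.*?Overall Average FPS: ([\d.]+)", summary):
        print(f"{res:>9}  AA={aa:<5}  {float(fps):6.2f} FPS")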


    I haven't played many other games yet because there is an issue with my ROM drive. I'll swap the drive over and get some results up later.

    Here's a screen from Crysis...

    [image: Crysis screenshot]

    I'm looking forward to CrossFire being enabled with upcoming drivers, which will technically be quad-fire...

    Need another card, though... :sigh:
     
    Last edited: 31 Jan 2008
  10. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    I don't think I made it clear enough in the review: the 3870 X2 is by no means a bad card - it's good to see ATI back. Its long-term success is going to hinge on driver support, which is why I cannot widely recommend the 3870 X2 over a single-GPU card.

    bustamove: sorry to hear about your bad experience with the GTXs - it could have been any number of factors triggering the cards to die. I've had similar undiagnosable situations, and that's when it sucks the most, because you know you're not getting what you should be getting. :(
     
  11. bustamove

    bustamove A Dremel is a wotsit.

    Joined:
    11 Jan 2008
    Posts:
    11
    Likes Received:
    0
    Thanks Tim. The retailer confirmed that each card [GTX] was in fact dead and replaced each and every one with a new card; none of them could be repaired, and they were returned to BFG.
    I should have said before that it was a BFG OC2 version GTX. I do believe there have been problems with that particular model [could be the factory overclock, maybe].

    When I put my old X1950 XT in the GTX's place, I got POST immediately.

    As a comparison, and as you rightly say Tim [for a single card], the GTX was getting 14,750 in 3DMark06 with the same setup as above. The card was at stock settings, 626/2000MHz, which actually closely matches Ultra performance.

    For what it's worth, and I have no way of proving it, the image quality in general seems more defined and crisp with this 3870 X2 than with the GTX I had.
    The only reason I draw a comparison to the GTX is that I very recently had one in the system I'm using the 3870 X2 in.
     
  12. wyx087

    wyx087 Homeworld 3 is happening!!

    Joined:
    15 Aug 2007
    Posts:
    11,987
    Likes Received:
    706
    That is precisely why I am against multi-GPU cards/setups.

    Those 3DMark06 scores mean little versus gameplay; as long as one is happy with their gameplay experience, there's no reason to compare scores. And this is why I asked about other games: looking at bit-tech's Crysis minimum FPS, I wanted to know whether you get a good gameplay experience. Some games may experience glitches because the driver isn't well optimised (which comes back to the disadvantage of multi-GPU).
     
  13. Tyinsar

    Tyinsar 6 screens 1 card since Nov 17 2007

    Joined:
    26 Jul 2006
    Posts:
    2,287
    Likes Received:
    28
    @bustamove: It's great to hear from another person (in addition to the staff) with firsthand experience. :thumb:

    Are you running XP, and if so, could you tell me if span mode is available on this card? (I think ATI calls it "stretch" - basically the system sees two monitors, at the same resolution, as one larger monitor.)

    @Tim Smalley / bit-tech staff: Can you tell me what happens when you connect two monitors? I know that the 7950 GX2 ran in either SLI mode or single-GPU mode: if I wanted two monitors, it shut down the second GPU. :confused: What does the 3870 X2 do in the same situation?
     
  14. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    I'm sure I mentioned it in the text (sorry if I didn't - I must've cut it out in a final edit)... but multi-monitor works fine in both 2D and 3D. It's much more transparent than the 7950 GX2 in that respect.
     
  15. bustamove

    bustamove A Dremel is a wotsit.

    Joined:
    11 Jan 2008
    Posts:
    11
    Likes Received:
    0
    Played CoD4 today and the image quality is truly fantastic! I only played in single-player mode: no glitching or sketchy frames at all,
    very smooth gameplay.
    Is anyone here aware of a benchtest for CoD4? Is there a command or something?

    I really am impressed with this card so far. @Tyinsar, I'm going to hook it up to my 24" monitor in the morning; my rig is watercooled and it's a bit of a pain moving the PC around.
    Not sure about the stretch thing, but I'll check it out.

    I'm really happy I bought this card now!

    Can I quote something someone said on another forum? I would be interested to see what you guys say about it.

    Quote: "Both GPUs have 512MB each. It has 1GB total, but seeing as it's technically CF on a card, it has to store the same info in both GPUs' memory banks, resulting in it functioning like it was only 512MB."

    Also: "A LOT less latency than regular CrossFire.

    "You see, regular CrossFire has to do the majority of its communication through the chipset. Meanwhile, this card has its own splitter and does its communicating all on-card. As such, there's less latency across the board.

    "ALL current multi-GPU setups have to duplicate data for each GPU in RAM. They have to, because both GPUs need the same textures and shader information as they work together to render the scene. Essentially they all run as if they had the same amount of RAM as the lowest card does. That's just how things have to work currently, and why I believe the R700 will have to have an external memory controller for the MCM package.

    "To sum it up, this card is like a lower-latency CF setup." End quote.

    Well, I kind of see what the guy is getting at, but it doesn't seem logical - i.e. how, at stock settings, does it perform so similarly to a single-card CF setup?

    Any thoughts on this?
     
  16. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    CrossFire/SLI copies all frame data to both GPUs, and the scheduler chooses which frames each GPU renders. So yes, the card has 2x 512MB onboard memory partitions, but the data is the same in both partitions, so it's technically just a 512MB GPU.

    I don't think there's really a latency issue with standard CrossFire - if there was, we'd definitely know about it (you'd notice it when you were playing games). Of course, the fact that the data doesn't have to go via the CrossFire interconnect/chipset is a bonus... but since the GPUs are rendering three frames ahead anyway, it's not going to make any difference IMO.
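
    If it helps, here's the duplication point as a trivial sketch (illustrating the principle only, not any real driver behaviour):

    # In CrossFire/SLI alternate frame rendering, every GPU holds a full
    # copy of the textures and shader data, so the usable pool is the
    # smallest single partition, not the sum printed on the box.
    gpu_memory_pools_mb = [512, 512]  # the X2's two onboard partitions

    marketed_total_mb = sum(gpu_memory_pools_mb)  # 1024 MB - the "1GB" on the box
    effective_pool_mb = min(gpu_memory_pools_mb)  # 512 MB actually usable

    print(f"marketed: {marketed_total_mb} MB, effective: {effective_pool_mb} MB")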
     
  17. bustamove

    bustamove A Dremel is a wotsit.

    Joined:
    11 Jan 2008
    Posts:
    11
    Likes Received:
    0
    I think I follow that, Tim, but I don't understand why ATI would make a 1GB card if it only uses 512MB?

    Are you saying that single-card CrossFire only uses 512MB, the same as the X2 version?
    You can probably tell that I'm not very technically literate, but I don't understand why it performs on a par with, or better than, two single cards in CF mode in most tests I've seen.
    That's the part I'm getting lost at.
    Prior to buying the card, I did consider buying two singles, but the possibility of having two X2s in CrossFire in the future clinched the choice I made.
    Thanks for your reply to the question.
     
  18. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    It's only a 512MB GPU because it's two GPUs and each GPU has its own pool of memory. Everyone still refers to it as a 1GB or 2x 512MB GPU, though.

    When the CPU sends commands (from the game) to the graphics card, they are sent to both cards in a CrossFire/SLI configuration - the data is stored in both memory pools, as one GPU can't access the other's memory pool directly (there's no physical connection).

    The reason the X2 might perform slightly better in some situations is the higher core clock: there is a 50MHz increase on the core (825MHz vs 775MHz), but the memory loses 450MHz (effective, 1800MHz vs 2250MHz)... In memory-bandwidth-limited scenarios, the memory clock is the reason why the 3870 X2 can be slower than a 3870 CrossFire config.
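
    To put rough numbers on that memory-clock difference (a back-of-the-envelope sketch; the 256-bit bus width is my assumption here, not something from this thread):

    # Peak memory bandwidth = effective clock (transfers/s) x bus width in bytes.
    bus_width_bits = 256  # assumed for both the 3870 and the X2

    def bandwidth_gb_s(effective_clock_mhz):
        return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

    print(f"3870 X2 @ 1800MHz: {bandwidth_gb_s(1800):.1f} GB/s")  # ~57.6
    print(f"3870    @ 2250MHz: {bandwidth_gb_s(2250):.1f} GB/s")  # ~72.0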

    Hope this helps to explain it a bit more. :thumb:
     
  19. Tyinsar

    Tyinsar 6 screens 1 card since Nov 17 2007

    Joined:
    26 Jul 2006
    Posts:
    2,287
    Likes Received:
    28
    Perhaps I missed it - but, since I still can't see it, my question remains.

    In 3D mode, does the system:
    1) see it as a single GPU,
    2) while using the power of both,
    3) AND powering two monitors?

    If yes, then it's a huge step up from the 7950 GX2. That would also mean span/stretch mode is possible - in XP at least. (In my perfect imaginary world I could do this on 64-bit Vista, but it looks like that isn't, and won't be, possible.)

    Thank you very much. I look forward to hearing your results.
     
  20. bustamove

    bustamove A Dremel is a wotsit.

    Joined:
    11 Jan 2008
    Posts:
    11
    Likes Received:
    0
    Thanks again, Tim - that explanation was clear enough for my limited technical brain.

    Tyinsar, I will try what you suggest later today and see what gives on that. :thumb:
     