
Sapphire Radeon X1800XL

Discussion in 'Article Discussion' started by Tim S, 17 Oct 2005.

  1. skank

    skank New Member

    Joined:
    18 Oct 2005
    Posts:
    13
    Likes Received:
    0
    Don't know if anyone noticed this, but you reviewed on a dual core CPU. I think this is a little unfair, as the new NVIDIA drivers you used take advantage of dual core, whereas the ATi ones don't yet (they will in the near future). So in my opinion this isn't a fair comparison of the GPUs being tested.
     
  2. skank

    skank New Member

    Joined:
    18 Oct 2005
    Posts:
    13
    Likes Received:
    0
    Oh and hello everybody :D
     
  3. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    We actually ran a poll in the forums asking which CPU forum members would choose out of the FX-57 and X2 4800+ - the result came out in favour of the X2 4800+, which is why you see one being used. The whole point of these reviews is that they're realistic to the consumer.

    I don't think your point is valid at all, seeing as the test system was set up based on a reader survey that had nothing to do with video card drivers. Over 60% of the readers voted for the X2 4800+, and many of those who voted for the FX-57 said later in the thread that they 'got' what the poll was about and changed their minds.

    You can see that thread here: http://forums.bit-tech.net/showthread.php?t=99273

    Oh, and just because one IHV has support for something that the other doesn't, we shouldn't use it at all? So, when a game arrives that can do FP16 HDR and AA together, we can't test that and compare to NVIDIA hardware, because NVIDIA can't do FP16 HDR and AA at the same time in hardware?

    I'm sorry, but if you didn't get the message already, I completely disagree with your opinion and point of view.
     
  4. skank

    skank New Member

    Joined:
    18 Oct 2005
    Posts:
    13
    Likes Received:
    0
    OK, I am new here so I didn't see the poll - please accept my humblest apologies. However, you say 'realistic to the consumer', but how many people are gaming on dual core CPUs right now? I'd be willing to bet my house on there being far fewer on dual core than single, so 'realistic to the consumer' would suggest a single core processor. Also, as I said, ATi will be bringing out dual core support as well.

    I just think it is a little underhanded that you make no mention of the dual core driver support as a factor in the performance results - the results may well have been different on the FX-57. I'm not bashing, far be it from me as a newcomer on these boards, but I do think that people should know all the facts regarding performance. To use your own example, you would alert people to the presence or lack of HDR + AA support in a review. Also, how would you test the performance of HDR + AA on NVIDIA cards?
     
  5. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    The review is focused on 'highest playable' - it's not a pissing contest. The fact is that you can't *use* HDR and AA on an NVIDIA card, so you can't take that into account with highest playable, whereas you can on ATI's cards. In short, you can't test HDR + AA on NVIDIA cards, but you can on ATI cards.

    That is the beauty of the way we do things - you should be asking that question of the many sites out there who still compare average frame rates from a time demo. It fits easily into our review format already. ;)

    You could almost say the same about Splinter Cell: Chaos Theory before Radeon X1800 came out, couldn't you? Do I have to benchmark in Shader Model 1.1 on all cards because ATI didn't support SM3.0 and NVIDIA couldn't use the 'ATI' SM2.0 patch just to be (un)fair to both parties? Would you play the game in SM1.1 mode if you spent £200+ on a GeForce 6/7?

    Your question applies equally to the FX-57 - how many people have an FX-57? And in response to your first question, I believe there are more dual core gamers than you think.
     
  6. mendreks

    mendreks New Member

    Joined:
    9 Oct 2005
    Posts:
    24
    Likes Received:
    0
    Me want one (or 2)
     
  7. skank

    skank New Member

    Joined:
    18 Oct 2005
    Posts:
    13
    Likes Received:
    0
    I think you have misunderstood me. I am aware of the lack of AA + HDR on NVIDIA cards, and I agree with the 'highest playable settings' testing model. But to take your SC:CT point, all the reviews I have read of that game highlight the fact that you could only use SM2.0 on ATI (patched) rather than SM3.0 (pre X1000 series), and that this would affect the results - whereas you didn't highlight the drivers' difference in dual core support. Also, I said FX-57 because that is what you said was the other choice in the poll; I wasn't referring to its user base, it is merely dual core versus single core that I was trying to get across. In my opinion, the highest playable settings for all the cards tested may have differed on a single core setup. Oh, and you can't deny that there are more single core gamers than dual core - there may be more or fewer than I think, I never stated a number. :D

    Edited to state SC:CT patched :p
     
  8. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    I've found very little difference between FX-57 and X2 4800+ in all honesty.

    I also upgraded the memory this time around. That made some games react totally differently (BF2, for example) because of the extra system memory, but I've not found much of a difference between the X2 4800+ and FX-57 when I was playing around with 1GB of memory on an X2. All games that are currently available are single threaded anyway, so I don't believe that the difference is as big as you're making it out to be.
     
  9. skank

    skank New Member

    Joined:
    18 Oct 2005
    Posts:
    13
    Likes Received:
    0
    I'm not talking about the difference between the FX-57 and the X2; I'm talking about the fact that the NVIDIA drivers are using the second core for some vertex processing, IIRC, whereas the ATi drivers are using only one core for the game. I know that games are single threaded now, but to highlight the performance difference, here is a post at nvnews.

    The ATi card you tested is basically running like the NVIDIA card in those results with the /ONECPU switch. I would say that those are reasonably significant performance gains.

    Sorry I couldn't find more - I'm at work right now - but I have read that the performance gains can be even larger than that.
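
    For anyone wanting to try this themselves: /ONECPU is a standard Windows XP boot.ini switch that boots the machine with only one core enabled, which is exactly the single core versus dual core test being suggested here. A minimal example entry (the disk/partition paths are illustrative and will vary per system):

        [boot loader]
        timeout=30
        default=multi(0)disk(0)rdisk(0)partition(1)\WINDOWS
        [operating systems]
        multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows XP" /fastdetect
        multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows XP (one core)" /fastdetect /ONECPU

    Rebooting into the second entry leaves everything else identical, so any frame rate difference can be attributed to the dual core driver support.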
     
  10. Aphex_

    Aphex_ New Member

    Joined:
    16 Oct 2005
    Posts:
    20
    Likes Received:
    0
    I was just wondering if there was any word on the overclockability of the X1800XL? If you are having to use beta drivers, I'm guessing ATITool hasn't got anything that will do it yet? It would be nice to know what sort of headroom the R520 has in it.
     
  11. skank

    skank New Member

    Joined:
    18 Oct 2005
    Posts:
    13
    Likes Received:
    0
  12. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    Notice that they're running games that I've not used, and they're also running time demos, which do not reflect real world game performance. I'm talking about resolution and detail scaling here, not frame rate scaling. A game is either playable or it isn't, and the difference between the FX-57 and X2 4800+ does nothing to change playability in my personal experience.

    We're using drivers that are publicly available for NVIDIA hardware from an official source, and we're using a CPU that was chosen by the majority of bit-tech readers who voted in a recent poll in the hardware forum. You can't actually get the ATI drivers unless you buy an X1800XL - that's the only part of the whole review that is not publicly available. I challenge anyone to prove that the settings we've reported do not fit inside the boundaries that we set out.

    In fact, the only setting that fell outside them was on the Sapphire Radeon X1800XL in F.E.A.R., where the average frame rate was 49.5 fps but the minimum was 18 fps, and the game was a lot smoother than on the XFX 7800 GT Extreme, which scored a 60.4 fps average and an 18 fps minimum. As I said in the review, I'd take the Sapphire's smoothness over the higher average (but noticeably choppier) frame rate returned by the XFX card any day.
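
    To make the 'highest playable' method concrete, here is a minimal sketch of the idea in Python - illustrative only, with an assumed playability floor and made-up settings and results (loosely shaped by the F.E.A.R. figures above), not bit-tech's actual test harness:

        # Walk down from the most demanding settings and stop at the first
        # combination whose *minimum* frame rate stays playable. The floor
        # here is an assumption; in practice playability is judged by feel.
        MIN_PLAYABLE_FPS = 25

        # Candidate settings, ordered from most to least demanding (assumed values).
        SETTINGS = [
            ("1600x1200", "4xAA/8xAF"),
            ("1600x1200", "0xAA/8xAF"),
            ("1280x1024", "4xAA/8xAF"),
            ("1280x1024", "0xAA/8xAF"),
        ]

        def run_benchmark(resolution, quality):
            """Stand-in for playing through a real level while logging frame
            times. Returns (average_fps, minimum_fps); hard-coded here so the
            sketch runs on its own."""
            fake_results = {
                ("1600x1200", "4xAA/8xAF"): (49.5, 18.0),
                ("1600x1200", "0xAA/8xAF"): (60.4, 26.0),
                ("1280x1024", "4xAA/8xAF"): (65.0, 31.0),
                ("1280x1024", "0xAA/8xAF"): (80.0, 40.0),
            }
            return fake_results[(resolution, quality)]

        def highest_playable():
            # A game is either playable or it isn't: report the first (most
            # demanding) combination that clears the minimum-fps floor.
            for resolution, quality in SETTINGS:
                avg, minimum = run_benchmark(resolution, quality)
                if minimum >= MIN_PLAYABLE_FPS:
                    return resolution, quality, avg, minimum
            return None

        print(highest_playable())

    Note that a high average with a low minimum fails such a floor, which is the point being made about averages versus minimums.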

    /ponders Gosh, I must be NVIDIA's PR bitch because I used a dual core CPU. :worried:
     
  13. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
  14. skank

    skank New Member

    Joined:
    18 Oct 2005
    Posts:
    13
    Likes Received:
    0
    Yeah I know, but 1GHz :jawdrop:
     
  15. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    The clocks are impressive, but the XL and XT have different core revisions AFAIK. I don't have the Sapphire card anymore and I didn't have an overclocking tool, so I can't tell you how well a retail XL clocks at the moment...
     
  16. skank

    skank New Member

    Joined:
    18 Oct 2005
    Posts:
    13
    Likes Received:
    0
    I don't know why you keep bringing up the comparison between the X2 and the FX - I don't care about that comparison (I merely said FX in my earlier post because it is single core). I'm just saying that you should inform your readers that there is a likely performance gain for the NVIDIA card on the X2, because you are using drivers from NVIDIA that support dual core against drivers from ATi that don't - just as readers are normally informed when ATi is using SM1.1 or 2.0 or 2.0+ or whatever, which can affect performance and playable settings. Perhaps you could do your own testing to see whether there are any real world benefits from the dual core aware drivers, i.e. whether, when you disable one of the cores on the X2, you have to drop some detail? At the end of the day, timedemo or not, higher fps normally means that at the same resolution you can enable higher detail, or maybe go up a level in AA, while maintaining the same playability. I'm not saying you are on NVIDIA's PR team or anything (really though, how much are they paying you?) :hehe:

    I just like to see impartiality, and omitting the fact that NVIDIA is using both cores in its drivers (which I think is fantastic, by the way) does not paint a fair picture of what is really happening.
     
  17. skank

    skank New Member

    Joined:
    18 Oct 2005
    Posts:
    13
    Likes Received:
    0
    Looks pretty good so far :rock:
    http://www.xtremesystems.org/forums/showpost.php?p=1075680&postcount=72
     
  18. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
    As I've said, I discussed the CPU and left it open to the readers to choose between two options - I went with what the majority of them said they'd buy if given the choice.

    Now you're telling me that I need to be impartial and not use the latest publicly available drivers on one IHV's products, in order to disable an advantage they have because the other team haven't adopted dual core support in their drivers just yet. When ATI get around to adding dual core support to their drivers, we'll use them without a doubt. Until then, we're in the "7800 GTX vs no product" situation again.

    What if it takes ATI a year to get dual core support into their drivers - long after single core CPUs are (likely to be) history? Anyone buying a system today would be unwise to buy a single core CPU unless gaming is all they're going to use it for. Most people who would drop $1000/£650 on a CPU would use it in their main computer, which doubles as a gaming machine too.

    Again, there's no doubt that we'll look at ATI's new products in the future when we get samples from other board partners. If and when ATI release new drivers that add dual core support, we'll revisit them with another product review.

    We can only tell it like it is now, and this is how it is.
     
  19. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,881
    Likes Received:
    78
  20. skank

    skank New Member

    Joined:
    18 Oct 2005
    Posts:
    13
    Likes Received:
    0
    I'm not telling you to use any other drivers. I just think it would have been prudent of you to mention the dual core support in the drivers and its possible impact, just as I said before with regard to the ATi cards using SM1.1, 2.0 etc. Uninformed readers may take the performance advantage to be down to the cards alone, without regard for the fact that you are testing on a dual core platform. All the reviews from reputable sites (I can't remember if I read a bit-tech review of that game, but I do count you as reputable) stated the SM situation and explained the impact on gamers. Also, I would be dead interested in seeing the performance gains tested as I said, by disabling a core - I'm sure others would be too. :)
     