The Battle Of The X800s

Discussion in 'Article Discussion' started by Tim S, 22 Jun 2004.

  1. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    They should be the same tbh... I don't have the cards anymore, but I'm pretty certain they followed the same pattern as my 9800PRO
     
  2. RotoSequence

    RotoSequence Lazy Lurker

    Joined:
    6 Jan 2004
    Posts:
    4,588
    Likes Received:
    7
    So why not look at the collection of backside photographs and take one of the 9800 for comparison? ;)
     
  3. fathazza

    fathazza Freed on Probation

    Joined:
    20 May 2002
    Posts:
    3,256
    Likes Received:
    16
    # XIII
    # Prince Of Persia
    # URU
    # 14 in 1 games package including:

    * Praetorians
    * Black Hawk Down
    * YAGER
    * Joint Operations Typhoon Rising
    * Heaven & Hell
    * Comanche 4
    * Divine Divinity
    * American Conquest
    * Etherlords 2
    * Commandos 3 Destination Berlin
    * IL-2 Sturmovik
    * Beach Life
    * Splinter Cell
    * Deus Ex: Invisible War

    Gotta say, that game bundle is truly staggering. Almost tempted to buy the MSI one just for the feeling that I'd be getting that bargain pile of CDs with it ;)

    If only I didn't need a whole new comp to go with it.
    :miffed:
     
  4. RotoSequence

    RotoSequence Lazy Lurker

    Joined:
    6 Jan 2004
    Posts:
    4,588
    Likes Received:
    7
    HOLY ****! :eek: I just did the math: if you were to purchase all of these games at a retailer, it would cost you $485, +/- $20. The MSI retail card is $400! This video card PAYS FOR ITSELF with the number of games you get! :rock:
     
  5. quack

    quack Minimodder

    Joined:
    6 Mar 2002
    Posts:
    5,240
    Likes Received:
    9
    Now, how do these compare to the latest and greatest from NVIDIA? That's what I want to know :) With the raw power both offerings have, it's getting harder to keep my credit card safe from a huge graphics card purchase.

    Excellent review, bigz.

    I like the Best Value/Best Of Test images; the new bit-tech logo is really growing on me.
     
  6. Firehed

    Firehed Why not? I own a domain to match.

    Joined:
    15 Feb 2004
    Posts:
    12,574
    Likes Received:
    16
    I emailed DangerDen about this, and they said that the blocks are compatible. Last I knew, ATI didn't like irritating its customers with things like this, especially with NEW expensive cards which are comparable in price and performance to their competitor's new expensive card.

    I like how they use 500MHz memory on these cards even if it comes clocked at... 450MHz, I believe. My 9600 has exactly the same deal... although there's some bizarre driver lock preventing overclocking it (same with the 9000 and 9200). Things like that are always a bonus in my book... especially if you wanna risk your new ~$400 investment by trying the Pro -> XT mod, which involves some rather risky work.
     
  7. Pookeyhead

    Pookeyhead It's big, and it's clever.

    Joined:
    30 Jan 2004
    Posts:
    10,961
    Likes Received:
    561
    Pretty amazing that a 3800+ is the bottleneck! LOL. Would love to see an NVIDIA vs. ATI showdown running all the same benches... that would be interesting indeed.

    Far Cry's AA settings overriding the driver's... that could explain a few things I've noticed around here.

    Good review :thumb:
     
  8. Fly

    Fly inter arma silent leges

    Joined:
    31 Aug 2001
    Posts:
    3,763
    Likes Received:
    3
    watch this space! ;)
     
  9. Hepath

    Hepath Minimodder

    Joined:
    20 Oct 2003
    Posts:
    730
    Likes Received:
    0
    Just a quick note...

    ...I couldn't see anything about the noise these cards produce in their "out of the box" state.

    I have an MSI Ti4600 and it's very noisy... looking at the size of those fans, are they too noisy?

    Stu
     
  10. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    When they initially boot up, the fan is very noisy, but then slows down to a very slow and quiet purr by the time it has detected your hard drives on the POST screen. They're considerably quieter than the 5950 Ultra, and I'd say they're quieter than the 6800 Ultra too. If you've seen/heard a 9800XT, you're looking at about the same noise levels. :)
     
  11. Scortch

    Scortch What's a Dremel?

    Joined:
    24 Jun 2004
    Posts:
    3
    Likes Received:
    0
    I don't understand why you chose the old 5xxx series card from Nvidia to compare drivers and image quality against ATI's newest generation. Everyone knows the 5xxx series' IQ and AA isn't the greatest. Why blast Nvidia using last-generation cards? Why didn't you use the 6800 to compare drivers with? That would have been a MUCH fairer test since you compared drivers and IQ.

    It looked to me that it wasn't totally an X800 comparison and some cheap shots were taken at Nvidia using old technology against ATI's new.

    I thought this was going to be a comparison of 3 different X800 cards, not a blast Nvidia article.

    Next time you should leave out this sort of bashing when comparing the same model of cards. If you were going to compare ATI and Nvidia drivers and image quality, you could at least have used Nvidia's new-generation card to match ATI's new-generation card.

    Sorry if this comes off a little harsh, but I thought it was totally uncalled for to use Nvidia's old-generation card against ATI's new-generation card to compare drivers and image quality, especially since we all know the 5xxx series' IQ wasn't the best.

    Bit-Tech has lost a lot of credibility with me :(
     
  12. RotoSequence

    RotoSequence Lazy Lurker

    Joined:
    6 Jan 2004
    Posts:
    4,588
    Likes Received:
    7
    Might I ask why? :eyebrow: This article was not in the slightest the "blast Nvidia" article you claim it to be. This was meant to be a comparison of ATI's latest tech to the previous generation of hardware. Using Nvidia's latest drivers, we know that the image quality is in fact lower than ATI's in most cases. Does this blast Nvidia? Yes. Is it biased? No. Big Z hasn't had a chance to compare these cards against Nvidia's 6800 series graphics cards yet. That is why you don't see anything about how they stack up to the new Nvidia card.

    This article was in no way meant to be biased. The fact of the matter is that ATI's stuff is just plain better at this point in time when compared with the previous generation of graphics tech. I see no reason to have less faith in Bit-Tech simply because, in real-world tests, Nvidia did not fare as well as you wanted it to.

    Sorry if I sounded harsh about that, but it's true. And on a side note, welcome to the forums. :worried:
     
  13. Scortch

    Scortch What's a Dremel?

    Joined:
    24 Jun 2004
    Posts:
    3
    Likes Received:
    0
    The point is, an Nvidia vs. ATI driver comparison should never even have come into the article and discussion. This is especially true since it was supposed to be an article comparing X800s from different companies, not an ATI-versus-Nvidia article.

    I didn't expect the 5xxx series to win. Everyone knows it's less of a card IQ-wise than ATI's. I just expected to find an X800 shootout, not an ATI vs. Nvidia battle.

    Bashing Nvidia's 5xxx series and comparing drivers between ATI & Nvidia simply should have never been in an article about an X800 shootout.

    I know the old 5xxx series isn't great. I am not saying it was.

    The article should not have been labeled as an X800 shootout, if it was going to be about ATI's new cards and drivers against Nvidia's old cards and drivers.
     
  14. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    Welcome to the forums, first of all. You may want to hold tight, as this is going to be a long, opinionated post defending my stance on how I've conducted the review, with reasoning for how and why I have done things.

    Initially, I was highlighting an issue within Splinter Cell: Pandora Tomorrow, where I couldn't ignore the poor image quality that the ATI card was showing. ATI are addressing the bug, and I'll be testing that in the near future to see how well the cards render SC:PT with the shadowing working correctly. When I came to look at the image quality of this one game, I thought it was only right to show that the image quality on the X800 wasn't as bad across every other game title that I'd tested... otherwise I'd be basing a conclusion that the cards are worth purchasing on a big problem in one game, with no proof of whether any other game had a problem. So, what would you rather see?

    Me not bringing the Splinter Cell: Pandora Tomorrow issue into the limelight, or me providing image quality comparisons for every game that I've tested? I know which one I'd prefer. Both ATI and NVIDIA have commented on the review, and were both happy with the issues/bugs that I'd raised in the article. Benching graphics cards is no longer just about how fast they are, it's about how well they render a game.

    There are numerous occasions where I talk about the NVIDIA card having much better image quality than the X800. The 5950 Ultra can be forced to do full trilinear filtering, whereas the R420 runs with brilinear filtering (a blend of bilinear and trilinear filtering). This is our first look at the R420, so image quality comparisons are necessary IMHO.
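    Purely to illustrate what I mean by those filtering terms, here's a rough sketch of the idea in Python (my own illustration; the function name and the 0.2 blend window are made up, and this is not how any actual driver implements it): bilinear snaps to the nearest mip level, full trilinear always blends linearly between the two nearest mips, and brilinear only blends inside a narrow band around the mip transition.

        # Rough sketch of bilinear vs. trilinear vs. "brilinear" mip blending.
        # Simplified concept only - the real blend window used by the drivers
        # is not public and will differ from the made-up value below.

        def mip_blend_weight(lod, mode, window=0.2):
            """How much of the *next* mip level gets blended in at a given LOD.

            lod    -- fractional mip level, e.g. 2.35 = 35% of the way from mip 2 to mip 3
            mode   -- "bilinear", "trilinear" or "brilinear"
            window -- half-width of the blend region around the mip transition (hypothetical)
            """
            frac = lod - int(lod)  # position between the two nearest mip levels (0..1)

            if mode == "bilinear":
                # Never blend between mips - just snap to the nearest one.
                return 0.0 if frac < 0.5 else 1.0

            if mode == "trilinear":
                # Always blend linearly between the two nearest mip levels.
                return frac

            if mode == "brilinear":
                # Act like bilinear for most of the range; only blend inside a
                # narrow window around the mip transition at frac = 0.5.
                lo, hi = 0.5 - window, 0.5 + window
                if frac <= lo:
                    return 0.0
                if frac >= hi:
                    return 1.0
                return (frac - lo) / (hi - lo)

            raise ValueError("unknown mode: " + mode)

        # Example: at LOD 2.3, trilinear blends roughly 30% of mip 3 in, while
        # bilinear and this brilinear sketch still use mip 2 exclusively.
        print(mip_blend_weight(2.3, "trilinear"))  # ~0.3
        print(mip_blend_weight(2.3, "bilinear"))   # 0.0
        print(mip_blend_weight(2.3, "brilinear"))  # 0.0

    The practical upshot (and why reviewers care) is that the narrower the range you actually blend over, the fewer texture samples you fetch, so brilinear is faster than full trilinear but can show visible banding at mip transitions.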

    Halo - the IQ on the X800 is slightly better; this could be down to the 5950 Ultra's IQ not being quite so good... but if you read on through the rest of the pages of the review, you will see that this is clearly not the case.

    The Anti-Aliasing issue in Far Cry is true for the 6800 Ultra too... it is a driver bug, much the same as the bug in Homeworld 2, where the NV cards aren't running with the correct Anti-Aliasing samples. If I had had an NV40 at the time of writing the review, I would have included image quality shots from the NV40 too.

    UT2004 is more than acceptable on both cards; the only difference between the NVIDIA 5950 Ultra and the X800 was the brightness of the lighting in the game. That's hardly worth worrying about - just adjust the brightness on your monitor if it is too dark. I play games at high brightness (75% on my OSD) whereas I run Windows at 25% brightness - gone are the days of the great gamma that 3dfx cards had.

    In Call of Duty, I comment on the fact that the NVIDIA card is actually rendering the game better than the X800, due to Trilinear/Brilinear filtering differences... take a look at the images again... Call Of Duty comparison.

    Again in X² - the NVIDIA card renders the game much better... X² - The Threat comparison

    I don't need to say more than these two images:

    [Image: Cat 4.6]

    [Image: Forceware 61.34]

    Which is rendering better? I'll let you answer that...

    Next comes Homeworld 2 - apart from the Anti-Aliasing bug, I've stated that the NV card actually seems to be rendering more detail. Check the ship (not the harvester or mothership) on the mouse-over comparison... I would say the NVIDIA 5950 Ultra is rendering the game in slightly more detail, aside from the AA bug that I've highlighted.

    Splinter Cell: Pandora Tomorrow... I don't need to say anything really...
    [Image: Cat 4.6]

    [Image: Forceware 61.34]

    And again in NFS:U, I state that the NVIDIA card renders the game better, but cannot run 8xSAA very efficiently... It is my job to highlight the bad points, and 8xSAA is unfortunately one of NVIDIA's - there's a huge performance hit.

    Now, would you rather I didn't bring the "issues" to your attention and just gave you numbers where the games may look like complete and utter crap, or would you like me to give you results with image quality comparisons to show the actual game experience that the consumer will be presented with? Actual game experience is where it should lie... If I were to benchmark the XGI Volari Duo V8 Ultra and didn't highlight the frankly awful image quality, I'd technically be lying to my readership, which we all know isn't right.
     
  15. Scortch

    Scortch What's a Dremel?

    Joined:
    24 Jun 2004
    Posts:
    3
    Likes Received:
    0
    Nope, bringing issues to light is totally cool. There is no problem with that. It's just that when you do an X800 shootout, I only expected to see a review of the different X800s from the different companies. I didn't expect to read yet another driver-versus-driver review, or ATI-versus-Nvidia review. That's all.

    A separate review bringing out the issues and such would have been in order if you ran into such problems during your review.
     
  16. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    but you say I slate NVIDIA, when actually I did more ATI slating
     
  17. Ds3

    Ds3 What's a Dremel?

    Joined:
    20 May 2004
    Posts:
    278
    Likes Received:
    0
    Quick question: I notice from the above that people are guessing the HSF mounting holes are the same as on the 9x00 series, so I presume this will allow the fitting of an Arctic cooler to the X800? I have an ATI X800 and have pushed it as far as 11k in 3DMark03, but it's crying out for better cooling ;) Nice review btw :D

    Cheers.
     
  18. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    Yes, the Arctic VGA Silencer should fit on there; HIS have an IceQ X800 which uses the VGA Silencer - I'm fairly sure that card follows the reference design. :)
     
  19. Ds3

    Ds3 What's a Dremel?

    Joined:
    20 May 2004
    Posts:
    278
    Likes Received:
    0
    Cheers for the reply :D

    I've actually just been told by someone in a different forum who tried it that, whilst it does fit, the fan plug on the cooler is only 2-pin whereas the socket on the board is 3-pin. Not a huge problem, but Rev 4 should be out soon and should fit properly, and apparently it is much more efficient at cooling - up to 10 degrees better :eeek: Will wait for that, I think :)
     
  20. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    I can now confirm that the image quality problems that I pointed out in Splinter Cell: Pandora Tomorrow have been fixed. The fix is expected to be included in the next Catalyst driver release, which will be available very soon.

    Cheers. :hip:
     