Half Life 2 Performance Evaluation

Discussion in 'Article Discussion' started by WilHarris, 1 Dec 2004.

  1. Arena_08

    Arena_08 What's a Dremel?

    Joined:
    1 Dec 2004
    Posts:
    61
    Likes Received:
    0
    I have an AMD 2600+ and 1GB of unbranded RAM, which I am upgrading this month to 1GB of Kingston HyperX CL2 RAM. I have an FX5600 256MB, which I am also going to be upgrading to a 6600GT later this month.
    Not all of us are in your situation; please try and see it from another perspective.
     
  2. rowpie

    rowpie What's a Dremel?

    Joined:
    15 Sep 2004
    Posts:
    14
    Likes Received:
    0
    I'd be curious to see how it would have performed on the systems it was originally intended for, back when it should have been released all those many moons ago - when 64-bit was still rather strange, we all overclocked Bartons and the 9800 XT was king.

    It would probably need to come under a separate article, though.

    As for current mid-range hardware levels, you could argue about them for ages. It's very difficult to decide what is mid-range, as even within that bracket there is quite a price gap. The top range is much simpler.
     
  3. play_boy_2000

    play_boy_2000 ^It was funny when I was 12

    Joined:
    25 Mar 2004
    Posts:
    1,646
    Likes Received:
    196
    Your system is more than enough to play the game, so GO BUY IT, if you haven't already done so! See for yourself what the performance is!

    I honestly don't know what you're complaining about - your system specs aren't that far from the underclocked FX-55. The CPU doesn't have a huge performance impact on this game, and there's no way Bigz can realistically represent every single CPU out there without sacrificing more time than he already generously provides to post reviews for all you people.

    Be thankful for what you do get...
     
    Last edited: 2 Dec 2004
  4. Reaper_Unreal

    Reaper_Unreal Minimodder

    Joined:
    16 Apr 2002
    Posts:
    380
    Likes Received:
    0
    I've got a question: why did you compare a 9800 Pro to a 6600GT? They're from completely different generations. You should be comparing an X600 to a 6600GT. If you're going to use a 9800 Pro, compare it to an FX5800. That would make the most sense.
     
  5. Jamie

    Jamie ex-Bit-Tech code junkie

    Joined:
    12 Mar 2001
    Posts:
    8,180
    Likes Received:
    54
    A 9800 is dx9 whilst FX 5950 and below are dx8.1 so you can't compare them.
     
  6. 0013

    0013 What's a Dremel?

    Joined:
    7 Mar 2004
    Posts:
    374
    Likes Received:
    0
    Then why do the box of my 5900XT and AIDA32 claim that it has DirectX 9.0 hardware support?
     
  7. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    Because they are in direct competition with each other - ATI do not have an AGP version of the X700 series yet, because their bridge chip, RIALTO, has not turned up yet.

    http://bit-tech.net/review/371 - that is the exact battle that I have compared here in HL2. If I was benchmarking PCI-Express (for which I don't have anywhere near the same array of video cards), I would be comparing the 6600GT to the X700 XT and X700 Pro, as they are in the same price bracket. :)

    They could be compared, but NVIDIA have a new mainstream AGP-based video card that is available to purchase (the 6600GT). If the 6600GT hadn't been introduced, the mainstream battle would have been between the 5900XT and 9800 Pro.
     
  8. Fod

    Fod what is the cheesecake?

    Joined:
    26 Aug 2004
    Posts:
    5,802
    Likes Received:
    133
    It can do DX9, but it's not actually 100% DX9 compliant. DX9 calls for 24-bit floating-point precision, whereas the GeForce FX cards use a combination of 16-bit and 32-bit floats. If a game is coded with this in mind, there is a performance gain, but if it sticks rigidly to the DX9 spec, the FX cards are forced to use 32-bit precision all the time, slowing things down.

    Needless to say, the 9800 is 100% DX9 compliant.
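    [Editor's note: a minimal sketch of what "coded with this in mind" looks like in practice, assuming a Shader Model 2.0 HLSL pixel shader; the sampler and light names here are made up for illustration, not taken from Half-Life 2's actual shaders. Marking values with the `half` type hints that 16-bit precision is acceptable; the GeForce FX can run those in its faster 16-bit registers, while strictly-DX9 hardware like the 9800 simply runs them at its native precision.]

    ```hlsl
    // Hypothetical SM2.0 pixel shader fragment illustrating precision hints.
    sampler2D baseMap;        // assumed diffuse texture (illustrative name)
    float3 lightColour;       // assumed per-pass light colour (illustrative name)

    half4 main(float2 uv : TEXCOORD0) : COLOR
    {
        half4 base = tex2D(baseMap, uv);            // colour data rarely needs full precision
        half3 lit  = base.rgb * (half3)lightColour; // a simple modulate is fine at 16-bit
        return half4(lit, base.a);                  // hardware without 16-bit support just promotes 'half'
    }
    ```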
     
  9. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    It's silly though, because there are many many areas in HL2 that do not require 24-bit precision in order to be rendered correctly. :)
     
  10. barry_n

    barry_n What's a Dremel?

    Joined:
    24 Aug 2004
    Posts:
    338
    Likes Received:
    0
    Bigz, the system I run is:

    P4 2.8C (ES) @ 3.3GHz
    512MB GeIL Ultra Plat DDR550
    9800 Pro 128MB flashed to XT

    In my opinion that's not much different to your setup, apart from the fact that you have 1GB of RAM. Going back to the previous discussion, I think the mainstream users out there are currently at 512MB of RAM (soon I will be upgrading to 1GB), so maybe in your mid-range card battle you could have used 512MB of RAM. Just a thought :D

    Now my query... the game decided that for your system it would be best to have high quality models and high quality textures; on mine it decides medium for both. Is that purely because of the difference in RAM between our systems?

    Good article btw :thumb:
     
  11. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    I think that most mainstream gamers are starting to move towards 1GB, which is why I chose to go with 1GB of slightly slower memory on the mainstream system. At entry level (X300, 6200, X600, 9600, etc), I feel that 512MB is still standard for a gamer, so I will be using 2x256MB for the purposes of an entry-level video card.

    I haven't actually played HL2/CS:S with less than 1GB of memory yet :blush:

    My main rig has 1GB of memory in it, as does the video card review rig, at least as things currently stand. Maybe someone else could confirm your thoughts? :)
     
  12. aCe2k

    aCe2k What's a Dremel?

    Joined:
    1 Jan 2002
    Posts:
    89
    Likes Received:
    0
    It struck me while reading that you didn't test any of the graphics cards with an Intel-powered board - all AMD systems with über specs. Any reason for this? If I'm not mistaken, a GeForce performs better on an Intel board, while an ATI performs better on an AMD/VIA board.

    At least my experiences have tilted towards those thoughts. :hip:
     
  13. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    Not quite sure where you got that from... NVIDIA produce chipsets for AMD, and have done for a long while. As yet, they don't have an Intel chipset, but that will change sooner or later.

    I don't think the platform makes much difference, to be honest.
     
  14. LockmanX

    LockmanX What's a Dremel?

    Joined:
    8 Jun 2003
    Posts:
    387
    Likes Received:
    0
    On the subject of testing on more 'average' systems: I support this suggestion. I love watching the latest hardware just tear away at the latest game as much as the next guy, but at the same time, I know I probably won't see top-end hardware for a long time. Like most gamers, I'm on a fixed income. Despite our desire to do so, we just can't upgrade every generation or so. We all know this. Now, I believe this could become a very interesting part of applicable bit-tech reviews, but we still have to address one issue: just what exactly is a mainstream, average, mid-range system?

    I guess we have to look at this in reference to three main parts: video card, processor speed (mind you, not brand), and RAM.

    Some people say the average speed for a processor is around 2.8 to 3GHz. Others suggest 2GHz. Not trying to be an authority or anything here, but I personally feel the average processor speed is probably around 2.4GHz. That puts me in the low end, I suppose.

    RAM. Most of us understand how vital RAM is. I use 512MB of PC2700. I think the size is right but the speed is slow. So, again, I personally feel that 512MB of PC3200 from a respectable brand is probably average. Most everyone I've met at a LAN uses 512-768MB. A good few use 1GB, and I agree 1GB+ is in most everyone's upgrade plan, but I don't think it's close enough to a majority to call it the norm.

    Video cards... this is where it can get simple or hopelessly complex, isn't it? *sigh* Well, I'll put my opinions up to bat again here. First, I don't think it's fair to define just a mid-range card here. I know plenty of people with a good or above-average system but a rather crap GPU. My lack of ability to express my thoughts in words is coming into play here :sigh: I suppose when it comes down to it, you should take into account a mid-range system with a mid-range card, but also a mid-range system with a low-end card. Establishing references for just where these 'ranges' begin and end is a bit tricky. Though, personally (that word again), I feel that you could call a 9800 Pro mid-range and a 9600 Pro low-end (and whatever their nVidia equivalents are). I feel the 9600 Pro/XT are still capable cards, but obviously not even close to top notch. Their ability to make even the latest games look decent enough while maintaining playability allows them to serve the 'low-end' role pretty well. A card like the 9800 Pro (or 6600GT) will handle such games a lot better, but not as well as high-end cards. Again, very fitting of the mid-range role.

    High-end parts are easy enough to pick out of a crowd, but finding relation to older hardware is not so easy. I'd still like to see how well a 9600XT stacks up against a 9700 Pro.

    In summary, I think including comparable hardware in reviews is a good idea. Until someone starts to throw around high-end 'toys', many of us are stuck with our 9600s and 2200+'s. All you really need to do (aside from the physical logistics faced by the reviewer) is establish a standard as to just what is what.

    ...I hope that wasn't too long...
     
  15. aCe2k

    aCe2k What's a Dremel?

    Joined:
    1 Jan 2002
    Posts:
    89
    Likes Received:
    0
    LockmanX, then I have an idea...

    Half-year-old CPU/RAM
    One-year-old GPU

    It's what me and my m8s have atm. But dunno what X-mas will bring :rock: :clap:
     
  16. aCe2k

    aCe2k What's a Dremel?

    Joined:
    1 Jan 2002
    Posts:
    89
    Likes Received:
    0
    That's not what I said :D

    I said that in my experience a GeForce performs better on an Intel (mother)board, while an ATI performs better on an AMD/VIA (mother)board.

    Brand-wise, this would be NVIDIA/Abit (Intel chipset) vs. ATI/MSI (VIA chipset).
     
  17. Tim S

    Tim S OG

    Joined:
    8 Nov 2001
    Posts:
    18,882
    Likes Received:
    89
    Personally, I've not experienced that by any stretch of the imagination - the results are pretty similar across the board.
     