
Hardware Fermi Testing Update

Discussion in 'Article Discussion' started by Lizard, 31 Mar 2010.

  1. Xir

    Xir Modder

    Joined:
    26 Apr 2006
    Posts:
    5,412
    Likes Received:
    133
    but but but...in the CPU reviews it always says your overclocks are...stable! :D

    couldn't resist

    But seriously, if you see problems after a prolonged overclock, wouldn't that be worth an article?
     
  2. LightningPete

    LightningPete Diagnosis: ARMAII-Holic

    Joined:
    2 Jul 2009
    Posts:
    307
    Likes Received:
    0
    Yeah, what's the HD470? :p
     
  3. Action_Parsnip

    Action_Parsnip What's a Dremel?

    Joined:
    3 Apr 2009
    Posts:
    720
    Likes Received:
    40
    Man, you have a LOT to learn. Ever heard of Linpack?
     
  4. Action_Parsnip

    Action_Parsnip What's a Dremel?

    Joined:
    3 Apr 2009
    Posts:
    720
    Likes Received:
    40
    The 4870 1GB is probably better at Arma II than the GTX 480. HardOCP showed that when they benchmarked it.
     
  5. Action_Parsnip

    Action_Parsnip What's a Dremel?

    Joined:
    3 Apr 2009
    Posts:
    720
    Likes Received:
    40
    I'm having trouble understanding how your overclocks aren't lasting more than a year. You have to look at voltage and temps. I can't believe electromigration would occur that quickly unless the temps are on average pretty high or the voltage used is a bit excessive. Whatever passes four hours of Linpack without creating a burning smell should be good to go for longer than a few months. I don't fold, but my rig is on all day, going from web browsing to hibernation to full-on gaming and back again on a very regular basis, running multiple titles on varying engines. Finding my max graphics overclock down to the nearest 3MHz one day using Crysis Warhead only ever produced VPU recovers and no crashes to desktop. My max graphics OC ran stable for an hour and twenty minutes until I gave up out of boredom; 3MHz faster on the GPU core didn't make twenty minutes, so it's nailed-on stable. Now I'm thoroughly sick of Crysis for good.

    It's just one example, but this thing runs Source games, BF2 and UT3 games for (shamefully) hours at a time. Stability is not an issue, and if anything is going to find flaky overclocks faster than Linpack, it's games. Left 4 Dead will root one out in less than 10 minutes, 40+ minutes faster than Prime95 did (one scenario I vividly remember).

    I've got 3.475GHz on this Q6700; Linpack load is 82 degrees on cores 2 and 3, and gaming doesn't even reach 60. Vcore is 1.4375V. If you have an overclock that billows great gobs of heat, or passes 55 degrees on water cooling on anything except Linpack, then don't fold on it, period.

    On a side note, regarding CPU power with respect to games: if a title has lots of AI and/or streaming going on, then yes, the CPU will matter.

    Examples include GTA IV, S.T.A.L.K.E.R., Dragon Age: Origins and Crysis with the max object detail setting. Probably, no, almost definitely, put Arma II on that list too. My 4870 can seemingly move mountains compared to what it could do when I had a 2.8GHz 2MB L2 cache Allendale.
     
  6. cybergenics

    cybergenics What's a Dremel?

    Joined:
    27 Jun 2009
    Posts:
    613
    Likes Received:
    17
    I would argue their overclocks don't last because they seem to adopt the "more is better, boatloads is best" approach to voltages. With respect, over the years (apart from when I fried a mobo trying to duplicate the D805 results) I have always had better or equal overclocks to CPC/BT on the same or similar kit, often with far less voltage. The recent 930 voltage test drew gasps over the voltage used on several popular overclocking forums; as I have said before, people will take that as gospel and wreck their kit using those voltages long term. Serial upgraders may not notice, as the thermal paste is still 'going off' by the time they upgrade.

    It wasn't until the recent overclocking article about various CPUs that I first saw CPC admit that not all Q6600s would clock to even 3.4 (even G0s), when for years they had been telling everyone any old G0 would hit 3.6 'on air'. Maybe so, but most with the stability of me after six litres of Tesco Value cider.
     
  7. AstralWanderer

    AstralWanderer What's a Dremel?

    Joined:
    17 Apr 2009
    Posts:
    749
    Likes Received:
    34
    Erm, the same number as those who can afford a GTX 480? ;)
     
  8. thehippoz

    thehippoz What's a Dremel?

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    Intel has specs posted for their chips: http://processorfinder.intel.com/

    If you're looking for longevity, stay under the voltage limits posted. I'm sure the engineers who make these things don't pull those numbers out of the air.
     
  9. hrtz_Junkie

    hrtz_Junkie Controversial by Nature

    Joined:
    25 Jul 2009
    Posts:
    30
    Likes Received:
    0
    @Action_Parsnip +1, for real.

    I still remember trying to play Oblivion. At the time I had a P4 running stock at 3.2GHz, 2GB of RAM, and two 7800 GTXs in SLI.

    Seriously, on that rig the game was unplayable without SERIOUSLY reducing the eye candy. Then I upgraded the CPU to a Q6600. I couldn't believe it: I could run the game totally maxed out with 32xAA! (SLI only.)

    Then I removed one of the GTXs and still the game sped along with 16xAA!

    The thing is, if the CPU is holding back the benchmark then you're not getting a true representation of the graphics card's potential.

    CPUs can always be upgraded.

    I have to admit I'll be reading your hardware reviews with a pinch of salt until you can figure out a way of getting rid of CPU bottlenecks!

    I wondered why many other websites that I read (which are every bit as methodical as this one, possibly even more so given that they only do GPU testing and therefore have much more time to devote to the process, and which also use a lot more game benchmarks than you) put the GTX 480 ahead of even the mighty 5970.

    Now I know why. -1, bit-tech.

    (8 or 9 benches would be acceptable; 4 is frankly poor.)
     
  10. thehippoz

    thehippoz What's a Dremel?

    Joined:
    19 Dec 2008
    Posts:
    5,780
    Likes Received:
    174
    Because they're smoking crack... in Huang's crack.

    Bottlenecks may be more of an issue with these newer cards, but 7800s? XD What are you running for res? Let me guess: 1024x768.
     
  11. Baz

    Baz I work for Corsair

    Joined:
    13 Jan 2005
    Posts:
    1,810
    Likes Received:
    92
    /groan

    Very few of our benchmarks are CPU limited, and the decision not to run overclocked systems has almost no bearing on the results we present. Look at the 1,280 x 1,024 numbers in Stalker, for example, or Dirt 2, or Crysis. The only ones that are noticeably affected are Dawn of War 2 at the lower resolutions, and that's only because it's an insanely CPU-intensive game when things get going. Even then, at high resolutions (1,920 x 1,200) none of our tests are notably CPU limited.
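    To illustrate the reasoning (a rough sketch with made-up numbers, not our actual test data or tooling): if the average frame rate barely changes when you drop the resolution, the test is CPU limited; if it scales with resolution, the GPU is doing the limiting.

        # Rough sketch: judge CPU vs GPU limitation by comparing average fps
        # at a low and a high resolution. The figures below are placeholders.
        def bottleneck(fps_low_res, fps_high_res, tolerance=0.10):
            # If fps barely drops at the higher resolution, the GPU isn't the limit.
            if fps_high_res >= fps_low_res * (1 - tolerance):
                return "CPU limited (resolution makes little difference)"
            return "GPU limited (fps scales with resolution)"

        print(bottleneck(62, 60))   # a Dawn of War 2 style result -> CPU limited
        print(bottleneck(85, 41))   # a Crysis style result -> GPU limited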

    The decision not to overclock is because (1) we have three of these systems and setting up identical overclocks on all three is a pain. Could we do it? Sure, we've done it before. It's just that when things go wrong it becomes a major headache. (2) Running stock gives directly comparable results to what you'd see at home without any overclocking skills on high-end hardware. (3) When a clock wobbles, or memory fails, or you change a GPU the wrong way, or you unplug the system at the mains, or for any other of a dozen reasons, you need to re-set up the overclock and reset half the BIOS settings. Things can get missed, the overclock doesn't get set properly, whatever. As we have MULTIPLE test rigs running, this then throws the whole lot of test data out the window as you become unsure which system is set up right, when it went wrong, and to what extent. In the long term, it's just a long list of headaches for bugger all real useful gain. I'll stress it's not because we can't; it's because it's a colossal pain in the ass for no real gain when it comes to results accuracy. We chose our tests because they ARE GPU limited - it's why we dropped Fallout.

    We know we test with fewer games than other sites. This is because no one bothered to read the results we produced when we tested 8+ games. With five games we get a very good idea of how the cards perform across a range of engines and situations. Do you really need to know how this card will run two-year-old games like World in Conflict or Oblivion? No, because of course it will crush those titles. You want to know how it handles new games, DX11 games and Crysis, which is how we tested.

    I'd imagine any test where the HD 5970 is behind the GTX 480 is either DX9 (where the GTX 480 is very quick) or affected by a CrossFire driver bug. Certainly in our tests the GTX 480 doesn't come close.

    Of course, we're just one voice here on the net. If you want to see how the GTX 480 handles two-year-old games no one plays, run on an LN2-cooled six-core Gulftown, I'm sure there are sites out there to cater to your tastes. We prefer to keep our results relevant and useful, and I stand by my numbers as such.
     