Discussion in 'Article Discussion' started by Gareth Halfacree, 13 Feb 2013.
Includes new 'Advanced' edition.
Looks interesting, will run it tonight to see if there's anything different in terms of scores.
It's a little hard to compare results, as the settings are slightly different from v3. However, it seems a fair bit tougher. On my rig (in sig) with everything at stock, and with the settings matched as closely as I could, I got:
1119 points, 21.3 min / 113 max / 44.4 avg fps on v3
890 points, 7.5 min / 85.7 max / 35.3 avg fps on v4
There are a couple of places in v4 where the fps bombs for a fraction of a second. Will run again and confirm where.
EDIT: Did two further runs; both had a small dip at the very beginning of scene 19 (on my first run it happened halfway through scene 18, then again at the start of 19), but nowhere near as dramatic as in my first run. Run No. 2 went from 21.2 to 17.4 fps and run No. 3 went from 21.3 to 17.1. The first run must have been an anomaly.
If there's anything that could blue-screen an overclocked GPU that was stable in Windows, this is it.
I found v3 most effective for gaming stability when clocking the sheer nuts off a Core 2 Duo E6600 and GTX 285 - managed to get +80MHz on the GPU core stable with circa +230MHz on the memory while running Unigine benchies.
I originally set up my graphics card OC using Heaven, which hit a max temp of 75°C-ish, even on the new version. However, Far Cry 3 has been burning a hole in my graphics card, hitting temps of 90°C+, but with no crashes in-game. If this is a stress test/benchmark, why isn't it getting my card to its max temp?
What prog can I use to find the max temp so I can clock my graphics card down a bit? I've heard my card (GTX 580) throttles itself with FurMark, so you don't get a realistic result.
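For what it's worth, on NVIDIA cards you can log temperatures yourself during a run with the driver's bundled nvidia-smi utility, rather than relying on the benchmark to report them. A minimal sketch (the polling loop and helper names here are illustrative, not from any post in the thread; the query flags are standard nvidia-smi options):

```python
# Sketch: track peak GPU temperature during a benchmark run by polling
# nvidia-smi (assumes an NVIDIA card with the driver's nvidia-smi installed).
import subprocess
import time

def parse_temps(output):
    """Parse nvidia-smi CSV output: one temperature (°C) per line, per GPU."""
    return [int(line) for line in output.strip().splitlines()]

def read_gpu_temps():
    """Query current GPU core temperature(s) in degrees C."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"])
    return parse_temps(out.decode())

def log_max_temp(duration_s=300, interval_s=1.0):
    """Poll for duration_s seconds and return the hottest reading seen."""
    max_temp = 0
    end = time.time() + duration_s
    while time.time() < end:
        max_temp = max(max_temp, *read_gpu_temps())
        time.sleep(interval_s)
    return max_temp
```

Start it just before launching the benchmark or game and it returns the peak temperature hit during the run, which you can then compare against what Heaven or Far Cry 3 reaches.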