I'm just a noob, so I have plenty of OC potential, and I've read it, but I'm one of those people who doesn't like to mess with things until someone shows me or does it for me the first time, then supervises me while I attempt it myself...
Yeah, well, you can't really hurt anything with a small OC. It's when you start messing with the voltages on the video cards that you can toast one, I should add. If you run or plan on running SLI/CrossFire, the CPU plays a much bigger role in your performance (and your rig becomes a money pit - but you can show off your 30" monitor!).
Just for the record, Harry and I don't rant and rave every time we review a graphics card! I think it depends on your particular setup, to be honest. I overclock every bit of hardware in my system at home that can be overclocked, just to see if it yields any kind of boost. If it doesn't, I'll leave it at stock (actually, that's probably a lie - as my system's water-cooled, I tend to max everything out anyway!).

To be honest, overclocking is a hobby for some and a means to an end for others. With graphics cards this is even more subjective, as the gains depend on the resolution you play at and the games you play. As thehippoz mentioned, you'll only see noticeable gains if your GPU is the bottleneck. In games like Flight Simulator X that's almost certainly not the case, but in Crysis we regularly see 10-20% increases with cards that overclock well, especially on our Core i7 test rigs.

I'd say the main benefit of water-cooling your card is noise reduction. If you overclock it, it will produce more heat, so the fan spins up earlier and quickly sounds like a hairdryer. At least with water-cooling, you greatly reduce the chance that heat is the cause of any instability when overclocking.
I understand that if the CPU is the bottleneck of the system you'd get no gain from overclocking the GPU - that's fairly obvious. Having said that, surely in gaming the bottleneck would never really be the CPU? I mean, if you're getting 150 FPS, you don't need to be overclocking anything. Can someone explain more quantitatively why the CPU could be the bottleneck? It would also be great to get some hard FPS increases or benchmark scores from before and after a GPU overclock - what increase in average framerate are we likely to see? 5? 10? 20? And is heat normally the limiting factor for a GPU overclock, or is it one of the clock speeds?
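Not a benchmark, but here's a toy model of the bottleneck question (all the millisecond numbers are made up for illustration). If you treat each frame as a CPU stage and a GPU stage running in a pipeline, frame time is roughly whichever stage is slower - so overclocking the faster stage buys you nothing:

```python
# Toy model: with good pipelining, frame time is roughly the
# slower of the CPU stage and the GPU stage.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical numbers: CPU needs 10 ms/frame, GPU needs 8 ms/frame,
# so the game is CPU-bound at these settings.
base = fps(10.0, 8.0)            # 100 FPS
gpu_oc = fps(10.0, 8.0 / 1.10)   # 10% GPU overclock: still 100 FPS
cpu_oc = fps(10.0 / 1.10, 8.0)   # 10% CPU overclock: ~110 FPS

print(base, gpu_oc, cpu_oc)
```

Flip the two numbers (GPU slower than CPU, as at high resolutions) and the GPU overclock is the one that pays off. That's the whole CPU-bound vs GPU-bound argument in three lines.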
Normally, overclocking a graphics card even 100MHz above, say, 600MHz won't give you any noticeable performance gain - just like overclocking a dual core from 2.5GHz to 3GHz gives no noticeable gain. As thehippoz said, you only get a good gain when you start messing with voltage, but with the water blocks and the effort involved, it's better to just buy a faster graphics card. I used to overclock my 8800GTX all the time, but due to the lack of performance gain (1 or 2 FPS in Crysis, not noticeable) I stopped - although overclocking the shader clock does help Folding a lot. IMHO, only overclock the GPU (without a volt-mod) if the difference is very noticeable; otherwise buy a better card.
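One reason a 100MHz core bump feels like nothing: only the part of the frame that is actually core-limited speeds up, while memory-bound and CPU-bound work doesn't move. A rough Amdahl-style estimate (the 50% core-limited fraction is an assumption, not a measured number):

```python
# Amdahl-style estimate: only the core-limited fraction of frame time
# scales with a core overclock; the rest (memory, CPU) stays put.
def speedup(core_fraction, clock_ratio):
    return 1.0 / ((1.0 - core_fraction) + core_fraction / clock_ratio)

# Hypothetical: 50% of frame time is core-limited, 600 -> 700 MHz core.
est = speedup(0.5, 700 / 600)
print(est)  # ~1.08, i.e. roughly 8% - not the ~17% the clock bump suggests
```

That rough 8% on a 60 FPS baseline is about 4-5 FPS, which matches the "barely noticeable" experience people report from core-only overclocks.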
FSX is a good example - my last CPU was an Intel E8400 which I'd overclocked to 4.3GHz. Playing FSX was practically like running a stress test such as Prime95: both cores regularly at 100% load, with heat output as if I were benchmarking the CPU. The game is fairly graphically demanding but leans more heavily on the CPU for calculations such as how the world around you changes, texture decompression and the like. For this reason you barely see a performance boost moving from an 8800GTS to a GTX 280, and SLI and CrossFire usually make it run slower. As such, for a flight sim PC the first consideration is the CPU, whereas for your typical gaming PC it's the GPU.
I have clock settings for stock, overclocked and underclocked set up in RivaTuner and bound to keys, so I can switch clock speeds on the fly: on the desktop I underclock (300/400) to save power, in older games I run stock (602/1100), and I knock it up to the overclock (652/1200) when I'm playing something more demanding. Having this set up shows me the direct effect of clock speeds on certain games, as I quite often forget to bump up my clocks when starting a game. Even at speeds as low as 300/400, a GTX 280 performs admirably in L4D and other Source games - on occasion I've played for 20 minutes without realising I hadn't turned up my clocks. Crysis drops a lot of frames, but nowhere near as many as you might think (7-13). So basically, overclocking can give you a boost in certain applications, but core design, memory interface etc. are always going to win out in the end.
I noticed a large difference between 3.0GHz and 3.5GHz on my E8400. Overclocking from 3.0 to 3.1 or 3.2 might not yield much, but 500MHz is a ~16% increase. I run my E8400 at ~4.2GHz (4146MHz) 24/7 and there is a massive difference between stock and this. My video card is also overclocked - stock: 576/1243/1000, overclocked: 720/1440/1250 - and there is a significant improvement in my performance: my GTX 260 Core 216 is now practically the same as a GTX 280, if not slightly faster in some apps. I do run my games at 1680x1050, though - the low-res argument does stand, and overclocking a video card when you're CPU-bound is silly.
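A quick sanity check on the percentages from the clocks quoted above (3.0GHz -> 3.5GHz on the CPU, 576MHz -> 720MHz on the GTX 260 core):

```python
# Percentage clock increase from the stock/overclocked pairs above.
def pct_gain(stock, oc):
    return 100.0 * (oc - stock) / stock

print(pct_gain(3000, 3500))  # ~16.7% CPU clock bump
print(pct_gain(576, 720))    # 25.0% GPU core clock bump
```

So the GPU overclock here is actually a bigger relative jump than the CPU one - which fits the "practically a GTX 280 now" result, since the framerate gain from clocks alone can't exceed the clock gain and is usually well under it.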