I have heard you guys talking about the setting for hertz (Hz). What do the different settings do performance-wise? Do different frequencies use more/less power, and is it drawn from the PSU or the wall? Any info is appreciated.
The higher the refresh rate, the less flicker you get on the image. A higher refresh rate won't affect the speed of your graphics card, only how steady the image looks on your monitor. Running at 60Hz you may see a flicker in the screen, and this will also make your eyes tired. 75Hz is the "normal" refresh rate. Images on most monitors are sharpest at this rate, and you will not get fatigued by the flicker (as fast). 100Hz is pretty high; some higher-quality monitors can run this fast with a razor-sharp image, while others will start to get fuzzy. At that rate you can stare at the screen all day without any problems. As far as power goes, the difference is in single watts (like 80 vs. 85 watts), and since the monitor has its own plug, that comes from the wall, not your PSU.
Yes, people generally run it as high as is supported at the current resolution, as long as the picture stays sharp. One of my monitors reckons it will do 1024x768 @ 100Hz, but the picture is all over the place, so I don't run it that high. Monitors will usually run at higher frequencies at lower resolutions, so if you are running at a high frequency and you increase the resolution past what your monitor will support at the current refresh rate, Windows should automatically lower the refresh rate to what the monitor will support.
I have a dual-monitor setup and both are at 85Hz. The thing is that if you use a low refresh rate and look closely at the screen, you'll see the flicker... personally, I don't see why anyone would use a low refresh rate... unless they absolutely want to use the highest resolution...
So does performance or image quality suffer at high resolutions when the refresh rate is really high?
Like Zap said... the refresh rate doesn't affect the performance of your graphics card. You can't get high resolutions with a high refresh rate, though: the smaller the resolution, the higher the refresh rate the monitor can manage, and the bigger the resolution, the lower the maximum refresh rate. Refresh rate is a display setting, not extra work for the card, so it doesn't count towards performance... but at a high refresh rate the image can be blurred or messed up... depends on the monitor...
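The tradeoff described above comes down to the monitor's maximum horizontal scan frequency being a fixed budget: more vertical lines per frame means fewer frames per second. Here's a rough sketch of the arithmetic; the 96 kHz scan limit and ~5% vertical-blanking overhead are illustrative guesses, not figures for any monitor in this thread.

```python
# Why higher resolutions force lower refresh rates on a CRT:
# max vertical refresh ≈ max horizontal scan rate / total scanlines per frame.

MAX_HSCAN_KHZ = 96.0     # hypothetical monitor limit (check your spec sheet)
VBLANK_OVERHEAD = 1.05   # ~5% extra scanlines for vertical blanking (rough guess)

def max_refresh_hz(vertical_lines: int) -> float:
    """Highest vertical refresh the horizontal scan budget allows."""
    total_lines = vertical_lines * VBLANK_OVERHEAD
    return MAX_HSCAN_KHZ * 1000 / total_lines

for width, height in [(1024, 768), (1152, 864), (1280, 1024), (1600, 1200)]:
    print(f"{width}x{height}: up to ~{max_refresh_hz(height):.0f} Hz")
```

With these assumed numbers you get roughly 119 Hz at 1024x768 but only about 76 Hz at 1600x1200, which matches the pattern people are seeing: the same monitor that flies at a low resolution runs out of scan rate at a high one.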
My monitor will apparently do 85hz and it's great at that refresh, but after a while it starts making a whining noise that gets louder and louder, so I don't want to run it like that for too long.
My monitor does 1024x768 @ 85Hz happily. However, I'm a big fan of 1152x864, and it only officially supports that resolution at about 72Hz, which I can't work with. So I used Powerstrip to overclock my monitor to 81Hz at that resolution, and it's been running fine like this for months, though I am still worried about long-term damage. Anyone got any ideas? It's a 17" Tatung something.
Gutted, that doesn't sound good! Both my monitors happily do 1280x1024 @ 85Hz (or 1600x1200 @ 85Hz if I wanted) without any problems...
My 17" (Iiyama) monitors do 1152x864 @ 100Hz extremely well, and now that I'm so used to 100Hz, anything below that I find difficult to work with.
I agree! Got my Viewsonic 19" P95f running at the same resolution and frequency. Every frequency below 100Hz just seems wrong to my eyes.
Both mine are running at 1600x1200 @ 85Hz. I can't use anything lower than 75Hz, as it hurts my eyes really quickly and I start to get headaches.