So I have recently built my first rig, woohoo, and I am reusing my old monitor. It's an AOC 24", and it only has a VGA input (despite being 1920x1080, which I find peculiar). I'm connecting the monitor to my Radeon 7850 using a DVI-to-VGA adapter, and it seems to work fine. However, there is a constant stutter on the monitor. At least, I believe it is stutter - it looks like the lines you get on camera when you try to record a TV screen. It is less noticeable while browsing the web, but it is still there - and it is definitely noticeable in games. Now, before I weep because I can't afford a new monitor, is there a fix for this?
Have you checked that your cables are secure? Can you try your PC with a different monitor, or your monitor with a different PC? That should help narrow down the problem.
The VGA port wasn't securely fastened in at the back of the monitor, doh. I'll test out a game, see if the stutter is still there.
Nope, it's still there. I can't test my computer on another monitor; all I have is this and an HDTV.
The problem is your adapter. Its exposed metal pins pick up interference. A better-shielded VGA cable could help by picking up less interference along the wire, or a different adapter, maybe with gold-plated connectors, but the problem will still persist. The real solution is to either:
1- Lower your screen resolution, as 1920x1080 is pushing the limits of VGA, which is exactly what you're seeing.
2- Move the setup somewhere with less interference.
3- Or, best of all, get a monitor with a digital input.
VGA-only LCDs should never have existed. I'm surprised you bought one. An LCD is digital. What happened before was: the graphics card converted the digital signal to analogue, sent that high-bandwidth analogue signal down the VGA wire (which picks up all sorts of interference), and the monitor converted the analogue signal back to digital for processing. This was fine for CRT monitors, because resolutions at the time were low, and a high-resolution monitor usually came with a super-shielded, thick VGA cable. And a CRT is fully analogue, so you only convert the signal once, which means higher quality than converting it twice.
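To put a rough number on why 1080p pushes VGA so hard, here's a quick back-of-the-envelope sketch of the analogue pixel clock the cable has to carry. The blanking totals (2200 x 1125) are the standard CEA-861 timings for 1920x1080 @ 60 Hz, so treat this as a ballpark figure, not something specific to this particular monitor:

```python
# Rough estimate of the analogue pixel clock a VGA cable must carry at 1080p60.
# 2200 x 1125 is the total frame size including blanking intervals, per the
# standard CEA-861 timing for 1920x1080 @ 60 Hz (an assumption, not measured).
h_total, v_total, refresh_hz = 2200, 1125, 60

pixel_clock_hz = h_total * v_total * refresh_hz
print(f"Pixel clock: {pixel_clock_hz / 1e6:.1f} MHz")  # ~148.5 MHz
```

That's roughly 148.5 MHz of analogue signal per colour channel, which is why even a little noise picked up by an unshielded adapter becomes visible on screen.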
Thanks for the advice. I got the 24" because it was a measly £40, and a massive upgrade from my old 17" 1024x768. At some point I'll invest in a DVI monitor, but that won't be until around next year...
Yup, that was a better connector than VGA, mainly because the wires were more separated than in VGA, and each wire is well shielded. However, that connector was never widespread in the consumer market, and video cards with it were rare, unless you used a VGA-to-BNC converter cable.
Could it be a GPU issue? The monitor worked on my laptop - though there was no adapter involved, and the resolution was much lower - 1366x768.
No... well, OK, it COULD be, but on the laptop you removed the causes mentioned above:
#1 - You lowered the screen resolution.
#2 - You didn't use an adapter, so fewer unshielded metal parts were exposed to pick up interference.
#3 - Your VGA cable was away from the other electric wires behind the desk, which drops the interference level further.
So while it could be your graphics card, we can't know for sure because of the above points.