Hi all,

Is my GTX 570 randomly turning itself off? I can be doing anything from gaming to watching YouTube and my system just decides to turn off the video output. I can hear that everything is still running, as there's sound; most games appear to keep going, since the audio carries on.

I've tried: undoing all overclocks, reducing my GPU to minimum speeds, reinstalling and clean-installing all drivers, and alternating memory modules and slots. All the power cables meter out within a reasonable degree of accuracy.

My current spec, everything at stock speeds, with memory frequency and timings manually set:
Q9550 @ 2.83GHz, Corsair H100 with upgraded Corsair fans
8GB Corsair Vengeance LP PC-10700 9-9-9-24 2T
Asus Striker 2 Extreme, BIOS 1402
EVGA GTX 570, Core 732MHz, Shader 1464MHz, Memory 1900MHz, stock cooler
Antec Quattro 850W

Doubt it's this, but I think I've got bragging rights for having a Corsair K70 RGB Rapidfire.

Any help would be greatly appreciated.

Thanks
Ben
There's one thing you haven't considered - the display itself. My U3014 loses the display signal typically a few times a day and it bugs the hell out of me, but it usually only happens when I start the PC or when I load up a game. I can resolve the problem by turning the display off and on again, but it's still a PITA. What display are you using, how old is it, and is it under warranty? It's definitely a possibility.
I'm using a BenQ G2220HD; I'll try that the next time it happens. Nothing in my current build is under warranty except my RAM, keyboard, and mouse. Cheers for the reply... any other suggestions?
Unfortunately no joy with it being the monitor. It happened about 20 minutes ago, and switching between inputs and power-cycling the monitor didn't make any difference...
Check Event Viewer > Windows Logs > System. Look out for any errors or warnings related to the display, drivers, or power.
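If you'd rather pull that from the command line than click through Event Viewer, something like this should work from an admin command prompt (a sketch, assuming a stock Windows install where `wevtutil` is on the PATH; the XPath filter grabs Critical/Error/Warning entries):

```shell
:: Dump the 20 most recent Critical (1), Error (2) and Warning (3)
:: events from the System log, newest first, as readable text.
wevtutil qe System /q:"*[System[(Level=1 or Level=2 or Level=3)]]" /c:20 /rd:true /f:text
```

Look for entries timestamped around when the display drops, particularly anything from the `nvlddmkm` (Nvidia display driver) or `Kernel-Power` sources.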
I've done this and run WhoCrashed... I can't pinpoint anything that would be suspect, but I'll double-check again when it next happens...
Here's a list of info... is there anything you'd like me to expand on? Oh, and can you tell what time my system crashed? Cheers Ben
I shall get that info for you; however, I'm away from that computer for a bit, so I'll do it as soon as I've got back home. It's weird, because the system crashed about 3 minutes before that error was logged.
Is the monitor connected to the graphics card using a DisplayPort cable? If so, trying a different DP cable is the first thing I'd do.

When you turn off a DP monitor, Nvidia GPUs treat it as having been unplugged; you'll get the "device unplugged" notification sound from Windows and everything. I have had huge problems with this on my workstation, which I use Teamviewer to remotely access on a daily basis. One of the monitors is DVI, the other two 4K panels are connected via DP. I have the resolution set to max and the DPI set to 200% in Windows 7 to make things readable, but when the monitors are turned off, the computer thinks that only the 24" panel connected via DVI is connected, and the resolution automatically changes until the other monitors are switched on again. So when I remote into that machine with Teamviewer whilst the monitors are off, I'm not connecting to a triple monitor system, I'm connecting to a single monitor system with a low resolution and a 200% DPI desktop, so I have to faff about and change settings before I can actually use it remotely, then change them all back again when I'm actually in the office sitting in front of it.

It's a ridiculous problem that you will find countless threads about on Nvidia, Superuser, Microsoft and many other forums, dating back several years. Nvidia have done absolutely nothing about it because DP is a PNP port and they are too lazy to change the default behaviour for a PNP port in their drivers or software.

TL;DR: Check your DP cable and ports.
Thanks for that info. Nope, no DisplayPort for me; my gfx card has a mini HDMI and two DVI sockets, and my monitor has a DVI and a 15-pin VGA.
Though could it be DisplayPort internally? I don't know how the extra ports are implemented on this card. Could anyone weigh in? On other cards they tend to add additional ports by internally adapting DP channels into other outputs (DVI etc.).
So my system has crashed yet again, without an error in the system logs. I'm definitely leaning towards a loss of power to the gfx card. How can I best record the voltage and current drawn by the graphics card, physically?
Probably the easiest and best way is to grab another card to test with, if you can. If it crashes with another card too, that gives you an idea of where to look instead.
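In the meantime, if you want a software-side log to compare against your meter readings, you could poll `nvidia-smi` for the reported power draw. A rough sketch (assumes `nvidia-smi` is on the PATH; note that many Fermi-era cards like the GTX 570 report `[N/A]` for power draw, so this may tell you nothing on that card and is more useful with the borrowed GTX 780):

```python
import subprocess
import time

def parse_power_draw(csv_output):
    """Parse watts out of `nvidia-smi --query-gpu=power.draw --format=csv`
    output, e.g. "power.draw [W]\n85.30 W\n" -> 85.3.
    Returns None if the driver reports [N/A] for this GPU."""
    reading = csv_output.strip().splitlines()[-1].strip()
    if "N/A" in reading:
        return None
    return float(reading.split()[0])

def log_power(interval_s=1.0, samples=60):
    """Print a timestamped power reading every interval_s seconds."""
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=power.draw", "--format=csv"],
            capture_output=True, text=True, check=True,
        ).stdout
        print(time.strftime("%H:%M:%S"), parse_power_draw(out))
        time.sleep(interval_s)

# Demonstrate the parsing step on canned output (no GPU needed):
print(parse_power_draw("power.draw [W]\n85.30 W\n"))
```

Leave `log_power()` running in a console while you use the PC; a gap or drop in the log right before the screen goes blank would point at the card or its power delivery.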
Okies, so I grabbed my cousin's old card, an Asus GTX 780. It's installed, so I just need to use my system for a couple of days and see if the display goes blank. Who knows? lol Ben
Well, it's done it again... no errors to see, and the Asus card has voltage LEDs which don't go off... so I'm now thinking it's my motherboard...