Dude. Like. I just need to get this off my chest. I have seen this happen to several customers of mine over the years, and it's happened to me personally on two different machines. It goes like this, see if this sounds familiar:

- Do a build update
- Congratulations! Your resolution options are now 4:3 only, capping out at 1600×1200
- Check the driver date: Windows Update has crammed a recent one in
- Roll back to the tatty old driver from c.2015
- Everything works fine again

This has happened SO MANY TIMES. Why am I, a really basic semi-educated IT enthusiast, better able than Microsoft to guess at a glance which of two drivers is likely to be stable? Do they even test them before blindly going "lol its newer" and hitting rollout?

Granted, this bench rig is old as balls and the VGA is an HD 5570, but this has happened to a customer with a GTX 1000 series card too, and someone with a very contemporary gaming laptop. Several gaming laptops actually; they seem to be the worst for it, because they often need the OEM-curated variant of the driver to work with all the other OEM software and hybrid VGA switching, not the standard Nvidia/AMD driver, but Microsoft just wang the standard driver in anyway, breaking everything.

The common theme is Microsoft don't know wtf they're doing, and I desperately wish there was a little toggle in the Windows Update settings that said "do updates, but leave my ****ing display adapter drivers alone thanks". /rant
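For what it's worth, the closest thing I know of to that toggle is the group policy "Do not include drivers with Windows Updates". Here's a minimal sketch of setting its backing registry value with Python's winreg, run elevated; I'm assuming your Windows edition honours the policy key, and note it's all-or-nothing for driver packages, not a per-device (or display-adapter-only) switch:

```python
import winreg

# Policy key behind the Group Policy setting
# "Do not include drivers with Windows Updates".
# ExcludeWUDriversInQualityUpdate = 1 tells Windows Update
# to skip driver packages when it installs quality updates.
KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "ExcludeWUDriversInQualityUpdate", 0,
                      winreg.REG_DWORD, 1)

print("Driver updates excluded from Windows Update (applies on the next scan).")
```

Delete the value (or set it to 0) to go back to the default behaviour.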
I thought they were in optional updates? Come to think of it, IIRC I selected an option not to be notified about graphics drivers, or is that something my addled brain just made up?
I don't remember the exact setting and I'm on the dog and bone, but I have optional updates blocked, so I just get notified.
[frantically searches in settings] Edit: all I've got is this, does this include drivers for third-party hardware?