Hi all, hopefully someone can shed some light on this issue.

We currently have a computer serving 9 monitors over 5 graphics cards. 8 of these are 1080p native resolution and work fine on that machine; the remaining single monitor works fine as it is. We upgraded the machine because it lacked RAM and power and kept running slowly with the information we display on it. When we plugged the screens into the new machine via adaptors, they would only display at 1600x1200 and I can't work out why.

Breakdown of the screens:
- 6 monitors are VGA
- 2 monitors are HDMI

New graphics cards:
- AMD Radeon Pro WX 2100
- AMD Radeon Pro WX 5100 x 2

The old server has converters from the graphics cards to DVI, with VGA adaptors onto the cables; the 2 HDMI cables are converted to DVI. The new server's outputs are all DisplayPort, so we've got DP-to-VGA adaptors for the 6 VGA monitors and DP-to-DVI for the other 2 screens. If we plug one of the monitors from our desks into a DP-to-VGA adaptor we get 1080p straight away, which to me shows the adaptors themselves are not the problem.

I've tried updating the drivers, and we've also tried Custom Resolution Utility (CRU), which worked on 1 screen, but once we added more screens they all started flashing to the point that the machine was unresponsive.

My train of thought is that the 8 monitors aren't talking to the new graphics drivers correctly and so won't report a native resolution of 1080p. The only other thing I can think of is re-cabling the screens to use HDMI, which I was hoping to avoid as it means pulling them off the wall. Any ideas on what else to try?
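The next thing I was planning to try is dumping the EDID blocks Windows has cached for each monitor, to check whether a 1080p native timing is even making it through the DP-to-VGA adaptors (passive adaptors sometimes don't pass DDC/EDID cleanly, and an odd fixed resolution can be a sign of that). This is just a rough sketch, assuming Windows and Python; it reads the cached EDIDs from the registry and decodes the first detailed timing descriptor:

```python
import winreg

# Windows caches an EDID block for every monitor it has seen under this key.
BASE = r"SYSTEM\CurrentControlSet\Enum\DISPLAY"

def subkeys(key):
    # Yield the names of all subkeys of an open registry key.
    i = 0
    while True:
        try:
            yield winreg.EnumKey(key, i)
            i += 1
        except OSError:
            return

def native_res(edid):
    # The first 18-byte detailed timing descriptor starts at byte 54 (EDID 1.3)
    # and normally carries the panel's preferred (native) timing.
    dtd = edid[54:72]
    h = dtd[2] | ((dtd[4] & 0xF0) << 4)
    v = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return h, v

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, BASE) as display:
    for monitor in subkeys(display):
        with winreg.OpenKey(display, monitor) as mon_key:
            for instance in subkeys(mon_key):
                try:
                    path = instance + r"\Device Parameters"
                    with winreg.OpenKey(mon_key, path) as params:
                        edid, _ = winreg.QueryValueEx(params, "EDID")
                        h, v = native_res(edid)
                        print(f"{monitor} {instance}: native {h}x{v}")
                except OSError:
                    # No EDID cached for this instance (e.g. the adaptor
                    # isn't passing DDC data through to the card).
                    pass
```

Bear in mind the registry keeps entries for monitors that were connected in the past, so old displays may show up in the list too. But if the wall screens either don't appear with a current entry or don't show a 1920x1080 preferred timing, that would point at EDID not surviving the adaptors rather than at the drivers themselves.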