I have an AOC screen, a cheapie, but it has a resolution of 1920x1080. I connected it to my Samsung Blu-ray player tonight and the system defaults to 480p. When I manually set it to 1080p, the screen says "no signal" and the player resets to 480p. Is this something to do with signal loss at the HDMI-to-DVI-D adapter, or a DRM issue? The player also has HD component out and what I believe to be SD AV out. Since I HAVE the player but not a dedicated TV, I figured it would be simplest just to connect it to my screen, and I'd rather not have to get a drive for my PC if possible. Anyone have any thoughts? If not, I'm going to call Samsung and AOC tomorrow and find out what might be going on.
So the screen has no HDMI input? I had a similar issue at work using VGA-to-DVI adapters, and I had to use the NVIDIA Control Panel to force an output of 1920x1080...
That's right, DVI but no HDMI on the screen. There's a manual setting on the player but it just reverts after a few seconds. Le sadface.
I would imagine the problem is a DRM one. Is your monitor HDCP-compliant? I'm running an adapter cable from the HDMI port on my HTPC to the DVI port on my projector at 720p and it works well.
I have had a similar problem for a while now; I've currently given up. I've been researching this for a while, and this is some of what I've picked up, though it's not necessarily entirely correct. HDMI and DVI use the same signal type (TMDS, I believe), but only some of the frequencies/resolutions overlap between what HDMI sources put out and what single-link DVI ports are required to support; dual-link ports don't have the same compatibility constraints. So DVI-to-HDMI is compatible, but HDMI-to-DVI isn't necessarily. There are quite a few discussions of this issue, particularly with PS3s and many (all?) Sony products in general, if you look around, regardless of whether the screen has HDCP.
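For what it's worth, raw bandwidth shouldn't be the blocker at 1080p60: a quick back-of-envelope check using the standard CTA-861 timing (2200 x 1125 total pixels per frame, including blanking) shows the pixel clock sits under single-link DVI's ~165 MHz TMDS limit. That points back at HDCP/handshake issues rather than the link itself. A minimal sketch of the arithmetic:

```python
# Check whether 1080p60 fits within single-link DVI's ~165 MHz
# TMDS pixel-clock limit, using the standard CTA-861 timing
# (2200 x 1125 total pixels per frame, blanking included).

H_TOTAL = 2200   # active 1920 + horizontal blanking
V_TOTAL = 1125   # active 1080 + vertical blanking
REFRESH = 60     # Hz

pixel_clock_mhz = H_TOTAL * V_TOTAL * REFRESH / 1_000_000
SINGLE_LINK_DVI_LIMIT_MHZ = 165

print(f"1080p60 pixel clock: {pixel_clock_mhz} MHz")
print("Fits single-link DVI:", pixel_clock_mhz <= SINGLE_LINK_DVI_LIMIT_MHZ)
# -> 148.5 MHz, which fits
```

So a single-link HDMI-to-DVI adapter is electrically capable of 1080p60; if the monitor isn't HDCP-compliant, though, the player will refuse to send protected HD content over it and fall back to 480p.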