Hey, just a quick question: HDMI or DVI? I'd like to connect my current rig to the TV and play films on it. Which is better? The TV is a Samsung 6 Series 40". Also, does sound travel through a VGA>HDMI adaptor? Cheers
If your graphics card has HDMI out, I would go for that, as it carries the sound as well. You won't get any sound through VGA even if you use adapters.
Both are visually identical, but HDMI also carries sound (i.e. the extra data is audio, not video, so picture quality is the same). So if the TV doesn't have speakers, it'll make no difference. While you're at it, you could look into the world of enthusiast cables
Well, as my card doesn't have HDMI out and the VGA-to-HDMI adaptor doesn't support sound, I reckon I'll be using VGA to DVI, and for sound I'll use an audio cable. Why does the GPU have a sound input but not output it?
You cannot get from VGA (an analog signal) to HDMI (a digital signal) with a simple adaptor; it would have to be a scan converter to change the signal from analog to digital or vice versa, and that's an active device, not passive like a DVI-to-VGA adaptor. Graphics cards with HDMI out mostly support passing audio via the HDMI port, but only if you have connected the digital out of the sound card to the digital in on the graphics card. You also ask which is better between DVI and HDMI: for image quality there is ultimately no difference, as HDMI is simply made up of a DVI signal plus digital audio.
DVI ports can support HDCP + audio, I think; you just need to use a DVI-to-HDMI adapter. I've not seen a GPU with a sound input on the backplate, but if it's on the card, the input would be there so the GPU can pass audio through if, say, you had an add-on sound card (not onboard), or an older motherboard that couldn't route audio from its built-in sound chip to the GPU. Same thing as when you had to connect an audio cable from the sound card to the CD drive back in the day. The link below should help if you REALLY don't want to run audio cables as well. http://www.engadget.com/2010/02/19/atlonas-vga-to-hdmi-adapter-ditches-the-brick-does-1080p-on-us/
Sure you're not confusing the VGA-to-HDMI converter with a DVI-to-HDMI one? Because I'm pretty sure the former doesn't exist as a simple adaptor. I'd say use HDMI if you can: it's the more common AV standard, so it'd likely be more useful for future-proofing if you upgrade something down the line.
This is what I was getting at, tbh. I think some confusion may be the issue here, not necessarily the products themselves.
It's been hinted at in previous replies in this thread, but just to clarify: DVI comes in three flavours, DVI-A, DVI-D and DVI-I. DVI-A is a pure analog connection, i.e. VGA over a DVI connector; I've never seen any device with a DVI-A port. DVI-D is a pure digital connection, and no conversion from analog to digital happens; HDMI is backwards compatible with DVI-D. DVI-D is used on monitors and very rarely on graphics cards. Last but not least, DVI-I carries both the analog and the digital signal (again, with NO conversion between the two), and this is usually the type of connector found on graphics cards. Also, as mentioned before, DVI (or rather the digital part, DVI-D) can carry both audio and the HDCP handshake, which is why simple passive DVI->HDMI adapters work. This approach was used especially on ATI 3xxx and 4xxx series cards before HDMI connectors on these cards became commonplace.
If you want sound out of the TV, then use HDMI. You just need to connect the two-pin connector block you get with an HDMI-capable GFX card between the motherboard's digital/SPDIF out header and the two-pin header on the card. If you are going to bypass the TV's sound and play it through a stereo, then it doesn't matter which one you use, as the picture quality is the same.
Hey, sorry to start this again, but I'm having a few issues with the HDMI cable. Let me explain: I bought a cheap 10m HDMI cable from eBay because I didn't think it would be worth paying for an expensive branded one. However, having received the cable and plugged it into a DVI-to-HDMI dongle, I found that it does not work, while a 1.5m HDMI cable does. At first I did get images over the 10m cable, but it would then lose signal and go blank. Would I need to invest in a better quality cable? I know 10m is a long distance, but the next size down is 5m and that's not long enough (computer one side of the room, TV the other). Any suggestions? Thanks
If there's no alternative to that cable length, I suggest eBay. What sort of money are you able to spend? You can likely pick up a good second-hand bargain on there, provided you pick a half-decent brand. How much you can spend will determine what you get, but signal loss over long HDMI runs is a known issue, albeit more typical when being fussy about sound and image quality over 15m lengths.
Contrary to popular belief (why shouldn't any cable suffice, since the data transported is digital?), cable quality actually matters with HDMI. The reason is the massive amount of data being sent across the cable; over a 10m run, a poorly made cable can degrade the signal enough to drop out entirely. You might want to get something a bit higher quality, and certification for HDMI 1.3 or higher wouldn't be a bad thing either.
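Just to put a rough number on "massive amount of data", here's a quick back-of-the-envelope sketch in Python. It assumes the standard 1080p60 video timing (2200x1125 total frame including blanking, 60 Hz refresh) and HDMI's TMDS encoding (each 8-bit value sent as 10 bits over 3 data channels); these are the textbook figures, not measurements of any particular cable.

```python
# Why a marginal 10m HDMI cable can drop out: the raw bit rate is huge.
# 1080p60 total frame size, including blanking intervals:
total_width = 2200    # 1920 active + horizontal blanking
total_height = 1125   # 1080 active + vertical blanking
refresh_hz = 60

# Pixel clock: one pixel per tick, blanking included
pixel_clock = total_width * total_height * refresh_hz
print(f"Pixel clock: {pixel_clock / 1e6:.1f} MHz")

# TMDS sends 10 bits per 8-bit value, across 3 data channels
bits_per_second = pixel_clock * 10 * 3
print(f"Raw TMDS data rate: {bits_per_second / 1e9:.3f} Gbit/s")
```

So even plain 1080p60 pushes roughly 4.5 Gbit/s down the cable, which is why marginal cables that "mostly work" at 1.5m fall over at 10m.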
Before you spend anything: I had the same problem with my PC when I first switched to HDMI. I knew it couldn't be the cable, because I had bought a very high-spec stainless-braided type, so I checked the other things it might be. It turned out the rib on the back of the expansion slots was too close to the GFX card's HDMI output and was not letting the cable connect properly. It was OK if I pushed it in hard, but the metal rib would slowly push the cable out until it lost the picture.
Thanks for the help guys. I know it's not the connection on the back of the GPU, because a shorter cable works fine. I think I'll just have to rearrange my room so the TV is closer to the computer; I can't afford to spend any more money. I've also tried the 10m cable on my DVD player and that doesn't get a signal either, so I've emailed the seller asking if it can be returned and refunded. Does make me wonder why they make cables that long which don't work! Anyways, cheers again