Discussion in 'Hardware' started by Yslen, 13 Feb 2011.
I certainly can't.
Any attempt to remove noise will also remove other, similar frequencies, resulting in less-than-perfect sound, so it's best not to have the noise there in the first place.
Just as an example, the difference between a guitar cable and a speaker lead is mostly the shielding to stop unwanted noise. The guitar cable comes before the amplification, so if you were to use an unshielded cable you'd get nasty crackles every time you moved it, which would be amplified by whatever it was plugged into.
Speaker cables don't need this shielding as the signal is already amplified and you won't notice the unamplified crackles.
So once the data has left the device, if it's being sent through non-optical cables then there is always a chance that enough interference can change the signal being sent.
But not in digital signals, and that's the point of the thread. You get pulses of electricity sent from one end to the other, and those pulses either arrive or they don't. If they don't, your cable is faulty. If they do, your cable is working. It doesn't matter if it's a £10 cable or a £1000 cable: if it's working then it's not going to have data loss that isn't corrected.
Because an extreme edition i7 has benefits over a non-extreme version (I assume at least, should be faster, or more overclockable - otherwise those who buy it are idiots). A £100 digital cable will not have any benefits whatsoever over a £10 cable. It can't. It's like saying that having £500 semaphore flags will result in a much clearer signal than £5 semaphore flags. So long as the other dude is watching the damn things and the flags aren't snapped in half, the message will be identical.
Would you care to explain this simple physics to me please, because as far as my understanding of the USB protocol goes, there's error correction built into the handshake packet (ACK/NAK etc.). It's like recorded delivery: if the package doesn't arrive, a note is left saying so, and the device driver fixes the problem before carrying on. So while I can understand that a faulty cable might entirely stop transmitting data at times, the idea that a cable will just drop packets and the USB device driver won't notice seems spurious at best and imaginary at worst.
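To make the recorded-delivery analogy concrete, here's a rough Python sketch of the retransmit-on-NAK idea behind USB's handshake packets. All the names (`send_packet`, `reliable_transfer`) and numbers are invented for illustration; the real protocol does this in controller hardware with CRC checks, not in software.

```python
# Toy model of ACK/NAK retransmission: a lossy link still delivers
# every packet intact, because failed transfers are simply re-sent.
import random

random.seed(0)  # deterministic for the example

def send_packet(payload, error_rate):
    """Simulate one transfer attempt that sometimes fails its CRC check."""
    return random.random() > error_rate  # True = receiver ACKs

def reliable_transfer(packets, error_rate=0.3, max_retries=10):
    delivered = []
    for p in packets:
        for attempt in range(max_retries):
            if send_packet(p, error_rate):
                delivered.append(p)  # ACK: move on to the next packet
                break                # NAK: loop round and retry the same one
        else:
            # Only a genuinely broken link ever gets here.
            raise IOError("link is faulty, not merely 'low quality'")
    return delivered

data = list(range(100))
assert reliable_transfer(data) == data  # 30% per-attempt loss, zero data loss
```

The point of the sketch: even with a hefty per-attempt error rate, the delivered data is bit-identical to what was sent. The cost of retries is throughput, never correctness.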
Analogue signals are also pulses of electricity when travelling down wires, but yeah, if you pay anything over £20 there shouldn't be any noticeable difference between that and a ridiculously overpriced one.
Agreed, the wire required to accurately transmit an analogue signal just isn't expensive, you should be looking at under a few quid per yard when you buy it.
This is only true for analogue signals. With an analogue signal the signal IS the data, so any alteration to the signal affects the sound. With digital the signal is just a carrier, so it can get extremely messed up and the sound will not be affected at all. So long as it's still possible to tell whether it's on or off (1 or 0) it's going to work perfectly. Therefore, shielding is not necessary for cables carrying a digital signal, unless you're running them through your PSU or something.
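Here's a minimal Python sketch of that "carrier, not data" point: add serious noise to a digital waveform and the recovered bits are still identical, because the receiver only has to decide which side of a threshold each sample is on. The voltages and noise levels are made-up numbers, not any real signalling standard.

```python
# Degrade a digital signal heavily; as long as every sample stays on
# the correct side of the decision threshold, the data is untouched.
import random

random.seed(42)
bits = [random.randint(0, 1) for _ in range(1000)]

# Transmit as idealised 0V / 5V levels, then add substantial noise.
received = [5.0 * b + random.uniform(-2.0, 2.0) for b in bits]

# The receiver just compares each sample against the midpoint.
recovered = [1 if v > 2.5 else 0 for v in received]

assert recovered == bits  # messy waveform, perfect data
```

With an analogue signal that same ±2V of noise would be audible garbage, because the waveform itself *is* the sound.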
You can of course get noise transmitted along a USB or HDMI cable that affects analogue components further down the line. This happens if you have a ground loop, but those are easy to fix once you know how and the problem goes away. It's got nothing to do with the quality of the cable either, so it'll happen regardless of what is being used if the other conditions are right.
another fine article - comments not bad also
We're talking ideal conditions here, but yes, although you don't perceive it consciously.
The photo-receptors in your eye are even more amazing. In ideal conditions, 1-2 photons will trigger them. Physicists had to build huge, complex photo-multipliers to match this sensitivity. Your eye's spatial resolution is pretty crap, but for brightness it leaves cameras for dead.
Just as our visual systems use subtle differences to create our "view" of the world, our aural systems use these tiny signals to work out what shape the source of a sound is. We can hear the scale and resonance of the sources of the sounds we hear because of these tiny signals.
yes really wow-
The human ear is amazing, but it is incapable of noticing a timing difference of a nanosecond. To put things in perspective, in a nanosecond sound travels about 300nm, which is less than the wavelength of visible light. It's generally accepted that we can resolve a time difference between the two ears on the order of 10µs (fundamentally limited by the way signals are transmitted in nerves), which is pretty amazing but still thousands of times coarser than the expected jitter of a 300MHz signal (period of roughly 3ns).
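Just to put numbers on that, here's a quick Python back-of-envelope. I'm assuming ~340 m/s for the speed of sound at room temperature; the 10µs interaural figure is the one quoted above.

```python
# Sanity-check the timing claims: how far does sound travel in 1ns,
# and how does the ear's ~10us resolution compare to nanosecond jitter?
speed_of_sound = 340.0                 # m/s, room-temperature assumption
ns = 1e-9
distance_per_ns = speed_of_sound * ns  # ~3.4e-7 m, i.e. ~340 nanometres

interaural_limit = 10e-6               # ~10 microseconds between the ears

assert 300e-9 < distance_per_ns < 400e-9   # a few hundred nm, sub-wavelength
assert interaural_limit / ns == 10_000     # the ear is ~10,000x coarser than 1ns
```

So even the most generous estimate of human timing resolution sits four orders of magnitude above nanosecond-scale cable jitter.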
Ok, let's take that down a notch to round vs ribbon cables and the relative costs of those, or even TIM.
For whatever reason, I suffer signal dropouts when using the cheapo Tesco cables that I don't suffer with the £30 jobs.
With regards to other factors, the effect of dropped packets of data on any line reduces the available bandwidth, because in effect what's being "advertised" (so to speak) is the total amount of data able to move from point A to B within a specific time frame. Those dropped packets/transactions/whatever you wish to call them still count against that bandwidth whether or not they made it. If every packet has to be negotiated twice, that's your bandwidth cut in half... not complicated maths, really.
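That relationship can be sketched in a couple of lines of Python. This is a deliberately crude model (my own simplification, not anything from a spec): if a fraction `loss` of transfer slots is burned on retransmissions, the goodput drops proportionally.

```python
# Crude model: retransmitted packets still consume raw link capacity,
# so useful throughput scales with the fraction of first-time successes.
def effective_throughput(raw_mbps, loss):
    """loss = fraction of slots spent re-sending packets (0.0 to 1.0)."""
    return raw_mbps * (1.0 - loss)

# If every packet has to be sent twice (50% of slots are retries),
# USB 2.0's nominal 480 Mbps link delivers half its raw rate.
assert effective_throughput(480, 0.5) == 240.0
assert effective_throughput(480, 0.0) == 480.0  # clean link: full rate
```

Whether that ever matters in practice depends on how much headroom the link has to start with, which is where the disagreement in this thread really lies.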
Add in other internal and external line noise, brownouts and spikes, and it all adds up. To call the connections between key components the least important part is just a little bit silly. Yes, it's a bit extreme to be spending hundreds on half a metre of cable carrying digital IO when you have no negative factors that require such a clean line, and of course most people will never perceive a difference, but most of us can perceive a difference of under 20ms, and for sure if a screen goes blank with a little orange triangle in the corner saying "signal lost" you have to start looking at stuff like that.
Clear and measurable differences exist between different kinds of cables though, as with TIM. Whether the differences are significant is another matter, which comes down to scientific measurement and whether the individual considers small gains important, but at least differences actually exist with these things. With rounded cables there's obviously less obstruction to clear airflow, and as someone who's studied chemistry for three years at university level I'm just going to say that, quite simply, TIMs are made of different elements, and different elements have measurably different thermal properties.
Then they're faulty. Simple as that.
No indeed, especially not if you've done two years of uni-level maths. However, the bandwidth required for digital music (already far in excess of what your ears can appreciate in terms of quality) is far less than a realistic USB throughput of, say, 30MiB a second. Even with a fairly slow controller, USB has far more bandwidth than you require.
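The arithmetic behind that is short enough to write out. CD audio is 44.1kHz, 16-bit, stereo; the 30MiB/s figure is the conservative USB throughput quoted above.

```python
# CD-quality audio bitrate vs a conservative USB throughput estimate.
sample_rate = 44_100      # samples per second
bits_per_sample = 16
channels = 2

cd_bitrate = sample_rate * bits_per_sample * channels  # bits per second
cd_bytes_per_sec = cd_bitrate / 8

usb_throughput = 30 * 1024 * 1024  # the ~30 MiB/s figure quoted above

assert cd_bitrate == 1_411_200                  # ~1.4 Mbit/s
assert usb_throughput / cd_bytes_per_sec > 170  # well over 170x headroom
```

Uncompressed CD audio uses well under 1% of even a slow USB link, so there is no plausible way for an audio stream to be bandwidth-starved by a working cable.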
I never called the inter-connects the least important part, that'd be the power cable. What the hell is "external line noise"? Also, brownouts and spikes? The first is related to your power grid, and the second is either crappy amplification or a really badly mastered CD or track.
Anyone with functioning eyes can perceive any change in light input to the retina (although it does take about 50ms for the retina to "process" that data, 100ms for the reptile brain and up to half a second for the conscious brain to react) - however if you're getting signal drop due to a USB cable then your cable is faulty. It's not less good than a £50 one, it's faulty.
LOL, I think we're off at tangents here.
The OP specifically targets £100+ HDMI cables and their impact on the video signal. 1080p plus audio is actually quite a lot more demanding of bandwidth than 44.1kHz CD-quality audio, and in the case of the connection between my BD player and TV it's carrying a bunch of other communications as well, which is the basis of my response - it's not simply about what you might perceive under best-case conditions.
Under my telly there's a BD player, a PVR, an RF signal amplifier, a Wii, a powerline network hub, a UPS and a multiplug, plus all the cables, obviously. Nearby (about 3 metres away) there's the inevitable cordless phone and my PC rig. The noisiest of those cables and devices will be generating line noise that is external to the HDMI cable but likely to be carried through it, and every cable or device generating noise adds to the interference degrading the cumulative signal getting to the TV.
The cables I bought from Tesco (I bought three) - none of them worked when more than two devices were switched on and connected to the TV. So thank you very much, but I'll stick to my rather more expensive but fully operational cables, which are still by a long way the cheapest parts of the equation (except maybe the power cable).
Audio latency above 20ms certainly messes with my timing when playing through my rig. It's a slightly different issue but if bandwidth is suffering because of poor signal quality I'd certainly expect to notice dropped frames on video. Folk complain about ping times of 50ms when gaming, and that's over a distance of hundreds of miles, why would it be acceptable to have similar latency between two devices less than a metre apart?
"especially not if you've done two years of uni level maths." ? I've made no assumptions about intelligence or any potential lack of it. If my statement that it's not complicated maths was in any way derogatory or offensive to anyone, I do apologise. Bandwidth calcs are something I take for granted that most contributors here would understand, hence my confusion about why anyone is taking a pop at folk prepared to spend more for quality (which in the case of this subject should mean greater assurance of bandwidth availability).
I'd struggle to spot any visual difference between a £50 cable and a £500 cable, but then I've got poor eyesight. For sure, though, I don't subscribe to the notion that cheapest = best value, because I've got direct experience of that being very much not the case. S'all I'm sayin.
I think the problem is that people are comparing physical quality rather than the quality of the signal.
If you take all interference out of the equation, a correctly built £5 cable should give a damn near identical picture to a £100 one, and that is the whole point: a digital signal is a digital signal.
The reason you buy a more expensive cable is for the extra shielding and quality connectors, which will last longer and provide protection against interference. That is why I don't buy the cheapest of the cheap cables, but something that looks a little more chunky and well built. But I would never spend more than £20 on a reasonably short cable (~2m).
You'd need to watch a movie which weighed in at about 4.5 terabytes per hour to need more than HDMI. On a functioning HDMI cable running at the peak of its specification you could watch movies about 90x the quality of Blu-ray, you feel me? Also, I wasn't talking about best-case scenarios, I was using real-world numbers and situations.
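For anyone who wants to check the terabytes-per-hour figure, here's the sum in Python, using HDMI 1.3's 10.2 Gbit/s single-link rate (the figure from the spec quoted elsewhere in this thread):

```python
# How many bytes per hour does a maxed-out HDMI 1.3 single link carry?
hdmi_bps = 10.2e9                      # bits per second (HDMI 1.3 single link)
bytes_per_hour = hdmi_bps / 8 * 3600   # ~4.59e12 bytes, i.e. ~4.6 TB/hour

assert 4.4e12 < bytes_per_hour < 4.7e12  # matches the ~4.5 TB/hour claim
```

A Blu-ray movie is a few tens of gigabytes over a couple of hours, so the link has enormous headroom either way you slice the comparison.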
You're going to have to explain the process by which cables near a digital cable just magically move electrons from themselves into your HDMI cable, organise them in such a way as to form several different coherent comm channels, all of which just happen to be understood by the HDMI controller, and then as a result cause noise (instead of horrifically breaking everything - or just being corrected by the inherent error correction that's in, well, everything these days).
Right, so they were faulty, or some other device was faulty. That's what I've been saying all along. Just because some cables you happened to buy from Tesco didn't work and a more expensive one did work doesn't mean there's a difference between the capabilities of functioning cables built to the HDMI required standards. You aren't providing any evidence for your argument here, you're just telling the world you had three faulty cables.
Bandwidth will not suffer because of poor signal quality on digital inputs unless they're faulty. Jesus, have I not gone over this? You've got 90 times the bandwidth you need with HDMI; you're not going to run out of the stuff unless your cable is borked. A £5 functioning cable is equal to a £500 functioning cable in every manner if the two are both HDMI cables. They both just send packets of data in an identical manner, subject to the same error correction, and with the same huge amounts of spare bandwidth available.
Here's the thing: I get (no doubt obviously) annoyed by the arguments you make not because I dislike you or because I think you're stupid, but because I think you're wrong, and I think you're either ignoring or not understanding the points I'm making, which are derived from the science and technology involved. You're not spending more for quality - the only things spending more will likely actually get you are a prettier box and a lower rate of dead-on-arrival cables, since quality control should be far higher for a product which costs 10-100 times the price of things which, when functioning, work just as well. You can't make an argument based upon greater assurance of bandwidth availability here, because any functioning cable will have the same bandwidth available. They're just long bits of wire which carry electronic pulses in a specific manner, and wire which can reliably do this costs, as I said previously, at most a few quid per yard after markup and production costs etc.
Indeed, cheapest maybe wasn't the best for you in this case, since you either bought three faulty cables or bought something else which doesn't work. But that doesn't mean you needed to spend £50, and it certainly doesn't justify all this bull in What Hi-Fi magazine about some digital cables offering a "more open", "clearer", "more gullible" sound.
One of the companies I freelance design for buys the base items for one of their products for a few cents per 100 items - and they buy thousands. They then add a couple of (equally cheap) add-ons, package it in a very pretty box, and sell it as a 'professional tool' for around the 40 euro mark. The exact same item can be bought in any DIY or hobby store for a couple of quid.
It's all about marketing, packaging and giving the customer the illusion that they're buying a superior product - when the hard truth is, the £35 and the £2 product come from exactly the same place.
Yes, it's obvious that you're getting annoyed and I don't understand why.
You are correct: nothing you have posted convinces me I'm wrong; it doesn't even give cause for any doubt. I spend too much of my day justifying my opinions to technology peers as part of my job, which I've been doing for a number of years now, but in the context of this forum, unless we're actually trying to help someone, there really is no need. Neither of us is prepared to spend hundreds on cables, I think we're agreed on that, but I don't need you to either agree or disagree with my assertion that there is a sweet spot with the cost of cables, as with many other products.
I understand the points you are trying to make, but I don't have any real desire to get into it in such detail - life is short enough as it is, and I don't agree with you. And if someone sees worth in spending what you or I consider to be insane amounts of money on what we perceive to be a placebo, then so be it.
My threshold for value is higher than yours - doesn't make my choice of how to spend my income wrong.
@chris, that may be true for some things, but not for everything.
Just saw that article on Cracked and thought of this thread. Scroll down on the first page.
Also, there's a serious lack of sources in some of the ridiculous claims people are making in this thread.
Ha! I knew it! My $20 HDMI cable will kick the same amount of ass as their $100+ Monster cable! (HDMI v1.3a increases the single-link bandwidth to 340MHz (10.2Gbps) and provides maximum capacity for true HDTV resolution up to 1080p and above.) It says so right there: 1080p! Look it up.
I suppose it's mostly because I'm easily frustrated, but more because I've tried to explain to you that there are serious scientific and technical reasons for saying that a digital cable like a USB or HDMI cable either works or it doesn't - that there is no sliding scale of quality - and you seem to respond only with the anecdote that spending more worked for you. I've provided reasons why that might be the case (higher profit margins and higher cost require higher quality control). This is not an issue on which there are multiple equally valid answers; the science and technology is clear.
I kind of disagree. I see any public discourse as something which educates and informs the general public who read it, and for that reason I'm trying to show not just you, but also anyone who reads this thread, that digital cables are a binary thing. They either work or they don't; a middle ground is definitionally not possible.
Right, but it's stupid and it's a waste of money, and while I'm happy for the knowingly ignorant to be stupid with and waste their money, I consider it part of being a good human being to show those who really don't have a clue that, in fact, they can ignore the snake oil and just buy something that works.
You see, that's where you're wrong, because you're spending more money than is required to get an exactly equal experience. When you needlessly spend beyond what is necessary, you are no longer gaining value.
It is massively true for the interconnect industry.