for a kettle cable. Well, if you've got that sort of cash and you believe that this cable will reduce RFI on your Wank-Fi, sorry, Hi-Fi, then I have a bridge you might be interested in.
The Advertising Standards Authority has slapped hi-fi kit supplier Russ Andrews for claiming its super-duper mains cables could reduce radio interference on the power line. According to the company, its PowerKords reduce noise in the mains supply because they are wrapped up in woven conductors, enabling the company to charge …
The mark-up on cables is already too high, especially on these special "digital" cables, and the fact that so many people are mis-sold products is disturbing.
Those that spend £1250 on a mains cable should consider carefully their contribution to the world at large. How could that money have been better spent? Could they have saved someone's life with that money? Could they have improved someone's situation with it? Was the feeling of possibly eliminating some possible interference so significant that it actually outweighed the well-being and satisfaction of doing something for someone else?
Also, half of these self-proclaimed audiophiles would get more benefit from changing their room acoustics than from buying a glorified kettle lead.
...more likely to be an IEC power lead - a kettle lead has a slightly different connector on the end.
And I imagine Russ Andrews would be charging £2k for one of those - due to its high power handling in extreme environments!
Mine's the one with the bag full of snake oil in the pocket.
The only real difference in your water boiler is the notch at the end stopping you plugging a PC one into a kettle (and the fuse and cable ratings).
I work with people that plug in those connectors daily and half of them give blank looks when you say an IEC connector.
I had this conversation a few years ago. It took an age to explain to him that your electricity travels down:
5 miles from the substation in underground cables, probably over 20 years old, to your meter. (Also connected to everyone else in the vicinity, including businesses)
10m or so of the cheapest mains cable your contractor thought they could get away with. (BS 7671 compliant if you're lucky, but you can't count on it)
5 feet of power cable.
Besides, if your amplifier is so rubbish that it can't properly reject RFI on the mains, you should seriously consider spending the £1200 on getting another amplifier - pretty much any amp you could get for that amount should be able to do that.
Psychics etc can advertise all they want, as long as they can prove what they say in the advert is true. That is usually the point where they fall down, which is why most advertising merely consists of "I'm here" type stuff. Mind you ... Just google ASA Sister Charlotte for the media frenzy that happened when *I* forwarded the ASA a psychic's leaflet. (Anon because I don't want her knowing where to send the curses... lol..)
I'd enjoy listening to their before / after spiel as they tried to justify spending 1200 quid on a kettle lead...
The bull would be phenomenal because, let's face it, they would have to convince you (as well as themselves) that they *could* hear a difference in order to justify to themselves it was money well spent.
I've got one of these, with a clear plastic sheathing so you can see the braids. It came "free" with a power supply I bought a few years ago. I thought they'd done it because it looks neat.
You know, this might well be the very thing that pushes me into setting up an eBay account. Anyone wanna buy a grand's worth of braided cable...?
If he'd only charged a moderately extortionate sum like £100 -
a) he'd have flogged more of these cables-of-dubious-benefit
b) his disingenuous advertising probably would not have been investigated.
Can't believe people believed it in the first instance. Spend £1000 on a lead that is then plugged into a £3 socket connected to yards of 30-year-old twin-and-earth copper mains.
When can we expect a similar judgement for the guys who charge £1500 for 'Ultra-High Quality' HDMI cables. Ignoring the fact that there is no difference between HDMI cables, I've seen several both online and in stores that claim that they contain some revolutionary new technology that prevents the image data from degrading on its trip to the TV.
"Ignoring the fact that there is no difference between HDMI cables,"
Can you substantiate that? Whilst I haven't tried any super-duper HDMI cables, there is a clear and noticeable difference (picture-wise) between those cheap HDMI cables that are supplied with some equipment and other cables from specialist brands, such as QED.
It should also be noted that there are many experienced professional reviewers who also notice the difference, UNLESS you are implying that said personnel are in the pocket of certain brands or that they are encouraged to give such reviews to products due to that brand's advertising spend....
In the end, it's down to the buyer - if they can see a difference, then it's up to them to justify whatever the price is....
I've had a cheap HDMI cable cause picture distortions. (corruptions NOT qualitative differences)
My theory is that the metal wiring was too thin, too impure and so too resistive for the signal always to meet the threshold needed to be detected. I think crosstalk from adjacent wires might also have been responsible.
Whilst I agree with all the comments about digital/checksums/etc, it must still be noted that the signal is ultimately analogue, and so can be rendered intermittent, at least, by some effect.
Given that logic will usually require a voltage below 20% of the supply range to register a low, and above 80% to register a high, I can see how resistance combined with crosstalk could lead to a certain wavering of the attained peak voltage for a high, but am not sure about lows.
The problem here being LVDS, I presume the issue would manifest in a more complicated manner - but ultimately result in voltage differences that wavered on the detection threshold.
Alternatively, if the wire quality wasn't consistent for the clock and data lines, there could be a phase shift that would cause data loss - perhaps triggered by some capacitive effect? *
All I know is that though I have NOT seen a -qualitative- difference, I -have- seen a signal come through so variably as to cause the odd few pixels or lines to be dropped/corrupted in each frame - with the remainder of the screen area being received perfectly.
(And I thought any effect was impossible until I saw it - and then took the duff cable to work so a colleague could be equally surprised!)
* I'm not an electronics guy, but write device drivers and use an oscilloscope to check/verify signals on dodgy new hardware enough to have acquired a few ideas.
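The noise-margin idea above is easy to sketch. Here's a toy Python model (the 3.3 V supply and the noise figures are made up purely for illustration) of a receiver slicing a sagging, noisy "high" against the usual 20%/80% thresholds:

```python
import random

V_HIGH = 3.3           # assumed supply voltage, for illustration only
LO_MAX = 0.2 * V_HIGH  # at or below this, a receiver registers a low
HI_MIN = 0.8 * V_HIGH  # at or above this, it registers a high

def classify(voltage):
    """Return '0', '1', or '?' for a single sampled voltage."""
    if voltage <= LO_MAX:
        return '0'
    if voltage >= HI_MIN:
        return '1'
    return '?'  # forbidden zone: the receiver may read either value

random.seed(1)
# An ideal '1' sagging to 85% of supply (resistive loss), plus crosstalk noise
samples = [V_HIGH * 0.85 + random.uniform(-0.4, 0.4) for _ in range(1000)]
results = [classify(v) for v in samples]
print(results.count('1'), "clean highs,", results.count('?'), "marginal reads")
```

The point being that a weakened high never turns into a "slightly worse" picture - it either clears the threshold or it leaves the receiver guessing, which shows up as corrupt pixels, not soft ones.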
HDMI cables are like network cables. Cat5 and Cat5e are different. As is Cat6. What is the difference? Bandwidth (the MHz frequency range the cable can handle). Other differences include wire twist methods, conductor quality, and end termination. All of these components help determine what standard the cable can be rated for. Cat5 quality and twist prevents it from handling Gigabit frequencies properly, and likely your NIC will limit you to 100Mbit. The frequency (bandwidth) range required exceeds the cable's ability. HDMI cables have similar issues, where they're rated for HDMI 1.3b, 1.4a, etc. Granted, you can get a cable that, based on that hard-kink you put in it to sit your TV or Blu-Ray flush to the wall, will compromise the cable's already-mediocre build and cause it to down-grade the signal to the point of only being able to pass 720p. Cable length is important also, since 1080p frequencies across a poor-quality cable will likely require a shorter length, just like Cat5e is not recommended for longer than 100 meter runs (even though it may actually work "well" for 150m in your situation, lucky you).
With all this in mind, if you're buying a 6ft HDMI cable, likely ANY cable you get will run at its RATED spec (1.3b cables have no hope of running 3D Blu-Ray, for instance. That's what 1.4 cables are for), unless your cable is defective (or you broke it). Defect rates or user error are outside the scope of this retort.
So, for those that missed the point, here's a summary:
HDMI is a digital spec. It will auto-negotiate the best quality the cable can handle in the given situation. If you or your cable is a numpty, you may only get 720p when you were hoping for 3D Blu-Ray. Read the cable's spec. Most are likely 1.3b, which can handle 1080p, but not 3D Blu-Ray. No, your HDMI cable from 1.1 days won't handle 1080p.
The "it's digital" people don't acknowledge the auto-negotiate side of the equation, which is what the "high-quality cable is important" crowd is likely seeing.
"(1.3b cables have no hope of running 3D Blu-Ray, for instance. That's what 1.4 cables are for)"
Absolute, complete and utter cobblers.
This comment shows you have little understanding of the differences between the varying HDMI definitions; there is no such thing as a "1.3b cable" or a "1.4 cable". Referring to a cable as "1.3b" or "1.4" [compliant] is *utterly* meaningless, and is just marketing hog-wash/bullshit.
There is NO difference whatsoever between a HDMI "1.3" and HDMI "1.4" cable, in fact you should never see HDMI cables defined in this way, for good reason. ALL HDMI compliant cables from the very inception of the HDMI standard - from back in 2002/2003 until today - are, by definition, considered to be HDMI 1.4 compliant (and HDMI 1.3, 1.2 and 1.0). And that includes the cheap cable you get bundled with every HDMI device, and any and every HDMI cable you may already own.
There are only four official HDMI cable types, defined as follows: Standard Speed, High Speed, Standard Speed with Ethernet and High Speed with Ethernet. Ignore any mention of "1.3b" or "1.4" in relation to HDMI cables, as it doesn't mean a thing and would only be used by someone who doesn't understand the nature of the product being discussed.
Your "1.3b cable" may actually qualify as a High Speed cable in which case it should display 3D Blu-Ray from a HDMI 1.4 source without any problem. Equally, your "1.3b cable" could rate only as a Standard Speed cable in which case it would not have any hope of displaying 3D Blu-Ray (and may even struggle to pass regular 1080p). The point being, HDMI cables are not categorised by the "version" of the HDMI specification, merely their bandwidth carrying capacity - a cable is either a Standard Speed cable or a High Speed cable, not "1.3b" or "1.4".
And this is where the construction of the cable becomes important, as good quality cables will, on the whole, be able to carry higher bandwidth signals than less well made alternatives.
High Speed cables were introduced to support the new features added by HDMI 1.3 (deep colour etc.), but their physical construction remained unchanged from HDMI 1.1, the only difference being better quality conductors, insulation, construction and connectors, all of which combine to allow for higher bandwidth signalling.
The only physical modification to the construction of HDMI cables occurred with the introduction of the "with Ethernet" optional feature ushered in by HDMI 1.4.
"with Ethernet" uses previously unused conductors that are now converted into a twisted pair so if you want "with Ethernet" then you'll need new "with Ethernet" HDMI cables, but if not then your current High Speed cable will be fine. You'll only be buying a "with Ethernet" cable if it says "with Ethernet" - if it only says "HDMI 1.4", give it a wide berth as for all you know it might be a Standard Speed cable.
If you have purchased good quality "HDMI 1.3b" cables that have plenty of headroom for future bandwidth upgrades, then these cables most likely fall into the category of "High Speed" and you will be in good shape to watch your 3D 1080p Blu-Ray video(s) from a HDMI 1.4 source.
As for differences in HDMI cables in general, well yes of course they do exist - a well constructed cable will successfully carry a high bandwidth signal over a longer length of cable than an inferior alternative (eg. at 100 feet, 1080p may be possible with a good cable while only 720p is possible with a less well constructed alternative).
However for short runs it's very, very unlikely that there will be a noticeable difference unless one of the cables is spectacularly bad verging on defective - either the cables will work, or they won't.
As stated a cable either meets the Cat5 and Cat6 standard or it doesn't, but I have come across shops selling "high quality" cat5 or cat6 cables for home entertainment purposes. Apparently if you wire your home entertainment network with these you'll get better sound and picture (yeah, right). Of course this misses the point that most people run this stuff over wireless. How long before these dodgy shops start selling "special" wireless antennae for home entertainment use?
"UNLESS you are implying that said personnel are in the pocket of certain brands or that they are encouraged to give such reviews to products due to that brands advertising spend...."
I don't think that needs to be implied, it can be taken as read. HDMI is a digital standard and as such the quality can be very easily measured. eg does the pixel at point x,y have the same RGB value with this cable and that cable? Are there any errors reported from the deserialiser? What does the eye diagram look like?
Then people will claim 'oh but what about jitter?', but it's digital video so pixel x,y always goes to the same place and the pixels always come down the wire in the same order.
Of course no-one would do these tests as it would show up their cables. Except these guys:
Does thereg look better if you use a more expensive DVI-D cable?
as good ol' Joules Watt was fond of pointing out in Wireless World...
digital signal... ain't no such beast in the real world
Your 0 and 1 are represented as voltages with associated currents dependent on various impedances. And those impedances will have different effects on the various (odd No.) harmonics making up your 0 & 1 signal, softening an edge here, overshooting there. And as such they are subject to the same laws of physics as an analog signal in an 'analog' cable.
that said 'audiophile quality' anything is ALWAYS bollocks.
these people are morons and deserve nothing but our scorn.
It's quite simple, HDMI is a digital interface. In fact, if you are using a commercial Blu-ray disk, most likely it's an encrypted digital link. Either the TV can recover the digital signal or it can't, and any faults will be obvious ones in picture break-up. Indeed, if the encrypted signal was not carried perfectly (with any built-in error correction), the signal would not be recoverable at all.
Of course there are advantages to higher quality cables, like robustness. However, as long as it meets the appropriate HDMI standard for the resolution in question (skew, attenuation, cross-talk etc.), it will either work, or it won't. Higher quality cables may be capable of longer runs, but they won't improve picture quality.
Note the above does not, of course, apply to SCART or other analogue cables.
Human beings have a wonderful ability to convince themselves that they are seeing a difference which doesn't exist, even if it is technically not possible. It is generally related to the amount of money that they've just spent, and it's a characteristic exploited ruthlessly by vendors.
>there is a clear and noticeable difference (picture wise) between those cheap HDMI
>cables that are supplied with some equipment and other cables from specialist
>brands, such as QED
The nice thing about your post being anonymous is that I can explain what a complete idiot you are without needing to worry about pesky libel laws. Note I'm prepared to use my real name here.
You seem to be unaware that HDMI uses a digital signal. There is no degradation of signal because of cable quality, it simply arrives correctly, or incorrectly.
So, unless the cable is of such poor quality that it cannot transmit a simple digital signal without error, the signal you receive will either be correct (ie it will show the picture), or it will be incorrect (ie it will show corruption or nothing).
You won't get sharper skin tones, richer sounds or a more realistic plot in your movie just by using a cable that's fifty times the price. All you'll do is make some guy a little wealthier, at your expense.
>UNLESS you are implying that said personnel are in the pocket of certain brands
>or that they are encouraged to give such reviews to products due to that brands
Corruption in the publishing world? Oh, that would never happen!
No, more likely they're just gullible people like you who are convinced they can notice a difference where there is none.
It's not rocket science. To test a digital cable for quality, you compare the source input with the destination output. Run a checksum on them, if they're the same, the cable is OK. If not, throw it out. That is the ONLY way to review a digital cable.
If you're reviewing a digital cable by letting "experienced professional reviewers" watch the video and making a subjective decision about which cable is best, you're doing it so fucking wrong that the only possible explanations are either financial reward or complete ignorance.
In any case, I think you're on the wrong website. This is a technology website.
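The checksum test described above is trivial to sketch in Python - the frame data and "capture" here are hypothetical stand-ins, the point being only that comparing hashes of what went in against what came out is a pure pass/fail test, with no room for "warmer skin tones":

```python
import hashlib

def stream_checksum(frames):
    """Hash a sequence of raw frame buffers - a stand-in for capturing
    the bitstream at one end of the cable under test."""
    h = hashlib.sha256()
    for frame in frames:
        h.update(frame)
    return h.hexdigest()

# Hypothetical capture: identical frames at source and sink = cable passes
source_frames = [bytes([i % 256] * 64) for i in range(100)]
sink_frames = list(source_frames)  # a working cable delivers bits unchanged

if stream_checksum(source_frames) == stream_checksum(sink_frames):
    print("cable OK: bit-identical at both ends")
else:
    print("cable faulty: throw it out")
```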
HDMI is a multi-lane low voltage differential serial digital protocol with forward error correction (like PCI Express or SATA). It'll work perfectly unless the cable is so crap or broken that it doesn't.
"It should also be noted that there are many experienced professional reviewers who also notice the difference, UNLESS you are implying that said personnel are in the pocket of certain brands..." Yes that would certainly appear to be the case.
"there is a clear and noticeable difference (picture wise) between those cheap HDMI cables that are supplied with some equipment and other cables from specialist brands, such as QED."
No, there isn't. There can't be. It's an error-corrected digital signal, it either gets there & gives you a 100% picture, or it doesn't get there and gives you no picture. Any difference that you imagine you're seeing is entirely due to your subconscious desire to believe that you got something for whatever ridiculous amount of money you spent on it.
Some have commented that as the HDMI signal is digital that perhaps all HDMI cables act the same and hence you either get a signal or you get a corruption of data (which may or may not be resolved by corrective action from the 2 pieces of hardware that are being connected together).
There is evidence to support the fact that:
1) Most HDMI cables are different to each other and hence although they all pass the relevant HDMI specification, it is likely that some will work better than others, with lower losses in the cable, due to the materials used (both for conductors and insulation) and the way in which the cable is constructed.
2) Even if a signal is digital it is still subject to the resistance, capacitance and inductance of the cable itself, as well as how the internal conductors are "wound", which is perhaps why some cables work over short lengths, but the exact same cable, in a longer length doesn't work ...
I would therefore humbly submit that a case can be made for the fact that different HDMI cables can and will affect the signal flowing through them......and hence this can affect the signal received at the "sink" end.
I'll get me coat now......and go back to living under the stone over there >>>>
> likely that some will work better than others,
The physical construction of the cable will certainly affect the signal passing along it, that's basic physics. The point about HDMI being digital with error correction is that any such signal damage is binary: either the signal gets through, and the picture is all correct, or the signal doesn't get through and there's no picture. What CANNOT happen is for the picture to be in some way 'poor' or 'less clear' or 'lower quality' with poorer cables, digital signals simply don't work that way. If you have a picture with a given cable, then you have the best picture you're going to get, and no other cable can make it better no matter how much you pay.
> perhaps why some cables work over short lengths, but the exact same cable, in a longer length doesn't work
No, that's explained by transmission theory. In any cable, different frequencies suffer different delays, and different losses. For analogue signals that will cause loss of picture detail as the cable gets longer, since the high frequencies carry the picture detail and they get attenuated proportionately more as the distance increases. For digital signals the effect of that is to "smear out" the nice square pulses, rounding their corners, until they all run into one another. At the point where you can't distinguish two pulses, you lose your signal. It's often referred to as the "digital cliff", everything is perfect until you suddenly "drop off the edge" and vanish.
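The "digital cliff" can be demonstrated with a toy Python model - a first-order low-pass filter standing in for the cable's frequency-dependent loss (a crude assumption, not real transmission-line maths), cascaded once per notional metre of cable:

```python
def lowpass(signal, alpha=0.25):
    """First-order IIR low-pass; smears pulse edges like a lossy cable."""
    out, y = [], 0.0
    for x in signal:
        y += alpha * (x - y)
        out.append(y)
    return out

def recover_bits(signal, samples_per_bit, threshold=0.5):
    """Sample mid-bit and slice against a fixed threshold."""
    return [1 if signal[i * samples_per_bit + samples_per_bit // 2] > threshold
            else 0
            for i in range(len(signal) // samples_per_bit)]

bits = [1, 0, 1, 1, 0, 1, 0, 0] * 4
spb = 8  # samples per bit
tx = [float(b) for b in bits for _ in range(spb)]

def errors_at(metres):
    """Cascade the filter once per notional metre, then count bit errors."""
    rx = tx
    for _ in range(metres):
        rx = lowpass(rx)
    return sum(a != b for a, b in zip(bits, recover_bits(rx, spb)))

for metres in (1, 5, 20, 80):
    print(f"{metres:3d} m: {errors_at(metres)} bit errors")
```

Short runs decode perfectly; keep lengthening the run and the pulses eventually smear into each other and the bit errors pile up - there's no in-between regime where the picture is merely "a bit softer".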
It is not true that all hdmi cables are equal, but the thing is that while you can certainly get a rotten signal (drops or no sync at all), once you get a "good" signal you cannot get a "better" signal.
Even a mediocre signal will be "good enough" and hence give the maximum performance due to applied forward error correction. What's more, "the digital" and the FEC effectively hide cable quality so you can't know how good the signal really is beyond "good".
The only leeway would be, once the signal got decoded upon arrival, whether needing to apply the FEC decoder would cause delays in the signal which might somehow be subconsciously perceptible or whatnot. <tin-foil-hat>Carefully modulating the errors to trigger the FEC decoder might even carry subliminal messages.</> But somehow I find that a bit unlikely. But hey, maybe you can build it and sell it to the government.
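For the curious, here's what "errors are either corrected perfectly or not at all" looks like with a textbook Hamming(7,4) code in Python - purely a generic illustration of forward error correction, not HDMI's actual line coding:

```python
def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit Hamming codeword."""
    p1 = d[0] ^ d[1] ^ d[3]   # parity over positions 1,3,5,7
    p2 = d[0] ^ d[2] ^ d[3]   # parity over positions 2,3,6,7
    p3 = d[1] ^ d[2] ^ d[3]   # parity over positions 4,5,6,7
    return [p1, p2, d[0], p3, d[1], d[2], d[3]]

def hamming74_decode(c):
    """Decode 7 received bits, correcting any single flipped bit."""
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3  # 1-based position of the bad bit (0 = none)
    if syndrome:
        c = list(c)
        c[syndrome - 1] ^= 1  # repair the single-bit error exactly
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
code = hamming74_encode(data)
code[3] ^= 1  # the "cable" flips one bit in transit
assert hamming74_decode(code) == data  # decoder returns the exact original
```

One flipped bit per codeword comes out bit-perfect; pile on more errors than the code can handle and decoding fails outright. Either way there's nothing in between for a "better" cable to improve on.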
>> I would therefore humbly submit that a case can be made for the fact that different HDMI cables can and will affect the signal flowing through them......and hence this can affect the signal received at the "sink" end.
Yes, but as pointed out above, this is digital not analogue and you clearly don't know the difference. Corrupting the digital signal will not degrade the quality of the image but create errors in the bitstream. The result is highly visible "blocking" of the picture - not a degradation of quality.
If you aren't getting this blocking then the cable is good enough - and no amount of improvement in its quality will have any effect on the picture or sound.
Yes, as already said by others, a cheap cable may in fact not work over longer distances where a better cable would. But for a given cable, when you plug it in it will either work - or it won't. If it's marginal then there will be a picture but with blocking at random intervals - which means it's not working. It won't (for example) make the picture fuzzy, or add snow, or any of the analogue effects that these high-quality cables are claimed to avoid.
You see, most people would not spend even £100 on a kettle lead, regardless of whether we believed the claims or not.
However, people with too much money (like celebs, heiresses, people who spend vast sums on "art", especially where the artist is still alive) would also not spend £100 on a kettle lead because it is both too expensive and not expensive enough. But you tell them it is worth £1200 then they believe that firstly it must be good, and secondly that their other heiress friends will laugh at them if they haven't spent £1200 on a kettle lead. The kind of people who would buy these things are not going to be financially troubled by wasting a few thousand quid every time they buy a new stereo.
You can also guarantee that the people who complained about the ad would never spend £1200 on said cable, even if it was conclusively proved to give sound quality guaranteed to induce orgasm in listeners in under 30 seconds. Similarly those who would have bought one before would not be dissuaded by the ASA rulings. Just look at the MMR jab, New Coke, Homeopathy, psychics and the X-Factor viewing figures as examples.
...that there is some good common sense behind the idea of making what is essentially a shielded mains cable. By preventing the cable acting as a "receiving aerial" for most if not all RF signals, this could be a good idea - esp if those signals are produced locally (ie wireless routers, mobile and cordless phones, laptops etc).
But of course there would be lots of RF on the mains already, picked up from before the mains entered the consumers house, so just shielding a cable won't stop that.....
So, for over a grand I could get a very good quality UPS/mains filtering system, and install that just for the ring the audio system is connected to, which would do a far better job IMHO.
Biting the hand that feeds IT © 1998–2019