Jiangxi Ship Electronics Co., Ltd.
Ever since HDMI-capable devices started to come onto the market a couple of years ago, there have been a lot of questions--and a lot of misconceptions--about HDMI and HDMI cables. A full "FAQ" on this subject, as we found when we tried to assemble one, would run very long; the better way to provide some answers, we think, is simply to address the major groups of questions: What is HDMI, anyway? What's in an HDMI cable? Why might I, or might I not, want to use an HDMI cable as opposed to, say, component video? What makes one HDMI cable better than another, and when does it really matter?
So, What is HDMI, Anyway?
HDMI stands for "High-Definition Multimedia Interface." The HDMI standard was developed by a consortium of consumer electronics manufacturers and content providers, to address what, from the content-provider industry's standpoint, was a serious problem: existing analog video formats such as component video are not easily copy-protected. HDMI, being digital, provides a perfect platform for the implementation of a copy-protection scheme (HDCP, or "High-bandwidth Digital Content Protection") which enables the content providers to limit the consumer's access to, and ability to copy, video content.
As we’ll see, HDMI is a horrid format; it was badly thought out and badly designed, and the failures of its design are so apparent that they could have been addressed and resolved with very little fuss. Why they weren’t, exactly, is really anyone’s guess, but the key has to be that the standard was not intended to provide a benefit to the consumer, but to such content providers as movie studios and the like. It would have been in the consumer’s best interests to develop a standard that was robust and reliable over distance, that could be switched, amplified, and distributed economically, and that connects securely to devices; but the consumer’s interests were, sadly, not really a priority for the developers of the HDMI standard.
The HDMI format is essentially a digital version of RGB analog video; the principal signal in an HDMI cable is carried, using an encoding scheme called TMDS ("transition-minimized differential signaling"), on four shielded twisted pairs (yes, just like a CAT5 cable, but with shielding added), one of which is for red, one for blue, and one for green. Sync pulses, which tell the display where a line or frame ends or begins, are carried on the blue line. In some cases, rather than RGB video, HDMI carries Y/Pb/Pr "color-difference" video, which represents the same information as RGB but conveys it differently. The fourth twisted pair carries a digital clock signal, and seven miscellaneous additional conductors carry signaling and other incidental functions.
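As an aside, the channel layout described above can be written out as a small lookup table. This is an illustrative sketch only, based on the published DVI/HDMI TMDS channel assignments (the channel numbers and "CTL" control-signal names come from that specification, not from anything else in this article):

```python
# Sketch of the HDMI/DVI TMDS channel layout (illustrative; channel numbers
# and CTL names follow the published TMDS assignments). Note that the sync
# pulses ride along with blue, on data channel 0.
TMDS_CHANNELS = {
    0: {"color": "blue",  "also_carries": ["HSYNC", "VSYNC"]},
    1: {"color": "green", "also_carries": ["CTL0", "CTL1"]},
    2: {"color": "red",   "also_carries": ["CTL2", "CTL3"]},
    "clock": {"color": None, "also_carries": ["pixel clock"]},
}

def channel_for(color):
    """Return the TMDS data channel that carries the given color."""
    for ch, info in TMDS_CHANNELS.items():
        if info["color"] == color:
            return ch
    raise ValueError("no such color channel: %s" % color)
```

The point of the table is simply that each color gets a pair of its own, and that the "blue" pair is doing double duty carrying sync.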
HDMI is often the handiest way to connect two devices; at the moment, that’s really the best reason to use it. However, in the future, it may become necessary to use HDMI connections with certain devices, or certain recorded media, in order to get full HD content. Beyond that, there aren’t a lot of compelling reasons to use HDMI as your connection method. Most of the arguments we hear are based upon common misconceptions about the benefits of HDMI, and one really needs to get past those to understand just what the real reasons to use--or not to use--HDMI are.
1. "Only HDMI carries High-Definition Signals." Wrong, wrong, wrong. Analog component video and RGB both support high-definition resolutions, and what's more, they're more robust and dependable over distance. There likely will be cases, in the future, where high-definition signals are available from certain source recordings only through the HDMI port, with only downconverted standard-definition video available on the analog outputs. As of this writing, however, none of the recordings available on high-definition disc formats have the "flag" set to limit HD output to HDMI. Some "upconverting" DVD players will output their upconverted signals only over HDMI, but the value of DVD-player upconversion ranges from dubious to clearly negative, since in most cases it only adds an extra rescaling step to the signal chain.
2. "HDMI provides a pure uncompressed HD signal." This is one of those statements which is true if taken in a wholly irrelevant sense, and untrue if taken in its only meaningful sense. Unless you work in a video production facility, chances are that you’ve never seen uncompressed HD video. That’s a shame, because it’s gorgeous; side-by-side comparison with, say, an ATSC broadcast signal or an HD-DVD signal can be a rude awakening, and just serves to highlight how heavily-compressed and artifact-laced all of the HD video sources we view are. No broadcast, and no recording medium, on the consumer market provides uncompressed HD video, and none are likely to do so in the near term.
So what is meant by the assertion that the HDMI signal is uncompressed? What this too-often-repeated statement actually means is that the signal is not further compressed when it is translated from its source format to HDMI. But the same is true of all source-to-display baseband video formats; component video and RGB are not compressed after the signal is decoded from a DVD or a broadcast signal. The assertion that HDMI is "uncompressed HD video" means only, then, that HDMI is no worse in this respect than any competing video format.
3. "When I use a digital source, I get a pure digital-to-digital signal chain using HDMI." This, again, is true in an essentially meaningless sense, and untrue in the sense in which most people actually understand it. The assumption behind the statement is that the signal flows, unaltered and without degradation, from a digital source to a digital display without ever being converted, and that by eliminating these conversions--specifically, digital-to-analog conversions--one gets a better picture. But the HDMI signal is not the same as the signal recorded on a DVD, or sent in an ATSC or QAM transmission; all of those are compressed formats which encode video in an entirely different way from HDMI. Accordingly, to get from the one to the other requires decoding and conversion. In every case, the signal is decoded and rendered as a video stream. If the original signal is in one resolution, and the output format is in another, the image will be rescaled; if the original signal is recorded in one colorspace and the output format is in another, it’ll be converted. There is nothing inherently perfect or error-free about digital-to-digital, as opposed to digital-to-analog, scaling and conversions, and some things -- scaling, in particular -- are often more easily done well in the analog domain than in the digital domain.
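To make the colorspace-conversion step concrete, here is a minimal sketch of the BT.709 RGB-to-Y/Cb/Cr conversion (full-range, with the usual 8-bit offsets omitted for simplicity; the coefficients are the standard BT.709 luma weights). In floating point the round trip is essentially lossless, but a real pipeline quantizes to integers at each stage, and then a round trip does not always return the exact pixel you started with:

```python
def rgb_to_ycbcr(r, g, b):
    """Full-range BT.709 RGB -> Y/Cb/Cr (8-bit offsets omitted)."""
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556
    cr = (r - y) / 1.5748
    return y, cb, cr

def ycbcr_to_rgb(y, cb, cr):
    """Exact algebraic inverse of rgb_to_ycbcr."""
    r = y + 1.5748 * cr
    b = y + 1.8556 * cb
    g = (y - 0.2126 * r - 0.0722 * b) / 0.7152
    return r, g, b

def roundtrip_8bit(r, g, b):
    """Round-trip through *integer* Y/Cb/Cr, as a real pipeline would."""
    y, cb, cr = (round(v) for v in rgb_to_ycbcr(r, g, b))
    return tuple(round(v) for v in ycbcr_to_rgb(y, cb, cr))
```

For example, the 8-bit pixel (200, 30, 100) comes back from an integer Y/Cb/Cr round trip slightly altered (off by one in the blue value): a tiny error, invisible in isolation, but it illustrates that every conversion stage, digital or not, is a place where the signal gets touched.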
So, yes, a DVD player putting out an upscaled HD resolution through an HDMI cable into a plasma display is an "all-digital" signal chain--but it's an all-digital chain in which the colorspace is being converted, the original signal is being decoded and converted to another format, and the image is being rescaled not once, but twice along the way. Is doing this digitally superior to doing it in a chain that involves analog conversions? In any particular case it may be, or it may not be, but there's no reason in principle to think that it necessarily will be.
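The double-rescaling point is easy to demonstrate numerically. The sketch below is purely illustrative: a one-dimensional "scanline" and a naive linear-interpolation resampler stand in for real two-dimensional video scaling. Resampling the same source once directly to the target size, and once via an intermediate size, yields two different results; each extra rescaling stage leaves its own fingerprint:

```python
import math

def resample(samples, new_len):
    """Naive 1-D linear-interpolation resampler (a stand-in for video scaling)."""
    out = []
    last = len(samples) - 1
    for i in range(new_len):
        pos = i * last / (new_len - 1)
        j = min(int(pos), last - 1)
        frac = pos - j
        out.append(samples[j] * (1 - frac) + samples[j + 1] * frac)
    return out

# A smooth stand-in "scanline" at 480 samples.
src = [math.sin(i / 15.0) for i in range(480)]

direct = resample(src, 1080)                  # one rescale: 480 -> 1080
twostep = resample(resample(src, 720), 1080)  # two rescales: 480 -> 720 -> 1080

# The two results are not identical: the intermediate step adds its own error.
max_diff = max(abs(a - b) for a, b in zip(direct, twostep))
```

The difference here is small because the test signal is smooth; on real picture detail, each interpolation pass softens edges a little more, which is exactly why the extra rescaling step adds nothing good.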
4. "Because HDMI is a digital signal, it doesn’t degrade when run over a long distance like an analog signal does, because it’s just ones and zeros." Yikes! Not true at all. To explore this issue calls for a bit more detailed discussion.
First, it’s true that if a digital video signal stays intact from one point to another, there’s no degradation of the image. The digital signal itself can degrade, in purely electrical terms, quite a bit over a distance run, but if at the end of that run the bitstream can be fully and correctly reconstituted, it doesn’t matter what degradation the signal suffered--once that information is reconstituted at the receiving end, it’s as good as new.
That’s a big "if," however. Ideally speaking, digital signals start out as something close to a "square wave," which is an instantaneous transition from one voltage to another; these transitions signal the beginnings and ends of bits. (In practice, such transitions aren’t strictly possible, and trying to achieve them can generate harmful noise; consequently, high-order harmonics are usually filtered out which results in the wave starting out squarish but not-quite-square.) A square wave, unfortunately, is impossible to convey down any transmission line because it has infinite bandwidth; to convey it accurately, a cable would have to convey all frequencies, out to infinity, all at the same level of loss ("attenuation"). What happens, therefore, in any run of cable is that a digital signal starts out looking relatively nice and somewhat square, and comes out the other end both weaker and rounded-off. The transitions that mark the edges of bits get smoothed and leveled to the point that, far from that ideal square wave, they look like relatively gentle slopes. Portions of the signal lost to impedance mismatch bounce around in the cable and mix with these rounded-off slopes, introducing an unpredictable and irregular component to the signal; crosstalk from the other pairs in the HDMI bundle also contribute uneven and essentially random noise. As a result, what arrives at your display doesn’t look very much like what was sent.
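The rounding-off of bit edges described above is just band-limiting, and a few lines of Fourier arithmetic make it visible. An ideal square wave is a sum of odd harmonics; a toy model of a lossy cable (illustrative only, not a model of any real cable) is simply to keep fewer of them, and the measured edge slope drops in proportion:

```python
import math

def square_wave_approx(t, n_harmonics):
    """Fourier partial sum of a 1 Hz square wave, keeping n_harmonics odd
    harmonics -- a toy model of a cable that strips away the high
    frequencies a sharp bit edge needs."""
    return sum(
        (4.0 / math.pi) * math.sin(2 * math.pi * (2 * k + 1) * t) / (2 * k + 1)
        for k in range(n_harmonics)
    )

def edge_slope(n_harmonics, dt=1e-4):
    """Slope of the waveform at the bit transition at t = 0."""
    return (square_wave_approx(dt, n_harmonics)
            - square_wave_approx(-dt, n_harmonics)) / (2 * dt)

# Each retained harmonic contributes a slope of 8 at the transition, so the
# edge sharpness scales directly with how many harmonics survive the cable.
```

With 30 harmonics the transition is ten times steeper than with 3; only in the limit of infinitely many harmonics does the slope become infinite, which is exactly the "infinite bandwidth" requirement mentioned above.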
Now, as we’ve said, up to a point, this won’t matter; the bitstream gets accurately reconstituted, and the picture on your display is as good as the HDMI signal can make it. But when it starts to fail, it starts to fail conspicuously and dramatically. The first sign of an HDMI signal failure is digital dropouts--these are colloquially referred to as "sparklies"--where a pixel or two can’t be read. When these "sparklies" are seen, total failure is not far away; if the cable were made ten feet longer, there’s a chance that so little information would get through that there would be no picture on the display at all.
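This abrupt failure mode--fine, fine, fine, then nothing--is often called the "digital cliff," and a toy simulation shows it clearly. The model below is purely illustrative (hypothetical amplitude and noise figures, not real TMDS electrical parameters): each bit is sent as a positive or negative pulse, noise is added, and the receiver decides by sign. As attenuation eats into the amplitude, the error rate goes from essentially zero to catastrophic over a narrow range:

```python
import random

def bit_error_rate(amplitude, noise_sigma=0.1, n_bits=20000, seed=1):
    """Toy 'digital cliff' model: send each bit as +/- amplitude, add
    Gaussian noise, decide by sign at the receiver. All parameters are
    hypothetical -- this is not a model of real HDMI signalling."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(n_bits):
        sent = rng.choice((-1, 1))
        received = sent * amplitude + rng.gauss(0, noise_sigma)
        if (received > 0) != (sent > 0):
            errors += 1
    return errors / n_bits

# A strong signal (amplitude five times the noise) decodes essentially
# perfectly; cut the amplitude enough and the link collapses rather
# than fading gracefully the way an analog picture does.
```

In this toy model, dropping the amplitude from five times the noise level to half the noise level takes the channel from error-free to roughly one bit in three wrong: "sparklies," then no picture at all.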
The shame is that, with HDMI, this is prone to happen at rather short lengths. When DVI was first introduced (same encoding scheme, same cable structure, but a different connector from HDMI), it was hard to find cables that were reliable in lengths over 15 feet. It didn't help that these multipin cables aren't economical to manufacture in the US, and so were being produced almost exclusively in China; Chinese cable manufacturers are very good at keeping costs down, but not always so good at keeping tolerances tight. Today, a good HDMI cable can be relatively reliable up to about 50 feet, but because different devices tolerate signal degradation differently, it's impossible to say categorically that a 50-foot cable will work; it's only possible to say that it will work with most devices.
Why is that? Well, it all has to do with bad design. The designers of the HDMI standard didn’t really think much about the cable problem, and it shows. This topic is fairly complex in itself, so we’ve split it out into a separate article: What’s the Matter with HDMI Cable?
Analog video signals, contrary to what seems to be the conventional wisdom in home theater circles, are extremely robust over distance. We have run component video for hundreds of feet without observable degradation; the bandwidth of precision video coax, rather than being horribly overtaxed like that of an HDMI cable, is greatly in excess of what’s needed to convey any HD signal. It is true that an analog signal degrades progressively with length; but that degradation, in practice, is so slight and slow that it rarely gives rise to any observable image quality loss in home theater applications.
5. "An HDMI connection is always superior to an analog component video connection." Not so, for the reasons we’ve addressed above. Further, we’ve noticed that it’s not at all uncommon for the HDMI input to a display to be calibrated very differently from the analog inputs. One plasma display we set up looked very bad when fed an HDMI signal--scenes became washed-out and slightly greenish, and the black level was set all wrong so that high-contrast scenes really had no black to them at all, just a sort of muddy-gray color. After some display tweaking, we were able to rehabilitate the HDMI input so that it looked as good as the component video input--but depending on what calibrations are available to you, how your display’s been set up, and ultimately perhaps upon some subjective aesthetic considerations, it’s not necessarily always going to be possible to get your best picture out of an HDMI input. Whether it looks better, or worse, than the component video input in any particular case will depend on the source, the display, the calibration of the source, the calibration of the display, and, ultimately -- since these matters can be somewhat subjective -- your judgment.
One note: HDMI will almost always look better than an S-video or composite (not component!) video input. S-video and composite video are both limited to standard definition (480i in NTSC regions), and do not render color as well as a three-color format like component video or HDMI.