Displays and Projectors: Is Having an HDMI Cable Necessary to view 'True' HD?

After buying a 50" HDTV, our family decided to upgrade the old digital cable box we'd had since 2001 to an HD DVR cable box.

But the picture we're receiving on all of the "HD" channels isn't that great and doesn't look much different from the regular channels we got with the old standard digital cable box.

The HD box came with only the "Blue, Green, Red" cables, along with a "White and Red" audio cable; but it didn't come with an HDMI cable (nor did it say one was required).

I'm not sure exactly what an HDMI cable is, or what better picture quality (if any) it can provide; but I would like to know if it's worth the extra 20+ dollars if it means we'll get a picture that's much better than the one we had before.

I must be doing something wrong or missing something (like the HDMI cable), because if I'm not, HDTV is vastly overrated.

Re: Is Having an HDMI Cable Necessary to view 'True' HD?

It depends on what "True" HD is to you.

High definition comes in three general flavors: 720p (720 lines of resolution, progressive), 1080i (1080 lines of resolution, interlaced), and 1080p (1080 lines, progressive). "interlaced" means that only half of the lines are shown at one instant, with either one's eye or one's TV combining the two "fields" into one frame, while "progressive" means all the lines are drawn in the same instant. 1080p is the highest quality, most data-intensive form of HD, and is often called "Full HD."
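To put a rough sense of scale on those three flavors, here's a quick back-of-the-envelope comparison. It assumes the usual frame sizes (1280x720 for 720p, 1920x1080 for the 1080 formats) and a 60 Hz rate; actual broadcasts vary.

```python
# Rough pixels-per-second comparison of the three HD flavors,
# assuming typical frame sizes and a 60 Hz refresh rate.
formats = {
    "720p":  (1280, 720, 60, False),   # width, height, rate, interlaced?
    "1080i": (1920, 1080, 60, True),   # 60 fields/s = 30 full frames/s
    "1080p": (1920, 1080, 60, False),
}

for name, (w, h, rate, interlaced) in formats.items():
    per_frame = w * h
    # An interlaced signal only delivers half the lines in each field.
    per_second = per_frame * rate // (2 if interlaced else 1)
    print(f"{name}: {per_frame:,} px/frame, {per_second:,} px/s")
```

Note that 1080p at 60 full frames per second carries twice the pixel data of 1080i, which is why it's the most data-intensive form.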

The industry has dictated that 1080p can only be conveyed through HDMI, which conveys a digital signal directly from the device to the TV set (any other method entails converting the digital signal to an analog one). The "Blue, Green, Red" cables you mention are called "component video" cables; they can deliver up to 1080i.

The thing is, no broadcasters, digital satellite, or cable providers deliver a 1080p signal--it's just too much bandwidth. High-def broadcasts are 720p or 1080i. So unless you've got a pretty crappy component cable, you probably won't be better off with HDMI.

There are a couple of explanations that come to mind as to why you might not be happy with your HD picture. One is that your cable box might not be set to output the signal in 720p/1080i, and may instead be delivering only 480i/480p (standard/enhanced definition). I don't have digital cable myself, but I know that DirecTV and Dish Network boxes have this setting; make sure it's configured correctly.

Secondly, digital cable and satellite services only have a certain amount of bandwidth to pipe through all the signals they're offering. This means they compress the signals, and the resultant picture is not as good as over-the-air HD broadcasts. As technology improves and bandwidth increases, the companies are bound to improve their signals, but for now, the HD picture from satellite and cable is far inferior to OTA broadcasts and high-def media players (like Blu-Ray and HD-DVD players).
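To see why that compression is unavoidable, here's an illustrative calculation. The bitrates are assumptions (roughly in line with an ATSC over-the-air channel and a typical per-channel cable allocation), not figures from any particular provider:

```python
# Back-of-the-envelope look at why providers must compress HD,
# using illustrative numbers (24-bit color, assumed channel bitrates).
pixels_per_second = 1920 * 540 * 60      # 1080i: 60 half-height fields/s
raw_bps = pixels_per_second * 24         # 24 bits of color per pixel

ATSC_OTA_BPS = 19_400_000   # one over-the-air broadcast channel (~19.4 Mbps)
CABLE_HD_BPS = 12_000_000   # a cable HD channel might get ~12 Mbps (assumed)

print(f"raw 1080i:   {raw_bps / 1e6:,.0f} Mbps uncompressed")
print(f"OTA ratio:   {raw_bps / ATSC_OTA_BPS:.0f}:1 compression")
print(f"cable ratio: {raw_bps / CABLE_HD_BPS:.0f}:1 compression")
```

An uncompressed 1080i stream is on the order of 1.5 gigabits per second, so even an over-the-air channel squeezes it dozens of times over; a cable channel with a tighter allocation has to squeeze harder still, which is where the picture quality goes.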

Re: Is Having an HDMI Cable Necessary to view 'True' HD?

Thanks for all that help (although most of the "HD" vocabulary that you used went right over my head...)

Last night, I actually figured out that our HD DVR cable box was outputting on two different inputs instead of the standard single one.

The cable box was set on the old "standard" input, and I didn't realize that the "HD" channels are only shown in "HD" if you switch to the "component" input channel.

I went to Wal-Mart today and decided to buy an HDMI cable anyway (but I guess I'll just return it if the picture/sound quality remains the same).

HDTV is overall way too complicated. They should think about displaying the phrase "You are now watching in FULL/TRUE/REAL/etc. High Definition" instead of making people guess all the time.

Re: Is Having an HDMI Cable Necessary to view 'True' HD?

I'm certainly sympathetic to all consumers who feel just a bit lost in the technological swirl of the past few years. The problem of learning about matters you never before needed to understand just to turn on your TV set for an evening's viewing is compounded by the deliberate hype, puffery, and misinformation thrown into the mix by commercial vendors.

As one of the other respondents has already said, the term "HD" or "High-Definition" implies everything from a rather modest improvement in the "tightness" of your TV picture to an eight-fold (or more) increase in sharpness. Once you've seen a 1080p presentation of something you've been accustomed to watching at standard rez on your local cable system, you'll have a tough time dropping back to the old standard.

The worst picture that we've all become accustomed to is found on a perfectly clean, just-out-of-the-box VHS tape of a first-rate movie. Even though the picture has no spots or flecks or variations of color or contrast, its definition of small detail is right up there with French Impressionist water-colors or kindergarten finger painting. Buy yourself a Blu-Ray Disc of that movie, and "stands back away from yer televisionary set." The sharpness and 3-dimensional quality of the image are breathtaking.

That change in how many pinpoints of stored "light" are sent to your television in a given moment of time, such as one second, is what is meant when we say that an image is HD.

However, it is possible to send a terrifically defined image to an electrical device capable of reading "Analog" signals, not digital. That term "Digital" is NOT synonymous with "clarity" or "definition." A digital image can look like it was shot on an automatic finger painter, and conversely, an analog picture can blow you away.

High Def and Digital are separate phenomena, but certain unethical or just dumb salespeople will use the terms interchangeably, as if "Digital Cable" will deliver a crystal clear image. The Satellite service "Direct TV" sends out an all-digital signal of about 400 lines of horizontal resolution on many of its standard definition channels, which looks pretty good since competing signal suppliers like Cable or over-the-air broadcast channels can run as low as 300 lines of rez.

High Definition sources CAN contain lots more detail, often expressed as Lines Of Resolution, which run as high as 1,080. Notice that I still haven't said boo about whether the signal is digital or analog. LINES OF RESOLUTION, also expressed in terms of Pixel Count, such as "1920 x 1080", is the term to watch for when comparing the capabilities of one piece of hardware with another.
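For a concrete sense of what those pixel counts mean, here's a quick comparison using commonly cited frame sizes (720x480 for standard definition, 1920x1080 for "Full HD"):

```python
# Total pixel counts behind "lines of resolution":
# a typical SD frame vs. a "Full HD" frame.
sd_pixels = 720 * 480        # standard-definition frame
hd_pixels = 1920 * 1080      # "Full HD" (1080-line) frame

print(f"SD: {sd_pixels:,} pixels")
print(f"HD: {hd_pixels:,} pixels ({hd_pixels / sd_pixels:.1f}x more)")
```

That's a six-fold jump in raw pixel count over a clean SD frame; against a fuzzier source like VHS, whose effective detail is lower still, the gap is even wider.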

Have fun! And one more thought, for many Hi-Def receivers the proper input socket is called a DVI. It's bigger than an HDMI port, and accepts only the video part of the signal. I hope this all helps just a little.

North Wales, PA USA