David A. Wimsett
Providing Technological Solutions

David's interest in technology extends back to his teen years with science fairs and amateur radio. He is fascinated with hardware, but his real interest is how people use technology. David's goal in providing technological solutions has always been to listen to the end users.


Moving to HD TV

The times they are a-changin', and so is your television.

Since its inception, electronic TV has worked in pretty much the same way. Although colour and stereo sound were added over the years, the resolution of the image and the way it is transmitted have remained unchanged. Now, a revolution is about to occur.

A new kind of TV is coming with an amazing picture and superior sound quality. The change will take place on February 17, 2009 in the United States and in August of 2011 in Canada. At that time, the standard TV signal we have known for over 70 years will be switched off and all stations will change to a new, high definition (HD) digital signal. In fact, some stations have already moved to HD while simulcasting in the older format. On the cutoff dates, those simulcasts will end and all TV will be digital. But don't panic. You will not lose your favourite programs.

The United States and Canada share a standard for TV broadcast that was established by the U.S. National Television System Committee (NTSC). It defines a picture consisting of 525 horizontal lines made up of tiny dots. Of these, 486 are visible. The remaining 39 synchronize the picture and carry other information such as closed captioning (you see them as a band between pictures when the TV rolls). Each picture is called a frame. A new frame is shown every 30th of a second to create the sense of motion.
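The arithmetic behind those NTSC numbers can be sketched in a few lines of Python (the variable names are illustrative, not part of any standard):

```python
# NTSC frame numbers as described above: 525 total lines, 486 visible,
# the remainder carrying sync and data, at a new frame every 1/30 s.
# (In practice the rate is nominally 29.97 fps; 30 is the round figure
# used in the article.)
TOTAL_LINES = 525
VISIBLE_LINES = 486
FRAME_RATE = 30  # frames per second

sync_lines = TOTAL_LINES - VISIBLE_LINES      # lines for sync and captions
frame_duration_ms = 1000 / FRAME_RATE         # how long one frame lasts
lines_per_second = TOTAL_LINES * FRAME_RATE   # total scanning rate

print(sync_lines)                    # 39
print(round(frame_duration_ms, 1))   # 33.3 milliseconds per frame
print(lines_per_second)              # 15750 lines scanned each second
```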

Screen sizes vary for standard TV, but they are always 4 units wide and 3 units high, written as 4:3 and pronounced “four to three”. The 4:3 aspect ratio matches the standard motion picture frame that was popular before wide screen formats became dominant. When you play a wide screen movie on a standard TV you see black bars at the top and bottom. This is called letterboxing.
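The size of those letterbox bars follows directly from the two aspect ratios. Here is a small sketch (the function name and the 640-unit screen are assumptions for illustration):

```python
# Sketch: height of each black letterbox bar when a widescreen movie
# is shown full-width on a 4:3 screen.
def letterbox_bar_height(screen_w, screen_h, movie_aspect):
    """Return the height of each black bar, in the same units as screen_h."""
    picture_height = screen_w / movie_aspect  # movie scaled to full screen width
    return (screen_h - picture_height) / 2    # leftover split top and bottom

# A 4:3 screen 640 units wide is 480 units high; a 1.85:1 widescreen
# movie scaled to that width leaves bars of about 67 units each.
print(round(letterbox_bar_height(640, 480, 1.85), 1))  # 67.0

# A 4:3 movie on a 4:3 screen needs no bars at all.
print(letterbox_bar_height(640, 480, 4 / 3))  # 0.0
```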

Standard NTSC TV is transmitted over radio waves as an analog signal. Analog signals use either the height of the wave (called amplitude modulation or AM) or the distance between waves (called frequency modulation or FM) to represent information such as sound or luminosity. The TV picture is broadcast in AM: the amplitude represents the luminosity of each dot in a line, so the greater the amplitude, the brighter the dot.
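A toy amplitude-modulation sketch makes the idea concrete. This is an illustration of the principle only, not a broadcast-accurate model; the carrier frequency and function name are assumptions:

```python
# Toy AM sketch: the carrier wave's amplitude is scaled by the
# luminosity of each dot, so brighter dots produce a stronger wave.
import math

def am_sample(t, luminosity, carrier_hz=1000.0):
    """One sample of an AM wave; luminosity in [0, 1] sets the amplitude."""
    return luminosity * math.sin(2 * math.pi * carrier_hz * t)

# A black dot (luminosity 0.0) contributes no signal at all.
print(am_sample(0.00025, 0.0))               # 0.0
# A white dot (luminosity 1.0) swings the wave to full amplitude.
print(round(am_sample(0.00025, 1.0), 3))     # 1.0 at the wave's peak
```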

Radio waves vibrate at a specific frequency, measured in hertz (Hz). To carry a television signal, a range of adjacent frequencies is required. This is called bandwidth. All the frequencies used for TV are a small part of the electromagnetic spectrum, which runs from radio waves through microwaves, infrared light, visible light, ultraviolet light, x-rays and gamma rays. These frequencies are finite; you cannot add new ones.

The TV frame appears on the front face of a picture tube whose inside surface is covered with tens of thousands of tiny dots, each with a phosphor coating that is red, green or blue. These are the primary colours for projected light. When combined, they can display the full spectrum of the rainbow in what is referred to as the RGB colour palette. The phosphors are grouped close together into units of 3 called triads, each containing one red, one green and one blue phosphor.

As the signal is received, circuitry shoots streams of electrons from the rear of the picture tube. When they strike a phosphor, it glows. The intensity of the glow varies with the amplitude of the wave. Different colours are created by adjusting the relative luminance of the red, green and blue phosphors in each triad.
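As a toy illustration of that additive mixing, here is a sketch of one triad's output. The 0-255 scale and the function name are assumptions for illustration, not part of the NTSC standard:

```python
# Sketch: combining the three phosphor intensities of one triad into
# a colour, using the additive RGB model described above.
def triad_colour(red, green, blue):
    """Return an (R, G, B) tuple with each channel clamped to 0-255."""
    clamp = lambda v: max(0, min(255, v))
    return (clamp(red), clamp(green), clamp(blue))

print(triad_colour(255, 255, 255))  # all phosphors at full glow -> white
print(triad_colour(255, 255, 0))    # red + green together -> yellow
print(triad_colour(0, 0, 0))        # no glow at all -> black
```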

Every dot of phosphor is represented in the wave, even those that are black. It takes two passes of the electron stream to show one full frame, each pass lasting one 60th of a second. The first pass illuminates the odd-numbered lines and the second pass the even-numbered lines in a process called interlacing. The result is a complete frame that is formed in one 30th of a second. Interlacing was introduced to decrease screen flicker without having to use extra bandwidth, thus saving frequencies and the money that would be required for more sophisticated electronics. The pass rate was originally set at one 60th of a second to eliminate interference with North American AC power, which cycles at 60 Hz. Because the two scans occur in a fraction of a second, our brains combine them into a single image.
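The two passes described above can be sketched in a few lines (a minimal illustration; the function name is an assumption):

```python
# Minimal sketch of interlacing: the first 1/60 s pass draws the
# odd-numbered lines, the second pass the even-numbered lines, and
# together they form one complete frame in 1/30 s.
def interlaced_passes(num_lines):
    """Split line numbers 1..num_lines into the two interlaced passes."""
    odd_pass = [n for n in range(1, num_lines + 1) if n % 2 == 1]
    even_pass = [n for n in range(1, num_lines + 1) if n % 2 == 0]
    return odd_pass, even_pass

odd, even = interlaced_passes(10)  # a tiny 10-line "screen" for clarity
print(odd)    # [1, 3, 5, 7, 9]   drawn in the first 1/60 s
print(even)   # [2, 4, 6, 8, 10]  drawn in the second 1/60 s
# Merged, the two passes cover every line: one full frame in 1/30 s.
print(sorted(odd + even) == list(range(1, 11)))  # True
```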

One shortcoming of interlacing is that the two images are recorded at slightly different times, leading to video artefacts such as blurring when fast-moving objects are shown. A second artefact is the moiré pattern, which appears as striped lines where detail cannot be resolved between the interlaced scans.

Another way to build a TV picture is progressive scanning, in which the entire frame is captured and displayed in a single pass. However, progressive scan broadcasts require more bandwidth and, hence, the equipment is more expensive.

Next: Part II – Moving to HD TV