European colour systems

In the United States, colour broadcasting using the NTSC system began in 1954, and the same system was subsequently adopted by Canada, Mexico, Japan, and several other countries. In 1967 the Federal Republic of Germany and the United Kingdom began colour broadcasting using the PAL system, while in the same year France and the Soviet Union also introduced colour, adopting the SECAM system.

PAL and SECAM embody the same principles as the NTSC system, including matters affecting compatibility and the use of a separate signal to carry the colour information at low detail superimposed on the high-detail luminance signal. The European systems were developed, in fact, to improve on the performance of the American system in only one area, the constancy of the hue of the reproduced images.

It has been pointed out that the hue information in the American system is carried by changes in the phase angle of the chrominance signal and that these phase changes are recovered in the receiver by synchronous detection. Transmission of the phase information, particularly in the early stages of colour broadcasting in the United States, was subject to incidental errors arising in broadcasting stations and network connections. Errors were also caused by reflections of the broadcast signals by buildings and other structures in the vicinity of the receiving antenna. In subsequent years, transmission and reception of hue information became substantially more accurate in the United States through care in broadcasting and networking, as well as by automatic hue-control circuits in receivers. Since the late 1970s a special colour reference signal has been transmitted on line 19 of both scanning fields, and circuitry in the receiver locks onto the reference information to eliminate colour distortions. This vertical interval reference (VIR) signal includes reference information for chrominance, luminance, and black.

PAL and SECAM are inherently less affected by phase errors. In both systems the nominal frequency of the chrominance subcarrier is 4.433618 megahertz, a frequency that is derived from and hence accurately synchronized with the frame-scanning and line-scanning rates. This chrominance signal is accommodated within the 6-megahertz range of the fully transmitted sideband. By virtue of its synchronism with the line- and frame-scanning rates, its frequency components are interleaved with those of the luminance signal, so that the chrominance information does not affect reception of colour broadcasts by black-and-white receivers.
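
As a rough check, the sketch below reproduces that subcarrier value from the 625-line, 25-frame scanning parameters, using the standard PAL relation in which the subcarrier lies at (1135/4 + 1/625) times the line frequency.

```python
# Sketch: reproducing the PAL chrominance subcarrier frequency from the
# 625-line, 25-frame scanning parameters (standard PAL values assumed).

lines_per_frame = 625
frames_per_second = 25
line_frequency = lines_per_frame * frames_per_second      # 15,625 Hz

# Standard PAL relation: the subcarrier sits at (1135/4 + 1/625) times the
# line frequency, which interleaves its spectrum with the luminance signal.
subcarrier_hz = (1135 / 4 + 1 / 625) * line_frequency

print(f"Line frequency:       {line_frequency} Hz")
print(f"Subcarrier frequency: {subcarrier_hz:,.2f} Hz")    # ≈ 4,433,618.75 Hz
```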

PAL

PAL (phase alternation line) resembles NTSC in that the chrominance signal is simultaneously modulated in amplitude to carry the saturation (pastel-versus-vivid) aspect of the colours and modulated in phase to carry the hue aspect. In the PAL system, however, the phase information is reversed during the scanning of successive lines. In this way, if a phase error is present during the scanning of one line, a compensating error (of equal amount but in the opposite direction) will be introduced during the next line, and the average phase information (presented by the two successive lines taken together) will be free of error.
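
The cancellation can be illustrated with a small numerical sketch. Here the hue is represented simply as a phase angle in degrees, the colour value and transmission-path error are hypothetical, and the receiver is assumed to average the decoded hue of two successive lines.

```python
# Simplified numeric sketch of PAL phase-error cancellation (hypothetical values).
# The hue of a colour is represented by a phase angle in degrees; a constant
# transmission error rotates every line by the same amount.

true_hue_deg = 103.0          # hue angle of the colour being transmitted
path_error_deg = 12.0         # phase error picked up in transmission

# Line A is sent with normal phase, line B with the hue component reversed.
received_a = true_hue_deg + path_error_deg           # decoded directly
received_b = -true_hue_deg + path_error_deg          # reversed at the transmitter
decoded_b = -received_b                               # re-reversed in the receiver

average_hue = (received_a + decoded_b) / 2
print(f"Line A decodes to {received_a:.1f} deg, line B to {decoded_b:.1f} deg")
print(f"Two-line average: {average_hue:.1f} deg (true hue {true_hue_deg:.1f} deg)")
```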

Two lines are thus required to depict the corrected hue information, and the vertical detail of the hue information is correspondingly lessened. This produces no serious degradation of the picture when the phase errors are not too great, because, as is noted above, the eye does not require fine detail in the hues of colour reproduction and the mind of the observer averages out the two compensating errors. If the phase errors are more than about 20°, however, visible degradation does occur. This effect can be corrected by introducing into the receiver (as in the SECAM system) a delay line and electronic switch.

SECAM

In SECAM (système électronique couleur avec mémoire) the luminance information is transmitted in the usual manner, and the chrominance signal is interleaved with it. But the chrominance signal carries only one kind of colour information at a time: the two types of information required to encompass the colour values (hue and saturation) do not occur concurrently, so the errors associated with simultaneous amplitude and phase modulation do not arise. Rather, in the SECAM system (SECAM III), alternate line scans carry information on luminance and red, while the intervening line scans contain luminance and blue. The green information is derived within the receiver by subtracting the red and blue information from the luminance signal. Since individual line scans carry only half the colour information, two successive line scans are required to obtain the complete colour information, and this halves the colour detail measured in the vertical dimension. But, as noted above, the eye is not sensitive to the hue and saturation of small details, so no adverse effect is introduced.
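
As a simplified illustration of that subtraction, the sketch below recovers a green signal from luminance, red, and blue sample values using the standard luminance weighting Y = 0.299R + 0.587G + 0.114B. The sample values are hypothetical, and an actual receiver works with colour-difference signals rather than the primaries directly.

```python
# Simplified sketch of recovering the green signal from luminance, red, and blue.
# The weights are the standard luminance coefficients used by these systems;
# the sample values are hypothetical, normalized to the range 0-1.

def green_from_luminance(y, r, b):
    """Solve Y = 0.299 R + 0.587 G + 0.114 B for G."""
    return (y - 0.299 * r - 0.114 * b) / 0.587

r, g, b = 0.40, 0.70, 0.20                        # original camera signals
y = 0.299 * r + 0.587 * g + 0.114 * b             # luminance formed at the studio

recovered_g = green_from_luminance(y, r, b)       # receiver-side derivation
print(f"Recovered green: {recovered_g:.3f} (original {g:.3f})")
```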

To subtract the red and blue information from the luminance information and obtain the green information, the red and blue signals must be available in the receiver simultaneously, whereas in SECAM they are transmitted in time sequence. The requirement for simultaneity is met by holding the signal content of each line scan in storage (or “memorizing” it—hence the name of the system, French for “electronic colour system with memory”). The storage device is known as a delay line; it holds the information of each line scan for 64 microseconds, the time required to complete the next line scan. To match successive pairs of lines, an electronic switch is also needed. When the use of delay lines was first proposed, such lines were expensive devices. Subsequent advances reduced the cost, and the fact that receivers must incorporate these components is no longer viewed as decisive.
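
A much simplified sketch of that one-line memory is given below: each incoming line carries only one colour-difference value, and the previous line is held so that red and blue information can be presented to the decoder together. The line contents are hypothetical and stand in for the 64-microsecond analog delay.

```python
# Simplified sketch of the SECAM one-line memory ("delay line"): each scan line
# carries only one colour-difference signal, and the previous line is held for
# one line period so that red and blue information are available together.
# Line contents here are hypothetical.

incoming_lines = [("red", 0.31), ("blue", 0.12), ("red", 0.33), ("blue", 0.10)]

delayed = None                       # the one-line store
for kind, value in incoming_lines:
    if delayed is not None:
        # The electronic switch routes the stored and direct signals so that
        # one red and one blue value reach the decoder at the same time.
        pair = {kind: value, delayed[0]: delayed[1]}
        print(f"Decoder sees red={pair['red']:.2f}, blue={pair['blue']:.2f}")
    delayed = (kind, value)          # the current line replaces the stored one
```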

Since the SECAM system reproduces the colour information with a minimum of error, it has been argued that SECAM receivers do not have to have manual controls for hue and saturation. Such adjustments, however, are usually provided in order to permit the viewer to adjust the picture to individual taste and to correct for signals that have broadcast errors, due to such factors as faulty use of cameras, lighting, and networking.

Digital television

Governments of the European Union, Japan, and the United States are officially committed to replacing conventional television broadcasting with digital television in the first few years of the 21st century. Portions of the radio-frequency spectrum have been set aside for television stations to begin broadcasting programs digitally, in parallel with their conventional broadcasts. At some point, when it appears that the market will accept the change, plans call for broadcasters to relinquish their old conventional television channels and to broadcast solely in the new digital channels. As is the case with compatible colour television, the digital world is divided between competing standards: the Advanced Television Systems Committee (ATSC) system, approved in 1996 by the FCC as the standard for digital television in the United States; and Digital Video Broadcasting (DVB), the system adopted by a European consortium in 1993.

The process of converting a conventional analog television signal to a digital format involves the steps of sampling, quantization, and binary encoding. These steps, described in the article telecommunication, result in a digital signal that requires many times the bandwidth of the original waveform. For example, the NTSC colour signal is based on 483 lines of 720 picture elements (pixels) each. With eight bits used to encode the luminance information and another eight bits for the chrominance information, an overall transmission rate of 162 million bits per second would be needed for the digitized television signal. This would require a bandwidth of about 80 megahertz, far more capacity than the six megahertz allocated for a channel in the NTSC system.
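
A back-of-the-envelope check of those figures is sketched below, assuming 720 × 483 active pixels, 8 bits each of luminance and chrominance per pixel, and roughly 30 frames per second; the exact result depends on the frame rate and sampling assumptions, but it lands in the same neighbourhood as the numbers quoted above.

```python
# Back-of-the-envelope check of the uncompressed bit rate quoted above.
# Assumptions: 720 x 483 active pixels, 8 bits of luminance plus 8 bits of
# chrominance per pixel, and roughly 30 frames per second.

pixels_per_frame = 720 * 483
bits_per_pixel = 8 + 8                  # luminance + chrominance
frames_per_second = 30                  # NTSC is nominally 29.97

bit_rate = pixels_per_frame * bits_per_pixel * frames_per_second
print(f"Uncompressed rate: {bit_rate / 1e6:.0f} Mbit/s")   # on the order of 160-170

# At roughly 2 bits per hertz of channel capacity, this implies a bandwidth
# of about 80 MHz -- far more than a 6 MHz broadcast channel.
print(f"Implied bandwidth: {bit_rate / 2 / 1e6:.0f} MHz")
```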

To fit digital broadcasts into the existing six- and eight-megahertz channels employed in analog television, both the ATSC and DVB systems “compress” bit rates by eliminating redundant picture information from the signal. Both systems employ MPEG-2, an international standard first proposed in 1994 by the Moving Picture Experts Group for the compression of digital video signals for broadcast and for recording on digital video disc. The MPEG-2 standard utilizes techniques for both intra-picture and inter-picture compression. Intra-picture compression is based on the elimination of spatial detail and redundancy within a picture; inter-picture compression is based on the prediction of changes from one picture to another, so that only the changes are transmitted. This kind of redundancy reduction compresses the digital television signal to about 4 million bits per second, easily enough to allow multiple standard-definition programs to be broadcast simultaneously in a single channel. (Indeed, MPEG compression is employed in direct broadcast satellite television to transmit almost 200 programs simultaneously. The same technique can be used in cable systems to send as many as 500 programs to subscribers.)
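
The idea behind inter-picture compression can be suggested with a deliberately crude sketch: two successive hypothetical frames are compared, and only the picture elements that differ are "transmitted." Real MPEG-2 coding uses motion-compensated prediction and transform coding rather than this simple pixel differencing.

```python
# Highly simplified illustration of inter-picture compression: only the picture
# elements that change between two successive frames are sent. The frames here
# are hypothetical 4 x 4 arrays of brightness values.

frame_1 = [
    [10, 10, 10, 10],
    [10, 80, 80, 10],
    [10, 80, 80, 10],
    [10, 10, 10, 10],
]
# The bright object moves one pixel to the right; the background is unchanged.
frame_2 = [
    [10, 10, 10, 10],
    [10, 10, 80, 80],
    [10, 10, 80, 80],
    [10, 10, 10, 10],
]

changes = [
    (row, col, frame_2[row][col])
    for row in range(4)
    for col in range(4)
    if frame_2[row][col] != frame_1[row][col]
]

print(f"Pixels that must be re-sent: {len(changes)} of 16")
print("Changed samples:", changes)
```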

Compression, however, involves a compromise with quality. Certain artifacts can occur that may be noticeable and bothersome to some viewers, such as blurring of movement in large areas, harsh edge boundaries, and an overall reduction of resolution.