videoconferencing

Also spelled: video conferencing

videoconferencing, the transmission of pictures and imagery (via video) and sounds (via audio) between two or more physically separate locations. Once the sole province of the corporate boardroom, videoconferencing is used today in telemedicine, distance education, theatrical productions, political trials, and other circumstances in which the ability to “be here now” is desired. Travel fears immediately following the September 11, 2001, terrorist attacks and the global coronavirus pandemic in early 2020, which shut down businesses and schools around the world, combined with technological advances to make videoconferencing a mainstay of contemporary life.

The rise and fall of the Picturephone

The first conceptualization of image transfer emerged along with the development of wire-delivered audio in the 1870s, but the first formal attempts at videoconferencing began in the United States in the 1920s. In 1927 Bell Labs connected Secretary of Commerce Herbert Hoover and other officials in Washington, D.C., with AT&T president Walter Gifford in New York City; the two-way audio connection was accompanied by a one-way video connection from Washington, D.C., to New York. Experimentation continued during the 1930s in Europe, where television technologies were more mature. By 1964 AT&T was ready to introduce its first public videoconferencing tool—a videophone called the Picturephone—at the World’s Fair. After spending $500 million on development and predicting one million users by 1980, AT&T was stunned when the Picturephone proved a financial failure, garnering only about 500 subscribers.

Some researchers have suggested that the Picturephone failed because people at the time simply disliked face-to-face communication via telephone. Others have maintained that it was wrongly designed for one-to-one communication, whereas “multipoint” communication (conference calls) is how most businesses conduct remote conferencing. Industry experts noted that much of the 1970s and 1980s was spent developing multipoint technologies, with the British carrier Post Office Telecommunications (which became British Telecom in 1981) persuading telephone companies around Europe to conduct country-to-country trials of equipment. These trials paved the way for the videoconferencing interoperability standards that exist today and that allow multipoint videoconferencing such as Stanford University’s “virtual auditorium” project.


Modern developments

In its early days videoconferencing seemed to many to be more a high-tech toy than a necessary communications tool: videoconferences could be conducted only from specially equipped rooms costing from $250,000 to $1,000,000 or more. Only after the arrival of low-cost solid-state memory in the late 1980s did “set-top” systems capable of converting television units into videoconferencing tools become available, at a more manageable price of about $10,000. Set-tops, often configured to stand on rolling carts, broadened videoconferencing’s uses beyond the traditional office. New Zealand director Peter Jackson used the technology to coordinate multiple film crews shooting The Lord of the Rings films (the first of which was released in 2001) in remote New Zealand locations, and U.S. President George W. Bush famously videoconferenced with his advisers in Washington, D.C., on days when he was at his Texas ranch.

Whether room-based or on a personal computer or mobile device, the mechanics of videoconferencing are the same. First, analog signals from video cameras and microphones are translated into digital information inside the device. This is done by way of a codec (a truncated combination of the words “compression” and “decompression”), a compression and decompression standard that is often called the heart of a videoconferencing system. Codecs, which can be either hardware or software, compress outgoing information so that it can be sent quickly and reliably over telephone or network connections; at the receiving end, the codec decompresses incoming information for viewing. Finally, if multipoint videoconferencing is desired, a multipoint control unit (MCU), sometimes called a “bridge,” must also be employed. Another approach, called “multicasting,” operates differently: instead of routing traffic through a videoconferencing server, it links users to one another directly (such as from personal computer to personal computer).
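To make the codec’s compress-then-decompress round trip concrete, the following is a minimal sketch in Python; the run-length scheme and function names are illustrative stand-ins for this example (a real system would use a standard such as H.264), not part of any actual videoconferencing product.

```python
# Toy illustration of the codec round trip described above.
# A real codec such as H.264 is far more sophisticated; run-length
# encoding merely stands in for the compression step.

def compress(samples: bytes) -> list[tuple[int, int]]:
    """Encode a digitized stream as (value, run length) pairs before sending."""
    runs: list[list[int]] = []
    for b in samples:
        if runs and runs[-1][0] == b:
            runs[-1][1] += 1          # extend the current run
        else:
            runs.append([b, 1])       # start a new run
    return [tuple(r) for r in runs]

def decompress(runs: list[tuple[int, int]]) -> bytes:
    """Reverse the encoding at the receiving end for viewing."""
    return bytes(value for value, count in runs for _ in range(count))

frame = bytes([0, 0, 0, 255, 255, 0])     # stand-in for digitized camera data
assert decompress(compress(frame)) == frame
```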

With high-bandwidth connections, videoconferences can be viewed at so-called “television quality” levels—that is, at 30 frames per second. Even so, videoconferencing still seems to many a technology that promises more than it delivers. Because it is often more important to hear what is being said in a meeting than to see it clearly, a good rule of thumb is that a videoconferencing system should sacrifice video quality before audio clarity. Other problems persist, such as the awkwardness participants experience coping with half-duplex sound (the “walkie-talkie” effect that results when the system carries sound from only one party at a time) and negotiating turn-taking when full-duplex systems (in which sound from all parties travels simultaneously) are used.
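Some back-of-the-envelope arithmetic (an illustration, not a figure from the article) shows why compression is indispensable at “television quality” frame rates: uncompressed video vastly exceeds what ordinary connections can carry. The frame size and color depth below are assumptions for the sake of the calculation.

```python
# Rough bit-rate arithmetic for uncompressed video at 30 frames per second.
width, height = 640, 480        # assumed frame dimensions in pixels
bytes_per_pixel = 3             # assumed 24-bit color
fps = 30                        # "television quality" frame rate

raw_bps = width * height * bytes_per_pixel * 8 * fps
print(f"uncompressed: {raw_bps / 1e6:.0f} Mbit/s")   # about 221 Mbit/s
```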

IP-based and desktop videoconferencing

When discussing high-quality videoconferencing, it is important to understand that, almost without exception, transmissions occur via high-bandwidth wired and wireless connections or via satellite. IP-based videoconferencing (which transfers data by means of internet protocols [IP]) stemmed from experiments like those of Canadian computer scientist William Buxton and American scientist Jaron Lanier, which used the power of Internet2, an exclusive, very high-bandwidth network created by more than 220 universities. Videoconferencing over high-bandwidth IP is also offered by telecommunications companies such as Sprint.
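As a rough sketch of what transferring data by means of internet protocols looks like at the socket level, the snippet below ships one compressed frame as a UDP datagram. The host, port, and function names are assumptions of this example, and production systems typically layer RTP over UDP rather than sending raw datagrams.

```python
# Minimal sketch of IP-based frame transport over UDP.
import socket

def send_frame(payload: bytes, host: str = "127.0.0.1", port: int = 5004) -> None:
    """Ship one compressed video frame as a single UDP datagram."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))

def receive_frame(port: int = 5004, bufsize: int = 65507) -> bytes:
    """Block until one datagram (one frame) arrives, then return its bytes."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", port))
        data, _addr = sock.recvfrom(bufsize)
        return data
```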


Experiments with videoconferencing over the Internet were conducted as early as the 1970s, but most were abandoned because of bandwidth limitations. The practice of desktop videoconferencing over the Internet began in earnest in 1994, when Intel introduced ProShare for Windows PCs; a desktop setup required only a webcam and microphone, a sound card, a reasonably high-speed Internet connection, and videoconferencing software such as Microsoft NetMeeting or iSpQ. Poor image resolution and poor sound quality were the mainstay until the early 21st century, when technological advances allowed for additional bandwidth.
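A modern analogue of that webcam-and-software setup is a simple capture loop. The sketch below uses the OpenCV library, which is an assumption of this example and not software named in the article.

```python
# Grab a single frame from the default webcam (device 0) with OpenCV.
import cv2

cap = cv2.VideoCapture(0)           # open the default capture device
try:
    ok, frame = cap.read()          # one frame as a NumPy array (BGR)
    if ok:
        print(f"captured a {frame.shape[1]}x{frame.shape[0]} frame")
    else:
        print("no frame captured; is a webcam connected?")
finally:
    cap.release()                   # free the device
```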

Since the early 2000s, however, videoconferencing technology for consumers and businesses has evolved. Low-resolution products distributed solely over the Internet gave way to high-quality services built into consumer- and corporate-driven wired and wireless networks. For the mobile-device-buying public and many businesses, several “video chat” applications were developed during this period, among the best known being Skype (which made its debut in 2003) and FaceTime (which was released by Apple Inc. in 2010). Other videoconferencing systems, such as the high-definition system developed by Lifesize Communications in 2005, became available for businesses as well. In early 2020 the global coronavirus pandemic forced the widespread closure of schools and businesses, and students and employees around the world had to work from home. Videoconferencing software such as Zoom benefited greatly from this development; Zoom became one of the most popular services of its kind, one of the most downloaded applications worldwide, and a household name.

Videoconferencing is increasingly used worldwide in a variety of ways, altering what it means to “be here now” on a daily basis. In the early 2000s the U.S. Securities and Exchange Commission debated what constitutes a binding agreement arrived at during a videoconference, while the United Nations used videoconferencing to allow remote witnesses to testify safely before the International Criminal Tribunal for Rwanda (ICTR), which tried crimes from the Rwandan genocide. These examples illustrate that, even more than its financial future, videoconferencing’s social presence invites deeper analysis and discussion.

Theresa M. Senft