whole genome sequencing

whole genome sequencing, the act of deducing the complete nucleic acid sequence of the genetic code, or genome, of an organism or organelle (specifically, the mitochondrion or chloroplast). The first whole genome sequencing efforts, carried out in 1976 and 1977, focused respectively on the bacteriophages (bacteria-infecting viruses) MS2 and ΦX174, which have relatively small genomes. Since then, numerous innovations in the field of DNA sequencing have expanded the capabilities of the technology. Those innovations, combined with increasing cost-effectiveness in the early 21st century, enabled the routine use of whole genome sequencing in laboratories worldwide, effectively ushering in a new era of biological discovery. The power of the approach has been realized in the study of human populations and human diseases such as cancer, as well as in the elucidation of the whole genome sequences of crop plants, livestock, and other species of scientific or agricultural significance. It is thus generally acknowledged that there is great value in a detailed understanding of the nucleic acid sequence, especially in the variations in the sequence that correlate with predisposition to health or disease states or with other properties of societal or economic significance in microbial, animal, and plant populations.

Sequencing methods: from genes to genomes

In 1944 Canadian-born American bacteriologist Oswald Avery and colleagues recognized that the hereditary material passed from parent to offspring was DNA. Subsequent genetic analyses carried out by other scientists on viruses, bacteria, yeast, fruit flies, and nematodes demonstrated that the intentional induction of mutations that disrupted the genetic code, combined with the analysis of the observable traits (phenotypes) produced by such mutations, was an important approach to the study of gene function. Such studies, however, could query only a fraction of the genes in a genome.

The first sequencing methods (the Maxam-Gilbert and Sanger methods), developed in the 1970s, were deployed to reveal the nucleic acid composition of individual genes and the relatively small genomes of certain viruses. The sequencing of larger genomes remained out of reach conceptually, because of high costs and the effort required, until the launch of the Human Genome Project (HGP) in 1990 in the United States. Although the project was not universally embraced, some recognized that technology had evolved to the point where whole genome sequencing of larger genomes could be considered realistically. Particularly important was the development of automated sequencing machines that employed fluorescence instead of radioactive decay for the detection of the sequencing reaction products. Automation offered new possibilities for scaling up the production of DNA sequencing to tackle large genomes.

An early aim of the HGP was to obtain the whole genome sequences of important experimental model organisms, such as the yeast Saccharomyces cerevisiae, the fruit fly Drosophila melanogaster, and the nematode Caenorhabditis elegans. In sequencing those smaller and therefore more-tractable genomes, three outcomes were anticipated. First, the sequences would be of value to the research community, serving to accelerate efforts to understand gene function by using model systems. Second, the experience gained would inform approaches to sequencing the human genome and other similarly sized genomes. Third, functional relationships between sequences of different organisms would be revealed as a consequence of cross-species sequence similarity. Ultimately, with the involvement of more than one thousand scientists worldwide, two human genome sequences were published in 2001. With this development came established methods and analytic standards that were used to sequence other large genomes.

A major challenge for de novo sequencing, in which sequences are assembled for the very first time (such as with the HGP), is the production of individual DNA reads that are of sufficient length and quality to span common repetitive elements, which are a general property of complex genome sequences and a source of ambiguity for sequence assembly. In many of the early de novo whole genome sequencing projects, emphasis was placed on the production of so-called reference sequences, which were of enduring high quality and would serve as the foundation for future experimentation.
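The ambiguity that repeats introduce can be seen in a deliberately simplified sketch. The short Python program below (with made-up sequences and unrealistically short reads, purely for illustration) shows that two genomes containing different numbers of copies of the same repeat can yield identical sets of short reads, leaving an assembler no basis for choosing between them:

# Two genomes of different lengths that produce the same set of short
# reads: when reads are shorter than the repeat, assembly cannot tell
# how many copies of the repeat the genome contains.
# Sequences are hypothetical and chosen only for illustration.

def read_set(genome, length=3):
    """The set of all overlapping substrings ('reads') of a given length."""
    return {genome[i:i + length] for i in range(len(genome) - length + 1)}

two_copies = "ATGCATGCA"        # repeat unit "ATGC" appears twice
three_copies = "ATGCATGCATGCA"  # the same unit appears three times
print(read_set(two_copies) == read_set(three_copies))  # -> True

Reads long enough to span the repeat, by contrast, would differ between the two genomes, which is why read length and quality were so important for de novo projects.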

An important approach used by many projects that sequenced large genomes involved hierarchical shotgun sequencing, in which segments of genomic DNA were cloned (copied) and arranged into ordered arrays. Those ordered arrays were known as physical maps, and they served to break large genomes into thousands of short DNA fragments. Those short fragments were then aligned, such that identical sequences overlapped, thereby enabling the fragments to be linked together to yield the full-length genomic sequence. The fragments were relatively easy to manipulate in the laboratory, could be apportioned among collaborating laboratories, and were amenable to the detailed error-correction exercises important in generating the high-quality reference sequences sought by HGP scientists. Some genome projects were conducted without the use of such maps, using instead an approach called whole genome shotgun sequencing. This approach avoided the time and expense needed to create physical maps and provided more-rapid access to the DNA sequence.
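The logic of a physical map can be conveyed in miniature. In the hypothetical Python sketch below, each cloned fragment is represented simply by the set of sequence "markers" it contains, and clones are chained together so that neighbors share the most markers; real mapping efforts used restriction-fragment fingerprints or sequence-tagged sites and had to handle error and ambiguity that this sketch ignores:

# A toy illustration of physical-map construction: clones (large copied
# genome fragments) are ordered so that overlapping clones, identified
# by shared markers, end up adjacent. Clone and marker names are
# hypothetical.

clones = {
    "cloneA": {"m1", "m2", "m3"},
    "cloneB": {"m3", "m4", "m5"},
    "cloneC": {"m5", "m6"},
}

def order_clones(clones):
    """Greedily chain clones so that neighbors share the most markers."""
    names = list(clones)
    order = [names.pop(0)]
    while names:
        last = clones[order[-1]]
        nxt = max(names, key=lambda n: len(clones[n] & last))
        names.remove(nxt)
        order.append(nxt)
    return order

print(order_clones(clones))  # -> ['cloneA', 'cloneB', 'cloneC']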

Whether using physical maps or the whole genome shotgun sequencing approach, the sequencing exercise involved randomly fragmenting either cloned (copied) or native genomic DNA into very short segments that could then be inserted into bacterial cells as plasmids for amplification, producing many copies of the segments, prior to nucleic acid purification and sequence analysis. In a process known as assembly, computer programs were then used to stitch the sequences back together to reconstruct the original DNA sequencing target. Assembly of whole genome shotgun sequencing data was difficult and required sophisticated computer programs and powerful supercomputers, and, even in the years following the completion of the HGP, whole genome shotgun sequence assembly remained a significant challenge for whole genome sequencing projects.
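The essence of assembly can likewise be illustrated with a toy example. The following Python sketch greedily merges error-free reads at their longest suffix-prefix overlaps; actual assemblers must contend with sequencing errors, repeats, and enormous data volumes, none of which this illustration addresses:

# A minimal greedy overlap-based assembler, for illustration only.
# Assumes short, error-free reads with unambiguous overlaps.

def overlap(a, b, min_len=3):
    """Length of the longest suffix of a that matches a prefix of b."""
    for n in range(min(len(a), len(b)), min_len - 1, -1):
        if a.endswith(b[:n]):
            return n
    return 0

def greedy_assemble(reads, min_len=3):
    """Repeatedly merge the pair of reads with the largest overlap."""
    reads = list(reads)
    while len(reads) > 1:
        best_n, best_i, best_j = 0, None, None
        for i, a in enumerate(reads):
            for j, b in enumerate(reads):
                if i != j:
                    n = overlap(a, b, min_len)
                    if n > best_n:
                        best_n, best_i, best_j = n, i, j
        if best_n == 0:      # no overlaps left; stop merging
            break
        merged = reads[best_i] + reads[best_j][best_n:]
        reads = [r for k, r in enumerate(reads) if k not in (best_i, best_j)]
        reads.append(merged)
    return reads

# Error-free fragments of the target sequence "ATGGCGTGCAATG"
fragments = ["ATGGCGT", "GCGTGCAA", "GCAATG"]
print(greedy_assemble(fragments))  # -> ['ATGGCGTGCAATG']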

Next-generation technologies

Although the first whole genome sequences were in themselves technological and scientific feats of significance, the scientific opportunities and the host of technologies those projects spawned have had even greater impacts. Among the most significant technological developments have been next-generation DNA sequencing technologies for human genome analysis. Certain of those technologies originally were designed to re-sequence genomes (as opposed to performing de novo sequencing). In re-sequencing, short sequences are produced and aligned computationally to existing reference genome sequences, which were generated, at least initially, using the older de novo sequencing methods. Next-generation sequencing approaches are characterized generally by the massively parallel production of short sequences, in which multiple DNA fragments are sequenced simultaneously and in sufficient quantity to redundantly represent every base in the target genome. Although such technologies propelled whole genome sequencing into the mainstream of biology, innovation persisted as companies and academic laboratories strove to reach the “$1,000 genome,” the sequencing of an individual human genome for less than $1,000 (U.S.), a milestone anticipated in 2012.
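Re-sequencing can also be illustrated schematically. The hypothetical Python sketch below indexes a reference sequence by short "seed" substrings (k-mers) and places each read by exact-match lookup; production aligners for next-generation data additionally tolerate mismatches and insertions or deletions and are engineered for billions of reads:

# A toy re-sequencing aligner, for illustration only: index a reference
# sequence by k-mers, then place short reads by exact seed lookup.
# The reference and reads are hypothetical.
from collections import defaultdict

def build_index(reference, k=4):
    """Map every k-mer in the reference to its start positions."""
    index = defaultdict(list)
    for i in range(len(reference) - k + 1):
        index[reference[i:i + k]].append(i)
    return index

def map_read(read, reference, index, k=4):
    """Return reference positions where the read matches exactly."""
    hits = []
    for pos in index.get(read[:k], []):  # seed with the read's first k-mer
        if reference[pos:pos + len(read)] == read:
            hits.append(pos)
    return hits

reference = "ATGGCGTGCAATGGCGT"
index = build_index(reference)
print(map_read("GCGTGC", reference, index))  # -> [3]
print(map_read("ATGGCG", reference, index))  # -> [0, 10]

Note that the second read maps to two positions because it falls in a repeated stretch of the reference, a reminder that repeats complicate re-sequencing just as they do de novo assembly.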

Marco Marra