John J. Hopfield

American physicist John J. Hopfield was awarded the 2024 Nobel Prize in Physics for his work on neural networks.

John J. Hopfield (born July 15, 1933, Chicago, Illinois, U.S.) is an American physicist who was awarded the 2024 Nobel Prize in Physics for his work with neural networks. He shared the prize with British-Canadian cognitive psychologist Geoffrey Hinton. At age 91, he became the third oldest person to receive a Nobel Prize, after John B. Goodenough (in chemistry in 2019, at age 97) and Arthur Ashkin (in physics in 2018, at age 96).

Hopfield, born to parents who were both physicists, took an interest in the subject from an early age; he later described physics not merely as a subject matter but as a way of viewing the physical world. His parents encouraged his inquisitive nature, and he was allowed to take apart various items in his home, which sometimes required his father’s help to piece them back together.

“The central idea was that the world is understandable, that you should be able to take anything apart, understand the relationships between its constituents, do experiments, and on that basis be able to develop a quantitative understanding of its behavior.” —Hopfield describing how he viewed physics

In 1954 Hopfield received a bachelor’s degree in physics from Swarthmore College. Four years later he received a Ph.D. in physics from Cornell University. Hopfield then joined AT&T Bell Laboratories, where he worked on solid-state physics research. Starting in 1964, he was a professor of physics at Princeton University. By the end of his time there, Hopfield had moved away from physics to problems in chemistry and biology, and in 1980 he became a professor of those subjects at the California Institute of Technology (Caltech).

Hopfield became interested in how the brain works and, specifically, in how neurons work together. In 1982 he proposed a simple network to explain how memories are stored in the brain. Each neuron could be in one of two states: 0, “not firing,” or 1, “firing at maximum rate.” Each connection between two neurons had a certain strength. By analogy with the mathematics that describes magnetic systems, Hopfield defined an “energy” for the system: the negative of a sum over all pairs of neurons, each term being the strength of the connection between the two neurons multiplied by the states of both. The connection strengths were chosen so that the stored memories sat at the lowest-energy states of the system.
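In modern notation (the symbols here are conventional labels rather than Hopfield’s own), if s_i is the state of neuron i, equal to 0 or 1, and w_{ij} is the strength of the connection between neurons i and j, that energy can be written as

E = -\sum_{i<j} w_{ij}\, s_i s_j .

Each time a single neuron updates its state, this energy can only stay the same or decrease, so the network settles into a nearby energy minimum, ideally one of the stored memories.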

Hopfield extended his model (which came to be called the Hopfield network) to have more complex features. For example, the neurons could take on a continuous range of values rather than only 0 and 1, so the model could store more complex information. Hopfield and American neuroscientist David Tank used such a network to find good approximate solutions to the traveling salesman problem.
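What follows is a minimal Python sketch of the original discrete model described above, not Hopfield’s own code; the network size, the Hebbian-style rule used to set the connection strengths, and the random test patterns are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a discrete Hopfield network with 0/1 neuron states.
# The network size, the Hebbian-style storage rule, and the random test
# patterns are illustrative choices, not details from Hopfield's papers.

rng = np.random.default_rng(0)
n = 100                                     # number of neurons
patterns = rng.integers(0, 2, size=(3, n))  # three 0/1 "memories" to store

# Storage: connections are strengthened between neurons whose states agree
# across the stored patterns (no self-connections).
bipolar = 2 * patterns - 1                  # map {0, 1} -> {-1, +1}
w = bipolar.T @ bipolar                     # w[i, j] summed over patterns
np.fill_diagonal(w, 0)

def energy(state):
    # E = -sum over pairs of w[i, j] * s_i * s_j (each pair counted once).
    return -0.5 * state @ w @ state

def recall(state, sweeps=10):
    # Asynchronous updates: each single-neuron update can only keep the
    # energy the same or lower it, so the state settles into a minimum.
    state = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(n):
            state[i] = 1 if w[i] @ state > 0 else 0
    return state

# Corrupt one stored memory, then let the network restore it.
noisy = patterns[0].copy()
flipped = rng.choice(n, size=10, replace=False)
noisy[flipped] = 1 - noisy[flipped]
restored = recall(noisy)
print("bits matching the stored memory:", int(np.sum(restored == patterns[0])), "of", n)
print("energy:", int(energy(noisy)), "->", int(energy(restored)))
```

Because each asynchronous update can only keep the energy the same or lower it, a corrupted pattern slides downhill to a nearby stored memory, which is what lets the network act as an associative, content-addressable memory.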

At Caltech, Hopfield and his colleague Carver Andress Mead created a new interdisciplinary program in computation and neural systems in 1986. Hopfield returned to Princeton, as a professor of molecular biology, in 1997 and there helped create the Princeton Neuroscience Institute. He retired and became a professor emeritus in 2008.

Hopfield has won multiple awards and held many prestigious positions during his career. He won the Buckley Prize (with D.G. Thomas), awarded by the American Physical Society, in 1969 for his work on light-emitting diodes (LEDs). He won a MacArthur grant in 1983. He was named California Scientist of the Year in 1991 and won the Albert Einstein Award of Science in 2005. Hopfield also was awarded a Guggenheim fellowship (just as his father had been, 40 years earlier) in 1968.

Erik Gregersen
Tara Ramanathan