Caltech Professor Emeritus John Hopfield Wins Nobel Prize in Physics
Caltech professor emeritus John Hopfield has been awarded the Nobel Prize in Physics along with Geoffrey Hinton of the University of Toronto "for foundational discoveries and inventions that enable machine learning with artificial neural networks," according to the award citation. Hopfield, now a professor of molecular biology at Princeton University, served as a professor of chemistry and biology at Caltech from 1980 to 1996 and holds the title of Roscoe G. Dickinson Professor of Chemistry and Biology, Emeritus, at Caltech. He co-founded Caltech's Department of Computation and Neural Systems in 1986.
"John moved from Bell Labs and Princeton, where he studied physics, to Caltech's chemistry and chemical engineering and biology departments for a joint appointment in 1980," says Peter Dervan, Bren Professor of Chemistry, Emeritus, at Caltech. "He was interested in building a model of how the brain works and became renowned for his model of neural networks. He was our beloved colleague for 17 years until 1996 when he returned to Princeton."
Today's machine learning, or artificial intelligence (AI), tools—such as ChatGPT and other programs that can assemble information and seemingly converse with humans—have roots in Hopfield's pioneering work on artificial neural networks. In the early 1980s, Hopfield merged his background in physics with neurobiology to create a simple computer model that behaved less like the computers of that time and more like the human brain.
According to the Nobel Prize committee's popular summary of the prize work, Hopfield's research interests "had taken him outside the areas in which his colleagues in physics worked," leading him to move across the continent to Caltech. "There, he had access to computer resources that he could use for free experimentation and to develop his ideas about neural networks," the committee wrote.
Referred to as the Hopfield network, his computer model mimics the architecture of the human brain to store information. The Hopfield network consists of nodes that are connected to each other like neurons in the human brain; the connections between nodes can be made stronger or weaker, such that the strong connections form memories.
"Hopfield's network is a generalization of the spin glass model in physics," explains Erik Winfree (PhD '98), professor of computer science, computation and neural systems, and bioengineering at Caltech, and a former graduate student of Hopfield's. "A spin glass will have two magnetized states, which you could call its memories. By enhancing the model to allow distinct connection strengths between each pair of units, Hopfield showed that multiple complex patterns could be stored in the network using a simple learning rule. It was a new way to conceptualize brain-like memories. Hopfield articulated this model in a way that opened people's eyes to the connections between artificial neural networks and physics."
Hopfield's insights, says Winfree, came from making connections between fields. "He was a chemist, he was a biologist, he was a physicist, he was a computer scientist."
Hopfield echoed this sentiment himself at a Princeton news conference today, October 8: "In the long run, new fields of science grow up at the intersection of big chunks of science, which are the pieces of knowledge now. And you have to be willing to work in those interstices in order to find out what are the limitations of knowledge that you have and what might you do to make the subjects richer, deeper, better understood."
Shortly after joining the Caltech faculty, Hopfield teamed up with the late physicist Richard Feynman and with Carver Mead (BS '56, PhD '60), the Gordon and Betty Moore Professor of Engineering and Applied Science, Emeritus, to co-teach a yearlong course called The Physics of Computation. The course was intended to unite their respective fields to explore the relationship between nanoscale physics, computation, and brain function. Three years later, the class evolved into a new interdivisional program at Caltech called Computation and Neural Systems (CNS), a vibrant community of scholars that includes dozens of faculty and has produced more than 100 PhDs.
"We never even thought about it being a department," says Mead. "We had lunch at the Athenaeum together. I was in engineering, John was in chemistry, and Dick [Feynman] was in physics, and we had such good arguments that we thought we should have this discussion with the students. It was the best of all worlds, and that happens at Caltech."
Mead says he was part of the recruiting effort to bring Hopfield to Caltech. "He was working on this great science, and it made sense to have him here. We had a lot of good times together. We thought so differently, and that was great because it pushed each of us to think the other way about the thing we were looking at," he says.
As an educator and a researcher, Hopfield exemplified Caltech's commitment to curiosity-driven science and to encouraging and supporting faculty to pursue areas of interest at the intersection of fields, recalls Harry Gray, the Arnold O. Beckman Professor of Chemistry and founding director of the Beckman Institute. "We were successful in recruiting him to join forces with Caltech in chemistry and biology, and his work here led to Nobel recognition."
Though Hopfield was a theorist, says Rudy Marcus, Caltech's John G. Kirkwood and Arthur A. Noyes Professor of Chemistry and himself a Nobel laureate, there were times when he acted like an experimentalist. "As one example of his ingenuity, I recall that he once used his physics background to locate an underground leak in a water pipe at his house and repair it," says Marcus. "It impressed me at the time, since this was far beyond my skills."
In a similar vein, Paul Sternberg, Bren Professor of Biology and chair of Caltech's Division of Biology and Biological Engineering, recalls that Hopfield told him that "he was attracted to problems for which people could not imagine how something could possibly work."
Winfree says that Hopfield's physics-oriented mind helped him with his own research into self-assembling molecules back when he was a graduate student and working in Hopfield's group. "I told him about the model of molecular self-assembly that I was developing, and he asked me, 'Do you really need all 17 parameters, or can you reduce it to two?' When I threw out the inessential details, suddenly I could see the fundamental issues."
Many Caltech faculty also commented on Hopfield's profound and enduring impact in inspiring and training future scientists, engineers, and researchers in applied and theoretical science.
"His students at Caltech went on to illustrious careers of their own," says William Goddard (PhD '65), the Charles and Mary Ferkel Professor of Chemistry, Materials Science, and Applied Physics. "Thus, besides his fundamental research, he was an inspirational teacher and a great mentor."
Hopfield was born in 1933 in Chicago. He earned his bachelor's degree from Swarthmore College in 1954 and his PhD from Cornell University in 1958. He has received numerous awards, including the Boltzmann Medal (2022), the Benjamin Franklin Medal in Physics (2019), the Albert Einstein Award (2005), the Dirac Medal of the International Centre for Theoretical Physics (2001), and the MacArthur Fellowship (1983–1988).
In Princeton's news release about the Nobel Prize, Hopfield discussed what drives him. "The science which advances technology is the science that gets done for curiosity's sake much earlier," he said.
As for the AI technology his work did spawn, he said in the Princeton news conference, "I worry about anything that says I'm big, I'm fast, I'm bigger than you are, I'm faster than you, and I can also outrun you. Now, can you peacefully inhabit with me? I don't know."
Hopfield's Nobel Prize is the 48th awarded to Caltech faculty, alumni, and postdoctoral scholars.