Rocky Mountain Regional VA Medical Center registered nurse Patricia Stamper looks at a dose of the Pfizer-BioNTech COVID-19 vaccine before administering it to a health care worker at the hospital on December 16, 2020 in Aurora, Colorado.
Michael Ciaglo | Getty Images
The rapid development of vaccines for Covid has led to some debate over who deserves the most credit: the government with its Operation Warp Speed, drug companies, or university researchers who pioneered discoveries about messenger RNA.
The best answer, I think, is that development of the vaccines, like most other great American innovations over the past 75 years, has largely been due to a singular decision made after World War II to tightly intertwine the roles played by government, private industry and academia.
This triple helix was designed by the influential science administrator Vannevar Bush, who had a foot in all three camps. He was dean of engineering at MIT, a founder of Raytheon, and then the chief government science administrator during World War II overseeing, among other projects, the building of the atomic bomb.
In a 1945 report to President Truman with the quintessentially American title, “Science, The Endless Frontier,” Bush recommended that government should not build big research labs of its own, as it had done for the atomic bomb project, but instead should fund research at universities and corporate labs.
“No American has had greater influence in the growth of science and technology than Vannevar Bush,” MIT President Jerome Wiesner later proclaimed, adding that his “most significant innovation was the plan by which, instead of building large government laboratories, contracts were made with universities and industrial laboratories.”
Much of the government’s postwar science funding went to basic, curiosity-driven research that did not yet have known practical applications, such as how quantum mechanics might explain what happens on the surface of semiconducting materials or how snippets of RNA act as messengers to build proteins. Bush knew that discoveries in basic science would be the seed corn that would eventually grow into unforeseen inventions, such as transistors or mRNA vaccines.
This government-academic-corporate partnership produced the great innovations that propelled the U.S. economy in the postwar period, including microchips, computers, graphical user interfaces, GPS, lasers, the internet and search engines. Google, for example, was begun by Larry Page and Sergey Brin as an academic project at Stanford partly funded by the National Science Foundation.
Over the years, an imperfect but productive system was patched together for divvying up the proceeds and intellectual property. In 1980, for example, Congress passed the Bayh-Dole Act, which made it easier for universities to benefit from patents, even if the research was funded by the government.
One of the most important innovations of our era is the gene-editing technology known as CRISPR. One of its inventors is Berkeley professor Jennifer Doudna, who shared this year’s Nobel Prize in Chemistry and is locked in a protracted patent battle with Feng Zhang of the Broad Institute of MIT and Harvard.
They and their institutions are good examples of the government-academic-corporate interrelationship. Their academic research was funded partly by grants from the National Institutes of Health and the Defense Advanced Research Projects Agency, and they both started private companies to commercialize their CRISPR discoveries for gene editing, disease diagnosis, and now coronavirus detection.
This process also led to the Covid vaccines. Over the years, the NIH and DARPA have funded university research into how DNA and RNA work. For example, in 2005, a pair of researchers at the University of Pennsylvania, Katalin Kariko and Drew Weissman, showed how to tweak a molecule of messenger RNA so that it could get into human cells without being attacked by the body’s immune system.
Shortly after that, two entrepreneurial start-ups were founded to commercialize medical uses for this mRNA: BioNTech in Germany and Moderna in Cambridge, Massachusetts. When the Covid pandemic struck, they devised ways to use mRNA to instruct human cells to make parts of a spike protein that would stimulate immunity to the coronavirus. They were aided by guaranteed purchase agreements and logistical support from the government’s Operation Warp Speed.
The government-academic-corporate helix that Bush envisioned has given rise to cauldrons of innovation around great research universities. Silicon Valley began growing around Stanford in the 1950s when its provost, Frederick Terman, began encouraging professors and graduate students to commercialize their discoveries, which led to the birth of such companies as Hewlett-Packard, Cisco, Sun and Google.
Kendall Square in Cambridge is the new Silicon Valley. Located next to MIT and near Harvard, it houses the research centers of more than 120 biotech companies within a mile of one another, including Moderna, Pfizer, Merck, Novartis and Sanofi.
And increasingly, this model of great universities encouraging commercialization of their government-backed research is leading to other thriving hubs of innovation around the country, from Austin and Houston, to Raleigh-Durham and Seattle, to Nashville and New Orleans.
Walter Isaacson is the author of “The Code Breaker: Jennifer Doudna, Gene Editing, and the Future of the Human Race,” to be published by Simon and Schuster on March 9.