The Five Laws began with a drink. Late one night in 2005, I sat down for a whiskey with Sir James Black, who won the 1988 Nobel Prize in medicine for developing beta-blockers for hypertension and histamine antagonists for ulcers. He had just spent a marathon all-day session advising me and a small team of scientists on our cancer research. I was close to collapsing from exhaustion, wondering how a man in his 80s could outlast me, and I mumbled something about how hopeless one project seemed after a couple of failures in the lab.
Sir James leaned over, patted my knee, and said, “Ah, my boy — it’s not a good drug unless it’s been killed at least three times.”
I’ve come to think of that lesson as the First Law: Expect the Three Deaths.
The big ideas, the ones that change the course of science or business or history, rarely arrive with blaring trumpets dazzling everyone with their brilliance. They pass through long dark tunnels of skepticism and uncertainty, crushed or neglected, their champions written off as crazy. There wasn’t a good word for that in the English language, so I made one up: loonshots.
For example, when President John F. Kennedy announced to Congress in 1961 his goal of putting a man on the moon, he was widely applauded. Four decades earlier, though, when Robert Goddard described the principles that might get us to the moon — jet propulsion and rocket flight — he’d been widely ridiculed. The New York Times wrote that Goddard “seems to lack the knowledge ladled out daily in high schools,” namely that of Newton’s law on action and reaction, which made rocket flight in space impossible. (The day after the successful launch of the Apollo 11 rocket to the moon, 24 years after Goddard’s death, the paper announced that rockets did not in fact violate the laws of physics and that “the Times regrets the error.”)
Kennedy’s speech marked the original moonshot. Goddard’s idea was a classic loonshot. A moonshot is a destination. Nurturing loonshots is how we get there.
Failing to understand the surprising fragility of the loonshot — assuming that the best ideas will blast through barriers, fueled by the power of their brilliance — can be a very dangerous mistake, as the US learned in the Second World War: The US military ignored Goddard. German scientists did not. They built on his principles to develop the first long-range missiles, which rained terror on London, and the first jet aircraft, which flew over 100 miles per hour faster than any Allied plane. Fortunately for the Allies, those breakthroughs came too late in the war to make a difference.
In business, failing to understand that surprising fragility can be a very expensive mistake. In the case of Akira Endo’s discovery, the price tag was roughly $300 billion.
Endo is the scientist behind arguably the greatest drug discovery of the 1970s: the statins. These cholesterol-lowering drugs—Lipitor, Crestor, Zocor, and others—have helped prevent millions of heart attacks, contributing to a three-decade decline in deaths from heart disease.
Not long after identifying the first statin, Endo began the standard small-animal studies used to evaluate a new drug’s safety and efficacy. The honor of being first is generally awarded to rodents. With great excitement, his team gave the drug to rats and saw … nothing. In drug discovery, a failure of that kind nearly always kills a project.
Which leads us to the Second Law: Mind the False Fail.
At a bar near his lab, Endo ran into a colleague who used chickens in his research. After a few drinks, the colleague confided that his chickens would make a nice dish of yakitori when his project ended. It occurred to Endo that hens might have high blood cholesterol, since their eggs have so much of it, which could make the effects of his drug easier to detect. So Endo convinced his friend to curb his appetite, temporarily, and test the new drug on some spare hens.
The results were spectacular. Endo’s drug decreased cholesterol by nearly half, triglycerides by even more.
Years later, scientists would learn that rats have mostly HDL (“good cholesterol”) circulating in the blood, and very little LDL, the “bad cholesterol” that contributes to heart disease. Which means that rats are a poor choice for evaluating statins, which lower just the LDL. Chickens have both types, like humans.
The failure of statins in rodent models was a False Fail: a negative result caused by a flaw in the test, mistakenly attributed to a flaw in the loonshot itself. The same failure permanently killed a similar program at another company, Beecham Pharmaceuticals. Had Beecham persisted, they might have shared in the cumulative $300 billion of revenues from statins.
The False Fail routinely masks promising loonshots, both in science and business. In 2004, for example, when Facebook launched, many social networks had tried and failed to win the loyalty of users who hopped from one network to the next: Sixdegrees, StumbleUpon, Delicious, Tribe, and so on. As Mark Zuckerberg met with investors to raise funds for his new startup, users were just beginning to abandon the most recent social network success story, Friendster, for MySpace. Most investors concluded the websites were like clothing fads. Users switched networks like they switched jeans.
Peter Thiel, however, reached out to friends behind the scenes at Friendster. Like other users, he knew that the site crashed often. He also knew that Friendster had received, and ignored, crucial advice on how to scale — how to transform a site built for a few thousand users into one that could support millions. He asked for the user retention data. He was stunned by how long users stayed despite the irritating crashes.
He concluded that users weren’t leaving because social networks were weak business models. They were leaving because of a software glitch. It was a False Fail.
Thiel wrote Zuckerberg a check for $500,000. Eight years later, he sold most of his stake in Facebook for roughly a billion dollars.
Which leads us to the Third Law: Listen to the Suck with Curiosity (LSC). When you’ve poured your soul into a project, the temptation to dismiss or reject bad outcomes is high. You attack your challengers and turn for reassurance to your friends, mentors, mother.
It’s hard to hear that no one likes your baby. It’s even harder to keep asking why.
Well before Endo persuaded his friend to test the near-dead statin on his pre-yakitori chickens, he had been investigating why his experiments had failed. He already suspected the drug might behave very differently across animal species, so he acted quickly when the opportunity appeared.
Where others assumed the decline of Friendster was the end of a fad, Thiel investigated why users were leaving and found a contrarian answer, in which he had confidence. Contrarian answers, with confidence, create very attractive investments.
All of which brings us to the billion-dollar question. The massive weight of teams and committees, where any failure is an excuse to terminate, will crush fragile loonshots. So how can large organizations possibly succeed?
The greatest system ever invented to solve this puzzle not only helped turn the course of a war but has helped the US lead the world in science and technology ever since.
Had there been prediction markets in early 1939, on the brink of world war, the odds would have favored Nazi Germany. Germany’s new submarines, called U-boats, threatened to dominate the Atlantic and strangle supply lines to Europe. The planes of the Luftwaffe, ready to bomb Europe into submission, outclassed those of any other air force. The discovery of nuclear fission, by two German scientists, put Hitler within reach of a weapon with almost unfathomable power.
Aware of the growing threat, Vannevar Bush, Dean of Engineering at MIT, quit his job and talked his way into a meeting with President Roosevelt.
The US military, Bush told FDR, trailed far behind Germany in the technologies that would be critical to winning the coming war, and was incapable of catching up in time. He handed FDR a single sheet of paper with a proposal: FDR should authorize a new science and technology group within the government, to be led by Bush, reporting only to the president. FDR listened, read the proposal, and signed it “OK — FDR.” The meeting had lasted all of ten minutes.
Bush understood that military culture would quash fragile loonshots and that he would never succeed in changing that culture. He also understood that he shouldn’t try: the tight discipline of a military organization would be essential to building munitions at an unprecedented rate, distributing troops and supplies across four continents, and directing millions of soldiers in battle.
It’s the Fourth Law: Forget culture; create an innovative structure.
The structure Bush created — a quarantined group of scientists and a system for dynamic and rapid exchange of ideas and projects between the scientists in the lab and the soldiers in the field — helped develop the technologies that created decisive advantages for the Allies. Shortly after the end of the war, in October 1945, Congress declared that without Bush’s system, “it is safe to say that victory still would await achievement.”
Although Bush was a brilliant inventor and engineer, he pointedly stayed out of the details of any one loonshot. The genius-entrepreneur who builds a long-lasting empire on the back of his ideas and inventions is a widespread myth.
The ones who truly succeeded, like Bush, have played a much humbler role. Rather than a holy leader standing high atop a mountain, raising his staff, anointing the chosen loonshot, Bush was a careful gardener. He created a nursery to shelter and grow fragile loonshots. He recognized that the weak link in the chain of innovation is not the supply of new ideas, but the transfer of those ideas to the field. So he managed the transfer rather than the technology, ensuring that loonshots were brought into the field neither too early nor too late. He intervened only when that transfer broke down.
Transfer in the opposite direction is equally important. Early aircraft radar, for example, nearly failed. Bush nudged scientists into cockpits to see why pilots weren’t using it. In the heat of battle, the scientists discovered, pilots had no time for the early boxes’ complicated switches. The technology was fine; the user interface was lousy. Scientists quickly created a custom display — the sweeping line and moving dots now called PPI. Within four weeks, Allied planes sank one third of the German U-boat fleet. Six weeks later, the head of the German Navy declared defeat in the Battle of the Atlantic. The lanes were cleared for an Allied invasion of Europe.
Bush personified the Fifth and final Law: Be a gardener, not a Moses.
Toward the end of the war, FDR asked Bush to summarize how the US could use his system to identify new treatments for diseases, improve national well-being, and grow the economy.
Bush’s report, called Science: The Endless Frontier, released in July 1945, caused a sensation. (It was hailed by media outlets, except for the New York Times, which questioned the authors’ understanding of science and concluded by suggesting a better model: “Soviet Russia has approached this task more realistically.”)
Over the next few years, spurred by the success of Bush’s system and the ideas in his report, Congress created the National Science Foundation, the National Institutes of Health, and many other research agencies. Since the end of WWII, that national research system has spawned GPS, personal computers, the biotechnology industry, the internet, pacemakers, artificial hearts, magnetic resonance imaging, even the chemotherapy cure for childhood leukemia. Many other industry-creating innovations — the transistor, for example, which launched the electronics age — were the joint offspring of public and private research.
The teams and companies who wish to lead their industries as the US has led the world in science and technology would do well to set aside the Fail Fast and Pivot mantra. Instead, they should reflect upon the words of Sir James Black and apply the Five Laws of Loonshots.
How to Nurture the Crazy Ideas That Win Wars, Cure Diseases, and Transform Industries
Loonshots, an instant WSJ bestseller, has been translated into 18 languages and selected as an Amazon, Bloomberg, Financial Times, Forbes, Inc., Medium, Newsweek, Strategy + Business, Tech Crunch, and Washington Post Best Business Book of the year. It was the #1 Most Recommended Book of the Year in Bloomberg’s annual survey of CEOs and entrepreneurs.
Loonshots reveals a surprising new way of thinking about the mysteries of group behavior that challenges everything we thought we knew about nurturing radical breakthroughs.
Using examples that range from the spread of fires in forests to the hunt for terrorists online, and stories of thieves and geniuses and kings, Bahcall shows how a new kind of science can help us become the initiators, rather than the victims, of innovative surprise.
Over the past decade, researchers have been applying the tools and techniques of this new science—the science of phase transitions—to understand how birds flock, fish swim, brains work, people vote, diseases erupt, and ecosystems collapse. Loonshots is the first to apply these tools to radical breakthroughs and distill the insights into practical lessons creatives, entrepreneurs, and visionaries can use to change our world.
Along the way, readers will learn what James Bond and Lipitor have in common, what the movie The Imitation Game got wrong about WWII, and what really killed Pan Am, Polaroid, and the Qing Dynasty.