Want to avoid the next pandemic? Hire a devil's advocate
Forcing governments and businesses to institutionalize doubt—by putting hackers and red teams on the payroll—would stop groupthink and could prevent catastrophes

What should the Catholic Church do if the pope decides to declare someone a saint but that person has flaws the pope may have overlooked? For centuries, the Vatican had a solution: an official devil's advocate, who combed through every potential canonization for weaknesses. From 1587 until Pope John Paul II effectively abolished the office in 1983, the devil's advocate (advocatus diaboli) was the probing official whose task was to argue against the pro-canonization side, represented by God's advocate (advocatus dei), in the case of every would-be saint.
That seemingly arcane practice is highly relevant today. Governments and companies knew about the risk of a pandemic, but they did too little. To prevent the next one, they need designated devil's advocates on the payroll. And to better anticipate devious attacks in cyberspace, they should hire another group of contrarians, too: hackers.
The United States' botched pandemic preparation and response offer a case in point. "For five days, the president along with some of his closest senior officials disseminated an egregiously false message to Americans," Ryan Goodman and Danielle Schulkin wrote in a New York Times op-ed on April 28, referring to the last days of February. "We have contained this. I won't say airtight but pretty close to airtight," Larry Kudlow, the director of the National Economic Council, announced on CNBC on Feb. 25.
Even Anthony Fauci, the much-admired director of the National Institute of Allergy and Infectious Diseases and the government's top doctor in the coronavirus battle, at the time stuck to President Donald Trump's line that the coronavirus had been contained. "No. Right now, at this moment, there's no need to change anything that you're doing on a day-by-day basis," Fauci responded when asked whether Americans should change their habits. Trump had, of course, consistently claimed the virus was under control and that any suggestions to the contrary amounted to fake news.
But what if Fauci had been explicitly tasked with second-guessing the White House and had corrected the president's message before it reached the public? Knowing Trump's history, he likely would have been fired, but in a less dysfunctional government, such a role could have saved thousands of lives and billions of dollars in economic losses.
Although the White House was particularly slow in treating the coronavirus as a full-blown crisis, most other organizations got the virus threat wrong, too. The danger of a pandemic has long featured on national risk registers, the reports of civil emergency risks that some governments, including Britain's, publish to inform and prepare the public.
But most governments and businesses, never mind the public, did little to prepare for this seemingly unlikely scenario. "Risk registers are an essential tool for risk management, but they can backfire with their overwhelming amount of detail," said Hélène Galy, who directs the Willis Research Network, the research arm of the global insurance broker Willis Towers Watson. "A risk register can also give the illusion that a comprehensive overview of risks, and an assessment of their impact and likelihood, means that these risks have been dealt with." In reality, of course, a risk register is simply words on a page.
One insurance broker even offered stand-alone pandemic insurance two years ago—and had no takers. Instead, businesses and households are trying to make coronavirus claims on other types of insurance. So far, U.K. insurers have paid out 1.2 billion pounds ($1.5 billion) on coronavirus-related claims against business interruption insurance, wedding insurance, travel insurance, and the like. But claiming for the coronavirus on your wedding insurance policy is not going to fly in most cases.
If anything, the coronavirus crisis has brutally demonstrated that it's exceedingly risky to ignore unlikely calamities. The problem is: Who's going to point out that the seemingly sensible strategy won't work? In most organizations, that person is a self-appointed devil's advocate, whom everybody dislikes because he or she sees the gaps in everyone else's ideas or decisions.
That's where modern governments and businesses need to learn from the Catholic Church. The point of the devil's advocate was, of course, to save the Vatican from embarrassing blunders by identifying less-than-saintly traits among those being canonized before the public noticed.
The devil's advocate even spotted faults in the life of the Maid of Orléans before she was eventually declared a saint. (Though the office no longer exists, the adversarial probing of saints does; the due diligence is now carried out by a range of officials with less intimidating titles.)
If governments and businesses had official devil's advocates, they too could avoid the extremely costly risk of intellectual complacency.
"It's important to hard-wire challenge into the policymaking process, and that requires opening up both procedural and psychological safe spaces to do so," said Tim Dowse, a former director of intelligence and national security in the British Foreign Office. "It was the failure to sufficiently question accepted wisdom that lay behind a lot of the U.K. intelligence community's mistakes in the Iraq WMD [weapons of mass destruction] case."
Some governments have tried devil's advocate-like schemes. In 1976, U.S. President Gerald Ford's advisors, unconvinced by the CIA's assessment that the Soviet Union was content to be at military parity with the United States, persuaded the president to appoint a second assessment team. The so-called Team B concluded that the Soviets were, in fact, intent on achieving superiority—and its strong reasoning caused the CIA to change its assessment.
After 9/11, the CIA and the Department of Homeland Security (DHS) launched Red Cells, a scheme already used by other parts of the U.S. government. But in the case of DHS, the participants—"people with offbeat specialties and life experiences," the Washington Post reported in 2004—provided rather ad hoc input.
"Typically the Red Cell team assembles 20 or so participants for a day-long session at leased offices in the Washington area. Each session divides into smaller groups and takes up a different question, such as: If you were a terrorist, how would you target the G-8 economic summit," the Post reported. "Another recent topic was: Why haven't terrorists hit the United States since Sept. 11, 2001?" As for the CIA's Red Cell, "it always felt to me more like an interesting academic exercise than something with much effect on policy, though no doubt in London we didn't see all its products," Dowse said.
Dowse still sees merit in the concept but said a team "deliberately employed to be difficult is always going to have a problem getting impact. There will be a risk of driving those under scrutiny into a defensive huddle. Of course, a lot depends on the authority and credibility that the individuals themselves carry." What's more, the devil's advocate slows decision-making down.
Those drawbacks are all the more reason to innovate. Why not appoint outsiders as official devil's advocates? "It would be helpful to designate someone independent," Galy said. "It couldn't be a permanent appointment either, as over time the devil's advocate's critical thinking could mellow."
Outside reality-checkers parachuted in for a limited period are needed in another capacity, too: to help governments and businesses anticipate so-called grayzone attacks, aggression that falls below the threshold of military conflict (think cyberattacks, disinformation campaigns, disruption of supply chains). In military operations, anticipating your adversary's moves is relatively easy: Your intelligence services collect information about troop and equipment movements, which gives you a good idea of what's afoot. Indeed, during World War II, the staff drafting Britain's Joint Intelligence Committee assessments were tasked with seeing the world through the eyes of Nazi Germany.
But it's not as straightforward to anticipate when or how hackers might probe a country's voting system. "My job was to worry about every parade of horribles. So I cannot tell you that that did not cross my mind. … We were worried about the supply chain for the voting machines. Who were the makers?" Lisa Monaco, former U.S. President Barack Obama's homeland security advisor, told the Senate Intelligence Committee's inquiry into Russian election interference in 2017.
Monaco was a capable government official, but she was just that: a government official, supported by other government officials. To have a better chance of anticipating grayzone aggression by Russia, China, Iran, North Korea, and their proxies, governments of liberal democracies—and businesses operating there—need people who think more like the aggressors.
Many government agencies and large businesses already have red teams that play the opponent in tabletop exercises simulating attacks—but those red teams usually comprise members of the organization or a related outfit. The DHS Red Cell comprised people from outside the government, not would-be terrorists. Neither is wired to think like the enemy.
A more promising solution is emerging: involve almost-adversaries. At a recent event I hosted at the Royal United Services Institute, a senior BT Security executive explained how his company works with police forces to identify hackers who have nearly crossed over to the dark side and reeducate them as "ethical hackers." More companies could do the same. So—after security clearance—could governments.
With hackers recruited from the almost-dark side playing the red team, governments and businesses would have a much better chance of anticipating their adversaries' next moves. And aided by the groupthink-busting services of devil's advocates likewise recruited from the outside, governments and businesses alike would slash their risk of follies such as underestimating a pandemic or overestimating WMD intelligence.
In the U.K., a group of eminent scientists have just set themselves up as volunteer devil's advocates to provide independent—and perhaps contrarian—coronavirus advice to the government. Will the government listen to them? It may not. But at least the scientists have a good chance of spotting holes in government policy before the public does. The same is true for ethical hackers: Though companies are not obliged to fix vulnerabilities detected by these almost-devils, it's clearly in their interest to do so. Considering today's inevitable probing by both adversaries and self-appointed devil's advocates among the wider public, it's vastly preferable to have a proper one on one's own team.
International security is changing. It's time to learn from both the Vatican and teenagers in basements.
Elisabeth Braw directs the Modern Deterrence project at the Royal United Services Institute. Twitter: @elisabethbraw
Disclaimer: This article first appeared in Foreign Policy and is published by special syndication arrangement.