Niall Ferguson joins Brian Anderson to discuss the false dichotomy of natural and man-made disasters, the true culprits in our problematic Covid-19 response, and the lessons from the pandemic for the next calamity. His new book is Doom: The Politics of Catastrophe.
Audio Transcript
Brian Anderson: Welcome back to the 10 Blocks Podcast. This is Brian Anderson, the editor of City Journal, and joining me today to discuss his latest book is indeed Niall Ferguson. He's the Milbank Family Senior Fellow at Stanford University's Hoover Institution and an acclaimed historian who's written books on everything from finance and social networks to Henry Kissinger and the British empire.
Last year was an unprecedented time, or so it seemed. As COVID-19 spread across the world, public officials cited the unique threat of the virus to justify extreme interventions in daily life, and then civil unrest and violence exploded in US cities in a kind of political or social contagion that accompanied the public health emergency. It was certainly a troubling year.
As Niall's book shows, however, disasters and crises are never entirely unprecedented. Political and natural catastrophes are often entwined, and we should try to understand the causes and characteristics of past calamities to help us grasp today's and perhaps better prepare for future disasters.
In his new book, Doom, he investigates the common features of geological and atmospheric, political and geopolitical, biological and technological disasters with that goal in mind.
Throughout our conversation, please feel free to submit your questions on whatever platform you're watching this on, and we'll do our best to get to as many as we can.
So Niall, thanks very much for joining us today.
Niall Ferguson: It's a pleasure to join you, Brian.
Brian Anderson: You analyze in Doom dozens of historical disasters. These range from the Black Death during the 14th century and the Napoleonic wars of the 19th century to the Titanic sinking in 1912 and the Great Famine in Mao's China during the 20th century. These events happened across many different times and places, but what are the common features in your view, the recurring patterns? And where does COVID-19, and the crisis surrounding it, rank among the disasters you discuss?
Niall Ferguson: Yes. It might seem a rather eclectic array of unfortunate events, and you might wonder what business I have bringing wars and pandemics together with earthquakes and wildfires. But there are a couple of things that I think all disasters have in common, certainly the kinds of disasters I'm interested in. The obvious one is excess mortality: the sudden increase in mortality above what might have been expected based on our relatively recent experience, a sudden increase in the probability of premature death. That's the same whether you're confronted by a war or a pandemic, and I make the argument, which is really borrowed from Amartya Sen's argument about famines, that the distinction between natural and man-made catastrophes is a false dichotomy. In many ways, COVID-19 illustrates that really well, even if you don't believe the lab leak hypothesis, though that's looking more and more likely as an explanation of the origin of the pandemic. So that's the first idea: that we really can and should think about pandemics and wars within the same framework.
The second point hit me when I was reading a book about the outbreak of World War I, Louis-Ferdinand Céline's extraordinary account at the beginning of Voyage au bout de la nuit. It's the fact that, to the individual caught up in a disaster, there is a strange sense of unreality. The unreality comes partly from the sense that it can't possibly be happening to you. It might possibly be happening to somebody else, but it can't really kill you. That's a really important and curious human quirk. We struggle a bit to grasp the idea of a suddenly increased probability of mortality that applies to us.
The other thing I think is quite important is the sense of confusion: one is struggling to make sense of this unfolding disaster because it is very unfamiliar. It's a new kind of experience. Not many of us get to experience multiple disasters. Sometimes it happens. I think my grandfather went through a whole succession of disasters beginning with the First World War, but most of us get one big disaster or maybe two. When it happens, you're really thrown, no matter how well educated you think you are.
So those are important common factors, and they explain a lot about our difficulty in dealing with disaster, even when we've attained much higher levels of scientific education than, say, Medieval peasants.
Brian Anderson: You discuss three different types of disasters in the book, black swans, gray rhinos, and dragon kings. What are these, and what distinguishes them? Could you just give a few brief examples of each?
Niall Ferguson: It sounds like a rather strange zoo, doesn't it? The idea here, and these are other people's ideas that I've brought under one zoological roof, is that disasters can appear a little bit like the gray rhino that you see trundling towards you across the Serengeti. You kind of know it's coming for you, and you have some warning because you see it from some distance. That idea characterizes a lot of disasters: we see them coming. It's not as if a pandemic was totally unpredictable. People have been predicting a major pandemic for decades, and in fact, I list all the different TED Talks and op-eds and books that made the prediction that there would be a major pandemic. There are dozens of them. So that's the gray rhino.
The odd thing is that when a gray rhino actually hits you, when the predicted disaster happens, a strange metamorphosis occurs and it's suddenly a black swan. Everybody's calling it unprecedented, a year like no other. We heard that many times at the end of 2020, and we act surprised, as if nobody could possibly have foreseen this. The black swan is an idea Nassim Taleb pioneered in the book of that name some years ago. It's the thing that you really can't foresee because it lies outside your range of experience and also your distribution of probabilities. So that's the oddity: something we had talked about for years, when it actually happened in early 2020, took most people completely by surprise, as if we hadn't had all those gray rhino TED Talks telling us it was coming.
The final idea is the dragon king. Some disasters kill lots of people but don't have very major consequences. A good example of this is the 1957-58 influenza pandemic, which almost nobody remembers, including people who were around at the time. It killed a proportion of the world's population not that different from COVID, but its consequences you'd struggle to find in any history book. It's a non-event.
Other events kill lots of people and have huge consequences, and that's where this notion of a dragon king, which I borrowed from Didier Sornette, comes in. It's the idea of an event so huge that it lies beyond even a power-law distribution. I think the First World War is a good example of this, because the First World War is significant not just because of the 10 million-plus people who died in conventional warfare. It's significant because of all the consequences that followed from it, like the Russian Revolution and the breakup of the three empires of Central and Eastern Europe.
So those are the three creatures that I use to try to organize a typology of disaster, the idea being that excess mortality alone doesn't really determine the historical significance of an event.
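For readers who want the statistical intuition behind "beyond even a power-law distribution," here is a brief sketch in notation of my own choosing, a gloss on the idea rather than Ferguson's or Sornette's exact formalism:

```latex
% A minimal sketch (my own gloss, not Sornette's exact formalism).
% A power-law (fat) tail for disaster size X says
\[
  \Pr(X > x) \;\approx\; C\,x^{-\alpha}, \qquad x \ge x_{\min},
\]
% which, on a log-log plot of event size against exceedance probability, is a straight
% line with slope $-\alpha$: very large disasters are rare, but nowhere near as rare as
% a bell curve would suggest. A "dragon king" is an event that sits well above even that
% line, an outlier relative to the fat tail itself, typically because amplifying
% mechanisms (cascading consequences such as revolutions and imperial collapse) kick in.
```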
Brian Anderson: People naturally attribute disasters, whether human-caused, natural, geological, or whatever, to poor leadership. They blame the leaders in charge. But in one of your chapters, you write quite interestingly that the point of failure during a catastrophe is often not at the top but in the middle, in a combination of errors committed by technical operators or middle managers. What are some historical examples of this, and in your view, was that the case in our response to COVID-19 as well?
Niall Ferguson: Well, this idea came from reading Richard Feynman's account of the space shuttle Challenger disaster in 1986. Feynman was a brilliant Caltech physicist who was brought into the official inquiry and slightly disrupted it with his unorthodox, very non-Washington modes of inquiry. The key point about the space shuttle disaster was that the point of failure was in the middle of the NASA bureaucracy. Now, the press corps, when the disaster happened, did what it always does. It tried to pin responsibility on the president, and so there was a story that briefly did the rounds that the space shuttle launch had been hurried, moved ahead too fast because Reagan wanted to mention it in his State of the Union. This was a total non-story. It fell apart pretty quickly.
But what had actually gone wrong was less obvious. Now, to be clear, this wasn't a huge disaster in terms of loss of life; only seven people died—the crew of the Challenger. But it was a very big disaster in terms of its impact on public consciousness. I don't know, you may remember watching it on television. Many people watched the launch live and were stunned to see the thing blow up seconds after launch.
Anyway, Feynman delved into the innards of NASA, and he found to his surprise that the engineers at NASA had known all along that there was a one in 100 chance the thing would blow up. I mean, this was clear to the engineers. It was clearly only a matter of time until something like this happened, because they were doing regular space shuttle launches at that point. But somewhere in the middle of the NASA bureaucracy, a mysterious figure, Mr. Kingsbury, had decided that it would be better to report that risk as one in 100,000 rather than one in 100. Feynman's argument is that ultimately it was the NASA bureaucracy's refusal to admit that the risk was one in 100 that led to the disaster.
There's a nice bit in Feynman's account where the engineers are complaining they could never get a meeting with Mr. Kingsbury. And for me, Mr. Kingsbury is a sort of symbolic figure, maybe a little bit like Woody Allen's Zelig. He's always there somewhere kind of in the middle of the management structure, just quietly changing the odds of failure in ways that are satisfying bureaucratically, but ultimately disastrous.
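To see why a one-in-100 per-launch risk made failure "only a matter of time," here is a quick back-of-the-envelope sketch; the 25-launch count is chosen purely for illustration and is not a figure from the conversation:

```python
# Illustrative only: how the two reported risk numbers compound over repeated launches.
def prob_at_least_one_failure(p: float, n: int) -> float:
    """Probability of at least one catastrophic failure in n launches at per-launch risk p."""
    return 1 - (1 - p) ** n

print(prob_at_least_one_failure(1 / 100, 25))      # ~0.22: roughly a 1-in-5 chance over 25 flights
print(prob_at_least_one_failure(1 / 100_000, 25))  # ~0.00025: effectively negligible
```

At the engineers' estimate, disaster within a couple of dozen missions is a live possibility; at the reported estimate, it looks like something that will never happen, which is why quietly changing the odds mattered so much.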
I looked at the Titanic in a similar kind of spirit. At the time the ship went down, everybody hated on the chairman of the White Star Line, whose life was more or less destroyed by the disaster. He basically became a recluse on the coast of Ireland and scarcely spoke. But it wasn't his fault that such a large number of passengers drowned. If you want to know whose fault it was, you have to buy the book. I've learned not to give everything away in these calls, but it's not what you think, because the Titanic has had a whole series of legend-like explanations attached to it, not least in the famous movie. The reality is, once again, one of those little middle-management mishaps that proved disastrous.
Brian Anderson: And how do you see this in the context of COVID-19? Do you think that pattern played itself out in the current crisis as well?
Niall Ferguson: It did, except this time the story that it was all the president's fault has really stuck, and you can see why, because Trump made so many errors of judgment and said so many ludicrous things over the course of 2020. But for most liberal journalists, it was just a natural reflex to blame it on him. And you may remember Jim Fallows writing a piece in the Atlantic saying essentially the president was like the pilot of an aircraft. If the aircraft crashed, it was pilot error.
I must admit, I read this piece, and as I was reading it, I was thinking, "No, this is wrong." And the reason it's wrong is that being president of the United States is nothing like being a pilot, not even remotely, because you're sitting atop this enormously complex bureaucracy. And this has been true for decades. When decisions get to the Oval Office, they've already been fought over at multiple levels of the bureaucracy, all the way up to the cabinet level.
So I thought that was a kind of misunderstanding of the nature of presidential power. And then I thought a bit more about it, and I realized that you could apply the Feynman principle. The way you do it is this: Why exactly did the US suffer very high excess mortality? Why have we got maybe 600,000 deaths that happened prematurely because of the pandemic? And the answers to that question go something like this: First, because the CDC utterly failed to ramp up testing. In fact, it made testing harder than it needed to be. So nobody knew right into April or May who had COVID in the United States.
Secondly, there was no attempt to create a contact-tracing app of the sort that they used in places like South Korea and, more recently, Taiwan. That wasn't even seriously attempted by the big tech companies. Thirdly, there was a total failure to protect the vulnerable, particularly in elderly care homes. That happened at the state level; it was really a failure of state governments. And finally, there was no effective enforcement of quarantines at any point, so people who were potentially infected were basically able to do what they liked.
So all the things that really explain the excess mortality don't seem to me to be attributable to presidential decisions. This is not to exonerate Trump or defend him. He made, as I said, numerous errors of judgment. It's just that I don't think his errors of judgment were responsible for a really significant percentage of the death toll. The truth is that what happened in the US last year, and it was true in the UK and in multiple Western countries, including countries without populist leaders, was a terrible failure of the public health bureaucracy, which had a pandemic preparedness plan on paper, numerous plans in the case of the US. It's just that none of those plans worked.
If we tell ourselves that it was all the president's fault and getting a new president has solved the problem, and I've heard this argument made, then the next disaster, whatever form it takes, will probably expose a similar failure in the bureaucracy in a different part of the government.
So this is a really important argument. It's not a popular one, because nobody wants to feel as if they're letting Trump off the hook, but in reality, if we just say to ourselves, "If only Joe Biden had been president a year earlier, it would've been fine," then we really are deluding ourselves.
By the way, this kind of argument was recently made in the UK by Dominic Cummings, the former advisor to Boris Johnson, whose critique, in a long Twitter thread and then in his testimony to a parliamentary committee, was basically this: The entire system had failed, not just the elected politicians, not specifically the Prime Minister; the civil service had failed and the public health experts had failed. I think the same story is in fact true in the United States, and we should realize that.
Brian Anderson: You note in the book that politicians in democratic societies are structurally disincentivized from dealing with tail risks, unlikely ones anyway, and long-term problems. Can you explain a bit why you see that as the case? And if we can't trust political leaders or society's leaders to prepare us adequately for disaster, what's the alternative, if there is any?
Niall Ferguson: Well, I think there are two problems that democracies face. One is what Henry Kissinger called the problem of conjecture: if, as a leader, you are confronted with the possibility of a disaster, and you're told that by taking early but costly action you can preempt and avoid it, or alternatively that you can do nothing and might get away with it because it might not happen, it's not certain to happen, it's very tempting to go for option two and kick the can down the road. Why? Because the costs of option one are not likely to get you rewarded politically. People don't really vote for leaders who've averted disasters. There's no gratitude for a disaster that didn't happen. This is, I think, a fundamental problem of incentives in democracy.
We never really discuss why there wasn't another 9/11. But it's actually a really interesting question: why were there no subsequent large-scale terrorist attacks in the United States? It's been 20 years, and certainly nobody gets any credit for that, even if we know why it happened. So I think that's part of the reason.
The other part of the reason is that in nearly all democracies a large and complex bureaucratic state has evolved, particularly in the last 50 years, much larger than was the case 100 years ago. And these bureaucracies have their own pathologies. They're very good at the CYA approach to disaster preparedness, the "cover-your-ass" approach, where they produce preparedness plans that run for pages and pages, usually with an accompanying PowerPoint deck, and it looks as if the problems have been addressed. I think this is very clear in the case of COVID. There were numerous pandemic preparedness plans across multiple agencies. There was even an assistant secretary for preparedness. There's this great 2019 survey that the Economist Intelligence Unit published in concert with Johns Hopkins saying that the U.S. was the best-prepared country in the world for a pandemic, with the U.K. in second place. Of course, these preparations turned out to be pretty much worthless when an actual pandemic happened. So that's the other thing.
Now what can we do about this? I think the wrong answer to that question is that we need to heed every Cassandra who has a prophecy of doom. One of the key points of this book is that you can't predict the big disasters. They just don't lie in the realm where you can say with confidence, "There's going to be a pandemic in 2020." You can't really get much beyond "There's going to be a pandemic." There's going to be a big earthquake in California one day, but anybody who tells you with great confidence that they know when it's going to be is probably a snake-oil salesperson.
So I think the wrong approach is to say we need to heed every Cassandra and be prepared for every contingency. That's the kind of thing that bureaucracies find appealing, but in truth, you could waste an unreasonable amount of resources preparing for everything from the asteroid hitting the planet to the zombie apocalypse.
So the right approach, and this is the answer to your question, is to emphasize rapid reaction, because the countries that got this right, or at least did best, Taiwan, South Korea, to some extent Israel, acted very quickly. We were slow, and I think what we need to emphasize is not powers of prophecy but rapidity of reaction. I was very impressed, when I was in Taiwan at the beginning of 2020, by the fact that they were ready for all kinds of problems from China, including election interference, as they were running an election at the time. But they were also quick on the draw when there was this story about a new disease in Wuhan that mysteriously, according to the Chinese authorities, wasn't being transmitted from human to human. They just didn't believe that and acted very swiftly to make sure that they could limit the spread of the virus within Taiwan. So I think that's the key.
Our bureaucracies are very slow in responding because that's really the way they've evolved. Great at the preparedness plan, very bad at executing it. I think that's fixable, but not if we learn the wrong lessons from 2020, which I think we're in the process of doing.
Brian Anderson: The vaccination effort shows, though, I think, that free economies have certain advantages. The US and the UK were really at the forefront of developing the most effective vaccines, and it was thriving and innovative private industries, with government help in this case, that may have provided our exit strategy from the pandemic.
Niall Ferguson: Yeah, if you're going to get one thing right in the pandemic, get vaccination right. As I was writing the book, remember, books aren't like newspapers, so it really was kind of finished in August, and proofs were finalized in, I guess, October. That was before the phase three results came out from Pfizer and Moderna. But my hunch then was, and it proved to be right, that the Western vaccines would be a lot better than the Chinese vaccines. The Chinese promises to save the world with their vaccines I regarded with great, and it turned out justified, skepticism. The Moderna and Pfizer results were even better than I'd expected. But they do illustrate the importance of not having a highly centralized approach to problems of public health, and the fact that there is still a very competitive biotech industry explains why mRNA vaccines exist. Those people who looked longingly at China in mid-2020 and said, "Ah, if only we could be like them," I think really misunderstood the nature of the crisis, which after all had originated in China for a pretty good reason.
Brian Anderson: One of your most interesting chapters is on social networks, and you've written a previous book on this: the structure of social and biological networks affects the transmission of everything from ideas to viruses. I think social media was really instrumental in getting international protests going over racial, or at least perceived racial, injustice in America, while COVID-19 containment efforts involved massive interventions to disrupt the social networks that conveyed the virus. I wonder if there's a way to use network science to minimize the risks of either informational pandemics or biological pandemics.
Niall Ferguson: Well, it's a key question. My last book, The Square and the Tower, was about the monsters we've created that now dominate our public sphere. These are network platforms whose business model—selling ads—necessitates getting people's eyeballs on screens for as long as possible, which leads to algorithms that prioritize fake news and extreme views and conspiracy theories. Now, this was something I was deeply concerned about from 2016 and 2017, when I wrote that book, and I think our failure to address that problem left us very vulnerable to the info-demic that has ultimately made it very difficult for the US to defeat COVID-19. If there is a significant holdout of 25% or so of the population who just won't get vaccinated, it's not clear to me that the US can get to herd immunity, because new variants like the Delta variant coming your way—it's already widespread in the UK—will get to these people, and that's because it's just way more contagious than the original, misnamed "wild" variant.
So that's, I think, a really important part of our story: we had a much, much worse information ecosystem than the Eisenhower administration had to contend with back in 1957, when a similarly sized pandemic struck.
I think the lesson for me, and it's an important lesson, not only about information networks but also about networks of travel and transportation, which are crucial in a pandemic, is that one needs circuit breakers to be in place. Given that contagion produces these very disastrous outcomes in the biological or medical world, we need much better circuit breakers than we seem to have. There should've been a much earlier suspension of travel from Wuhan than actually happened. It was insane that flights were still leaving, direct flights to New York and San Francisco and major European capitals, right up until January the 23rd. That was during the Chinese Lunar New Year holiday, when enormous numbers of people were leaving Wuhan.
So I think the obvious step that we need to take is to think much more about how we can have rapid circuit breakers so that the network can temporarily be disrupted.
The interesting thing about COVID is its super-spreader feature: it has a low dispersion factor. Eighty percent of the spreading is done by about 20% of the infected people, and if you could stop those super-spreaders from doing their work in the early phase of the pandemic, then you had a pretty good shot at containment. So that's one obvious takeaway.
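For readers who want to see what a "low dispersion factor" looks like numerically, here is a minimal simulation sketch; it is not from the book, and the reproduction number and dispersion parameter k below are assumed, round-number values:

```python
# Illustrative sketch: simulate an overdispersed "offspring" distribution of secondary
# infections and check how much of the spreading the top 20% of cases account for.
import numpy as np

rng = np.random.default_rng(0)

R0 = 2.5        # assumed average number of secondary infections per case
k = 0.1         # assumed dispersion parameter; small k means heavy super-spreading
n_cases = 100_000

# Negative binomial with mean R0 and dispersion k, in NumPy's (n, p) parameterization.
p = k / (k + R0)
secondary = rng.negative_binomial(k, p, size=n_cases)

# Rank cases by how many people they infected and take the top 20%.
secondary_sorted = np.sort(secondary)[::-1]
top20 = secondary_sorted[: n_cases // 5]

share = top20.sum() / secondary.sum()
print(f"Share of secondary infections caused by the top 20% of cases: {share:.0%}")
# With k around 0.1, this typically comes out well above 80%: the "80/20" pattern
# described above, which is why breaking up super-spreading events early is so effective.
```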
The second and more tricky thing is what to do about the network platforms. They clearly dominate the public sphere, and they haven't really reformed themselves in any meaningful way, in my view, since 2016. There are lots of bad answers to this question, like, "Oh, let's have an antitrust campaign against them," which is the Biden administration's option. This isn't going to fix anything. It's a complete cul-de-sac, in my view, to try to solve these problems with antitrust.
Another wrong answer is, let's just have a really powerful federal regulator that can squeeze the big tech companies harder. That, again, is highly unlikely to work on the basis of historical experience. So I argue for a kind of combination punch. First, increase the liability of the companies: you have to do something about Section 230 so that they don't simply plead immunity every time anybody tries to sue them for a harm arising from content on the platform. And second, you need some kind of First Amendment right so that people can't be censored arbitrarily on political grounds. I think if both of those things had been in place, the internet would have done less harm than it did in 2020.
Brian Anderson: Well, Niall, I think we're nearing the end of our broadcast time today. I wanted to thank you very much for joining us at the Manhattan Institute and for an excellent discussion. And I want to thank all of the viewers who watched. Niall Ferguson's book Doom: The Politics of Catastrophe is out now. You can get it in all bookstores and on Amazon, of course. If you'd like to hear more conversations like today's or are interested in supporting the Manhattan Institute or City Journal, you can subscribe to MI's newsletters, City Journal itself, of course, or consider making a donation.
Brian Anderson: So thanks again, Niall, and we really appreciate your time today. Fascinating.
Niall Ferguson: Thank you so much, Brian.
Photo by David McNew/Getty Images