In a recent essay in The Atlantic, renowned social psychologist Jonathan Haidt argues that the “share” functions of large social media networks have poisoned American politics. Virality favors the vicious, Haidt writes, as a commanding majority of any Twitter dustup consists of the deluded, the angry, and the anxious. Yet even the lurkers afraid to weigh in on dangerous debates are not representative of the American people as a whole: the Twitter user base overwhelmingly hails from the partisan fringes. This concentration of extremism has a chilling effect on national discourse, as the smart and the centrist, subject to abuse, self-censor their way out of the debate. The fruits of faulty information architecture are Trumpism on the right, wokeness on the left, and distrust in American institutions everywhere else. Haidt urges a host of measures to kill this disease before it turns fatal to our democracy: ending closed primaries, compelling social media companies to verify user accounts, and encouraging kids to spend more time outdoors and less time online.
Like many centrist arguments of our age, Haidt’s essay emphasizes the procedural over the substantive. Missing from the 8,000-word argument, though, is an acknowledgement that meaningful national problems might also account for the fracture of American democracy. Credentialed experts, smart centrists, and small-l liberals of all stripes held the reins of power right up to 2016. They presided over a catastrophic misadventure in Iraq, two decades of defeat in Afghanistan, a hollowed industrial base, an earth-spanning recession, a secret surveillance regime, an opioid epidemic, and stagnating livelihoods in both the black communities of America’s inner-cities and the white communities of America’s rural hinterlands. With events like these, retweets are hardly needed to explain faltering faith in the American creed.
Haidt himself has a broader view of our current social and political fissures; he has written elsewhere about the psychological roots of moral and political disagreement. Still, arguments attributing responsibility for political discord to technology are increasingly common. The sober, expert set that these arguments are marketed to is generally reluctant to acknowledge America’s larger history of decline—much less the role they played in it. Imagine an essay written in 1975 seeking to make sense of New Left radicalism. The story of baby boomer extremism could be told by analyzing networks of professors and students on elite campuses—but if that story did not also mention Vietnam, Watergate, the end of the civil rights movement, or the 1970s energy crisis, the average reader would conclude that the essayist had lost that decade’s thread.
Today, too many baby boomer observers chalk radicalization up to the procedural machinery of twenty-first-century politics instead of their generation’s own leadership failures. The answers to hard questions such as, “How much immigration can America accept before it undermines our political order?” or “What must we sacrifice to bring black America the prosperity and security most Americans take for granted?” cannot be to stall, equivocate, or start talking about the retweet button.
This generational disconnect points to a plausible alternative mechanism for the radical edge that began taking hold in the 2010s. Sociologists have shown that the events of one’s youth are truly formative. The ideas, attitudes, and social pressures of one’s teenage and early adult years have a decisive impact on one’s worldview and political attitudes, even after the conditions that created these pressures have long disappeared. By the time the average American citizen reaches his mid-twenties, his political, social, and religious attitudes are mostly locked in place. Generational churn is thus the engine of social revolution. Cultures do not change when people replace old ideas with new ones; they change when people with new ideas replace the people with old ones.
Often, as is the case with woke ideology, the ideas in question are themselves quite old. Their conquest thus appears both gradual and sudden. Beneath the official comings and goings of generational cohorts above, insurgent ideas and attitudes gradually take hold in the cohorts below. The importance of this revolution in ideas is initially obscured by the inability and inexperience of youth. But the youth do not stay young—eventually, a transition point arrives. Sometimes, this transition will be marked by a great event that old orthodoxies cannot explain. Other times, it is simply a matter of numbers. In either case, the older cohorts suddenly find themselves outnumbered and outgunned, swept away by a flood of ideas that they had assumed were an insignificant trickle.
The last 50 years of American life have been understood through the lens of the baby boomer experience. The boomers’ demographic weight and consumer power allowed them to maintain a hegemony over America’s national institutions longer than most generations in American history ever managed. It was only in the 2010s that this hegemony began to crack. This was the decade when both the boomers began to retire and the millennials entered the workforce en masse. As the demographic balance within national institutions shifted away from the boomers and toward the millennials, institutions such as the New York Times found themselves rent by internal dissension, and isolated younger boomers suddenly faced social pressure to bring their views in line with the new program.
The strength of this alternate explanation for the trajectory of the 2010s is that it makes sense of facts that the social network theory does not account for. A theory of social change built around social media engagement, for example, carries limited explanatory power in a country where most people do not use Twitter. How could the technological hypothesis explain the staid stability (notwithstanding the assassination of Shinzo Abe) of Japan, where 46 percent of the population—and 80 percent of those under 30—have a Twitter account? In contrast, only one in five Americans use Twitter, and, according to a recent Pew study of these Americans, only 35 percent of them log on daily. Even heavy users of the service spend most of their time tweeting into the void: Pew reports that the top 25 percent of American tweeters, as measured by tweet volume, received “an average of just 37 likes and one retweet per month.”
The people who matter on Twitter are a tiny portion of the country—too tiny to credit with a national decline in institutional trust (a decline that began not with social media, but after America’s first nasty summer in Iraq, in 2004). Nor can they be the reason that 62 percent of Americans are afraid to voice their political opinions, 55 percent have admitted to self-censorship sometime over the last year, and 32 percent fear that their political stance might cost them their job. These numbers are far larger than the population of Americans who use Twitter daily—and orders of magnitude larger than the Twitter users with follower counts large enough to fear cancellation.
Intellectuals with large social media followings can forget that the Twitter we experience is nothing like the median Twitter experience. Ninety-three percent of Twitter users have fewer than 100 followers. When these Americans report their fear of speaking their mind, it is not a mob of Internet “anons” that they are afraid of. They fear their real-life social set. When Saturday Night Live decided to satirize the censoriousness of 2010s America, it titled the skit “Dinner Party.” The show’s writers understood that for most of the audience, a gathering of friends at an upscale restaurant is more menacing than a Twitter mob. The battlefield for the American soul is not on social media. It is a war waged in boardrooms and bedrooms, over cocktails or between conference panels, in jokes said at backyard barbecues and in complaints filed to HR. It is in these spaces where dissent goes to die.
Self-censorship in a suffocating information environment is not a new feature in American life. Even the more innocuous decades of our history are a testament to it. In a 1922 issue of Harper’s, Katherine Fullerton Gerould wrote:
No thinking citizen can express in freedom more than a part of his honest convictions. . . . On every hand, free speech is choked off in one direction or another. The only way in which an American citizen who is really interested in all the social and political problems of his country can preserve any freedom of expression, is to choose the mob that is most sympathetic to him, and abide under the shadow of that mob.
Alexis de Tocqueville previewed this theme a century earlier by declaring that he did not “know any country where, in general, less independence of mind and genuine freedom of discussion reign than in America.” Among Americans, “the majority draws a formidable circle around thought,” he wrote. “Inside those limits, the writer is free; but unhappiness awaits him if he dares to leave them.” Those who so dare are “the butt of mortifications of all kinds and of persecutions every day . . . everything is refused him, even glory.” As with the cancelled of today, stepping outside this circle of approved thought means that “those who blame him [will] express themselves openly, and those who think like him, without having his courage, keep silent and move away.” The American dissident is left no choice but to “bend under the effort of each day and return to silence as if he felt remorse for having spoken the truth.”
Tocqueville blamed the narrow range of acceptable speech on the social conditions of democracy. Aristocrats were protected from the force of social judgment by their rank and the economic security of their landed inheritance. With literacy rates high enough to guarantee mass participation in politics, social ranks level enough to preclude aristocratic deference, and wealth spread widely enough to instill an anxious spirit of bourgeois striving, America would never favor the heterodox. It never has.
If cancel culture has always been with us, what has changed over time is what an American can be cancelled for. Twenty years ago, country music bands were being cancelled for opposing the Iraq War and writers were being cancelled for refusing to put American flags on their doorsteps. Today, a different set of offenses leads to ostracization, but the forms and consequences of ostracization are largely the same. The current moment feels so fraught not because cancellation has emerged as a new feature of American life but because of the sudden and dramatic shift in the range of views deemed cancel-worthy. For the first time in their lives, many Americans find themselves on the wrong side of Tocqueville’s circle. This shift is especially disorienting for centrists whose views had—for most of their lives—constituted the very definition of acceptable opinion.
Procedural reforms to social media will not reverse this shift. “Extremists” fill universities, law firms, corporate offices, consultancies, upper-class social circles, media organizations, and both political parties. It is not Facebook that tilts Fox News to the Trumpist right; and the war for the Gray Lady’s future is waged on Slack, not on Twitter. The central questions in American life do not revolve around the size of the radicals’ microphone but the substance of what the radicals are saying—and, one hopes, on the substance of what might be said to rebut them.
The trouble is that the national problems that the extremists fixate on are, for the most part, real. Their solutions are, for the most part, coherent and emotionally compelling. Those who believe these solutions are nevertheless wrongheaded must come out and prove them so. Someone who has determined, for example, that woke politics is destructive should use his wisdom and intelligence to demonstrate why the woke program trends toward disaster—and to provide saner solutions to the problems that wokeness purports to solve. Unlike procedural reforms meant to shore up the control of a passing demographic, these arguments might provide us with the intellectual tools needed to fight off the radicals even after boomer centrists exit the scene.
Photo: BeritK/iStock