In the 1970s and 1980s, supply-side economics offered novel solutions to America’s ills, which included declining productivity, rising inflation, burdensome taxes, and a growing number of citizens dependent on government. Though supply-side advocates promised no magic solutions, they argued that a combination of deregulation, stable monetary policy, lower top tax rates, and reform of welfare programs would revive the American economy. Both political parties embraced at least parts of these reforms, laying the foundation for decades of prosperity.

The United States in 2024 faces many similar problems—and from some of the same causes. Relearning the lessons of supply-side economics is thus essential for restoring America’s economic dynamism. Yet today’s obstacles come in distinct forms. Government is less likely to regulate prices but more inclined to micromanage the everyday activities of citizens. The Federal Reserve is more aware of inflationary dangers but also more committed to directing credit and financing bailouts. Top marginal tax rates at the federal level are lower, but the government has hugely expanded deficit spending. Transfers are no longer confined to the disadvantaged but now creep far up the income ladder and are more concentrated on health care and the elderly.

A new free-market agenda must grapple with these fundamental changes in government and the economy. While embracing the insights of supply-side economics, it must learn how to apply them to a different age. Previous reforms—among them, the broad deregulation of the late 1970s and 1980s, the Fed’s fight against inflation in the early 1980s, the Tax Reform Act of 1986, and the 1996 welfare-reform legislation—brought immense benefits to average Americans. New reforms can deliver similar benefits—if they address the new problems we face.

Many forget, or are unaware of, just how nonsensical the regulations on many of America’s businesses were in the 1970s. Trucks had to get permits from the Interstate Commerce Commission to ship loads from one state to another, and they could do so only at predetermined government rates. If a trucker couldn’t get a permit on the way back, he would have to “deadhead,” that is, drive with an empty trailer. Airlines also had to apply for permits and rates—in their case, to the Civil Aeronautics Board. In deference to incumbent airlines, the board kept ticket prices absurdly high, so carriers tried to compete with perks like steak cooked to order and “piano bars” in the sky. That may sound enticing, but the nosebleed prices excluded most Americans from air travel and proved extraordinarily wasteful. The average plane flew barely half full.

Other regulations kept the economy sluggish and services overpriced. The largest corporation on earth was AT&T, which had maintained a regulated telephone monopoly for almost a century. In the early 1980s, its profits exceeded those of any corporation in history, as consumers had no choice but to use its product. Laws prevented banks from branching and, for a while, even banned new ATMs as a type of “branch.” With little competition, banks followed what was known as the 3-6-3 rule—borrow at 3 percent, lend at 6 percent, and hit the golf course by 3 pm. For incumbents in these regulated industries, life was comfortable and profitable. But consumers paid the price for that complacency.

Beginning in the 1970s, a burst of bipartisan deregulatory fervor swept the nation. Many realized that government regulation was holding back the “supply side,” or the production side, of the economy and raising prices for consumers. Ralph Nader teamed up with free-market economists and Senator Ted Kennedy to push airline deregulation. When one constituent from working-class East Boston reportedly asked the Massachusetts senator why he was working on airlines, explaining that, like many of his neighbors, he had never been able to fly, Kennedy answered: “That’s why I’m holding hearings.”

Soon Congress removed many strictures on trucks, airlines, buses, railroads, banks, and telecommunications—and the benefits became apparent. Over the 30 years following airline deregulation, for instance, the cost per mile flown dropped by half and the number of fliers tripled. Even with a sharp reduction in actual miles of railroad track, the number of tons traveling by rail more than doubled and costs fell. And though some warned that banking deregulation would create an industry dominated by giant monopolists, local competition intensified, as banks, which still number in the thousands in the U.S., could now enter formerly protected markets.

Before deregulation, artificially high prices put airline travel out of most Americans’ reach. The average plane flew barely half full. (WATFORD/Mirrorpix/Getty Images)

In the decades since these supply-side reforms, however, other parts of the American economy have faced tighter controls. Today’s regulations are less likely to use price and entry mandates, known as “economic” regulation, and more likely to govern broad swaths of the American economy via “social” regulation: proliferating paperwork and minute rules on health and safety and various social goals.

As the Regulatory Studies Center at George Washington University shows, the burden of regulation has gotten heavier by various measures. The number of pages in the Code of Federal Regulations has gone from 100,000 in the early 1980s to more than 180,000 today. The budget of regulatory agencies has swelled from under $20 billion to about $70 billion, while staffing at such agencies has doubled, to almost 300,000, from its deregulation-era nadir. Estimates of the annual cost of regulations, just at the federal level, keep rising, ranging as high as $3 trillion annually. Meantime, lawsuits and other federal efforts have pushed companies to focus more on “equity” and on ending “discrimination,” as opposed to production.

The unfortunate effects of these regulatory burdens are visible, in ways large and small. They often involve filling out countless reports and documents. It can take up to ten years to complete government environmental reviews for major infrastructure projects. As I was writing this article, the Department of Agriculture released rules “to add disclosures and information that live poultry dealers engaged in the production of broilers must furnish to poultry growers.” The Food and Drug Administration mandated enhanced disclosure requirements for prescription drugs. Though seemingly a minor reporting issue, the new rule could cost the equivalent of up to $393 million over the next decade, the FDA estimated. The Biden administration has been on an unprecedented regulatory tear, imposing new regulatory costs of about $1.7 trillion, affecting everything from water-heater energy efficiency to nursing-home staffing. The deregulatory champions of yesteryear would be shocked by this proliferation of pettifogging mandates.

The inflationary crisis of the 1970s was a result of bad Federal Reserve policy, even if many in that institution were loath to admit it. Economists at the Fed and elsewhere endorsed a simple Keynesianism and believed that lower interest rates and more money-printing could solve most economic problems. When the money-printing only spurred inflation, even as the economy stagnated—the dreaded economic condition dubbed “stagflation”—these economists were baffled. Many—including then head of the Fed Arthur Burns—said that central banks could do nothing about inflation, which they blamed on greedy businesses’ price-gouging and unions demanding exorbitant wages. Burns’s and others’ belief in such “cost-push” inflation led to disastrous government price controls, triggering everything from shortages on supermarket shelves to long gas lines. By decade’s end, inflation had climbed above 13 percent a year.

Supply-side economists countered that Fed officials were ignoring a stark division in the economy. Supply-side reforms like deregulation and lower tax rates were the only things that would truly spur new production and economic growth. The Federal Reserve and other central banks, by contrast, could control only the “demand side” of the economy—basically, the amount of money and spending. By issuing so much money, the Fed was not spurring growth, as it thought; it was just inciting inflation, as the extra money chased the same amount of goods. When Paul Volcker took the helm of the Fed in 1979, he sought to reorient the institution around the core mandate of controlling demand. By raising interest rates and restraining the money supply, he brought inflation under 2 percent, though not without wrenching short-term pain to the economy. He has since been celebrated as one of the great heroes in the pantheon of central bankers.

“Washington and the Fed have shown ever more solicitude for failed banks of all sorts, as seen in the bailouts of Silicon Valley Bank and Signature Bank.”

In the decades that followed, the Federal Reserve seemed to be doing its job. Inflation stayed low and stable, and recessions were mild. But the 2008 financial crisis and Great Recession changed all that. In 2008, the Fed proved unable to counter a sharp collapse in spending and the greatest unemployment rise in a generation. Then, after the 2020 Covid pandemic, it let inflation hit its highest level since Volcker was in office.

At least the Federal Reserve today knows that it can curb the inflation that it helped cause, an improvement over the 1970s. But its failure to manage both the 2008 crisis and the pandemic’s inflationary spike resulted partly from a shift in focus from stabilizing inflation to picking winners and losers in the economy. Pre–financial crisis, the Fed’s balance sheet was less than $1 trillion, largely made up of short-term government debt. This ballooned to almost $4.5 trillion after that crisis; after the pandemic, it hit almost $9 trillion. The Fed now holds trillions in long-term Treasury and mortgage-backed bonds. In both crises, it established various facilities to aid depositors, money markets, and corporate-bond issuers; during the pandemic, Congress even granted it the power directly to support corporations and municipal borrowers. Over the last year, these risky strategies led the Fed to suffer losses exceeding $100 billion.

The government as a whole has worked, often alongside the Fed, increasingly like a giant bank, expected to support financiers and borrowers of all kinds. Periodic debt-forgiveness campaigns notwithstanding, for instance, federal student loans now total more than $1.5 trillion. Washington supports an expanding array of financial government-sponsored enterprises, including mortgage behemoths Fannie Mae and Freddie Mac. These two institutions alone have accumulated $7.5 trillion in assets, up significantly since they were bailed out during the financial crisis. Washington and the Fed have also shown ever more solicitude for failed businesses and banks of all sorts, as seen in the recent bailouts of Silicon Valley Bank and Signature Bank, which the government estimated would cost about $20 billion.

Political rhetoric in the 1980s and afterward often accused conservatives of “slashing” social programs and “eviscerating” welfare. Yet reforms in social spending in this period were relatively moderate, compared with other supply-side changes. Two pieces of federal legislation sum them up: the 1981 Omnibus Budget Reconciliation Act and the 1996 Personal Responsibility and Work Opportunity Reconciliation Act. These laws sought to slow the continuous, decades-long increase in government domestic spending since the New Deal.

Less than a month after newly elected president Ronald Reagan was shot by a deranged assailant, he addressed a joint session of Congress on his economic plan. Though the 70-year-old had suffered a broken rib, a punctured lung, and internal bleeding in John Hinckley Jr.’s assassination attempt, he came before Congress appearing miraculously hale. Cheers, whistles, and a standing ovation greeted his entrance, and he deftly used the goodwill to push his plans for budget cuts. The resulting budget act reduced spending in areas such as Social Security, Medicare, food stamps, and public housing. The early estimates were that the law would save $130 billion—the most significant budget cut since the defense drawdown after World War II.

The “welfare reform” that culminated in the 1996 legislation signed by President Bill Clinton primarily targeted Aid to Families with Dependent Children, a program that consumed only about 1 percent of the federal budget but was criticized for fostering government dependency and discouraging work among single mothers. As onetime congressional aide Ron Haskins detailed in his 2007 book Work over Welfare, an emerging bipartisan consensus paved the way to a law modestly restraining spending and, more importantly, significantly strengthening work requirements. This shift was reflected in the renaming of the program to Temporary Assistance for Needy Families (TANF). By the end of the century, despite lower welfare disbursements, the resulting growth in employment among former welfare recipients reduced the poverty rate among single mothers by more than 25 percent from its early 1990s peak.

These spending successes were temporary, however. Reagan’s early reductions notwithstanding, total federal spending on domestic programs kept mounting throughout his years in office. Reagan managed to reduce federal domestic spending only slightly as a percentage of Gross Domestic Product (GDP). By 2008, that measure was back to pre-Reagan levels. It has expanded rapidly since then. Meantime, the limitations on TANF haven’t been enough to overcome the expanding number and generosity of other government welfare initiatives. Spending on means-tested welfare, which distributes funds based on family or household income, has gone from about $400 billion in 1990 (in inflation-adjusted dollars) to well over $1 trillion in the years before Covid, with further increases since.

The greatest change in spending since Reagan came into office has been an explosion in government health care. In the 1980s, total government spending on health care was about 3 percent of GDP. It’s now over 8 percent and going higher. In the 1980s, health care made up a bit over 10 percent of the federal budget (excluding interest payments). It’s now almost 30 percent. A big reason for the shift: an aging America, with too few children born. Americans over 65 were 11 percent of the population in 1980; they are 17 percent today and will exceed 20 percent in the next decade. The “birth dearth” also means proportionally fewer young people around to shoulder the burden of future eldercare.

Beyond increased spending for health care and the elderly, government transfers are much more likely to go to the middle class than before. Between the early 1980s and the eve of Covid, payments to non-elderly middle-income households grew from 40 percent to 50 percent of all government transfers, while the share of means-tested transfers going to such households rose from 25 percent to 50 percent. In an earlier era, supply-siders pointed to poor adults and single mothers as serious drains on the public fisc. In 2024, the main burden is retired and middle-class adults, with both groups absorbing expansive funding for an ever less efficient government health-insurance system. It’s not obvious that these groups truly benefit from these programs, since they also pay substantial taxes for this expanded welfare state.

Ronald Reagan ran for president as a tax-cutter because he had experienced how high taxes can dampen ambition. “When I was in the movies,” he explained, “I’d reach a point each year where, after the second movie, I’d be in the 90 percent bracket. So I wouldn’t make any more movies that year. And it wasn’t just me, but [Humphrey] Bogart and [Clark] Gable.” He wanted to cut taxes to give more people a reason to work harder and longer.

At the time, in 1980, the top federal marginal tax rate was 70 percent—an obvious disincentive to work. Reagan’s 1981 cuts reduced that to 50 percent. And his second-term 1986 tax reform slashed the rate to 28 percent, even as government revenues remained stable, thanks to the elimination of loopholes, deductions, and credits in the tax code—“tax expenditures,” as they’re known—worth almost 3 percent of GDP. For a while, it looked as though the era of high tax rates was over for good. Yet federal politicians have since pushed the top federal rate back up to around 40 percent. Tax expenditures also started growing again after the 1986 reforms.

Though an inveterate tax-cutter, Reagan wanted to reduce the deficit, too. In a 1985 speech, he talked in near-apocalyptic terms about the deficit, contending that the United States faced a “rendezvous with history,” where the “threads of our past, present, and future as a nation will soon converge on the single overriding question. . . . Can we at last, after decades of drift, neglect, and excess, put our fiscal house in order?” That year, he signed the Gramm-Rudman-Hollings Act, which aimed to keep spending in line with taxes. It did little to restrain spending, though, and neither did subsequent efforts. Indeed, the greatest failure of Reagan’s presidency, which he acknowledged in his farewell speech from the Oval Office, was letting the deficit explode.

America’s deficit has only worsened since. In the early 1980s, the deficit exceeded 5 percent of GDP for the first time in the postwar period. Beginning in the 2000s, new crises brought even larger deficits.

After the financial crisis, the deficit neared 10 percent of GDP; during the pandemic, it almost hit 15 percent. These shortfalls have added to America’s swelling debt, the accumulated total of past deficits. Our debt was around 25 percent of GDP in the 1970s, a postwar low; it is almost 100 percent of GDP today. The interest costs on this borrowed money now exceed what the government spends on defense. While the combination of high domestic spending and high tax rates once presented the biggest threat to economic growth, America’s debt mountain looms largest today.

The Biden administration has been on an unprecedented regulatory tear, imposing new costs of about $1.7 trillion. (Geopix/Alamy Stock Photo)

The supply-side victory of the past was substantial in the areas that it sought to reform. Price controls, once widespread, are now rare across economic sectors, though efforts to revive them recur. The Federal Reserve now actively combats inflation, no longer believing that it’s powerless to do so. Many ineffective welfare initiatives have been overhauled to include work requirements, and top marginal tax rates remain substantially lower than in the past. Yet ever-greater government intervention in other areas has more than offset these previous gains.

What would a twenty-first-century supply-side agenda look like?

Start with reining in the federal bureaucracy. Reformers once aimed to remove regulatory power from the courts and Congress, which had demanded more onerous rules, and instead place it with the executive branch. Presidents Jimmy Carter and Ronald Reagan pursued deregulation, so the move toward executive authority then made sense. The famous, or infamous, Chevron v. NRDC case of 1984, which gave regulators carte blanche to decide the limits of their own power, actually involved the Supreme Court’s approval of a deregulatory effort by the Environmental Protection Agency against green activists who wanted the courts to tighten air-pollution rules.

Things are different four decades later. Executive-branch regulators have become too numerous, independent, and politicized for even deregulatory presidents to control. Since these bureaucrats now handle fewer questions about simple price and quantity (e.g., “Should airline tickets go up or down 5 percent?”) and more about how far their own reach extends (e.g., “Can the EPA control occasional puddles that form in people’s yards?”), courts need to enforce limits on these agencies’ authority. The Supreme Court recently struck down the so-called Chevron doctrine, but courts can do more to supervise Congress’s delegation of power to bureaucrats. Congress should also consider new laws in the spirit of the REINS Act, which the House passed in 2023, requiring congressional approval of any regulation with a significant economic effect.

The Volcker-era Federal Reserve restored the primacy of price stability, but the Fed has subsequently extended its ambit to many new areas. Congress should get the central bank back to its core mission by giving it a clear mandate to keep prices and aggregate demand stable. The Fed chair should suffer the consequences if that mandate isn’t met. Congress should require the agency to keep a more limited balance sheet, instead of continuing indefinitely to buy and hold trillions in bonds.

Congress has shown little desire to limit vast entitlement spending on health care and the elderly. But constraining the welfare state’s growth is still possible. For instance, instead of extending health-care subsidies to high earners, as President Biden’s American Rescue Plan did, benefits could be directed more carefully to the poor. In recent years, Congress has often put overall spending caps on discretionary programs—those that require regular appropriations—and this should continue. But such programs now represent just 25 percent of the federal budget. Social Security, Medicare, and Medicaid entitlements alone now consume over 40 percent of the budget. At some point, we will have to cap these programs’ growth.

The current level of debt and deficits restricts the government’s ability to cut taxes without reducing spending. Still, opportunities for reform remain. Tax expenditures consume over 6 percent of GDP, lower than before Reagan’s 1986 reforms, yet still substantial. Congress could further reduce tax rates by limiting large individual deductions, as demonstrated by the recent caps on state and local tax and mortgage-interest deductions in the 2017 Tax Cuts and Jobs Act.

Simultaneously reducing taxes and government transfer payments could yield benefits without growing the deficit. I have elsewhere calculated, for example, that nearly 20 percent of U.S. transfer payments effectively return to the government as tax revenue from the same households within the same year. Eliminating these redundant “netting taxes” would lighten the financial load on households and the administrative burden on the government, with no adverse impact on the budget.
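To make the arithmetic concrete, here is a minimal sketch in Python using hypothetical figures for a single household; the numbers are illustrative assumptions, not data from the calculation cited above.

```python
# Illustrative only: hypothetical figures for one household, showing how a
# transfer and the taxes paid back in the same year "net out."

transfers_received = 20_000   # hypothetical annual government transfers
taxes_paid_back = 4_000       # hypothetical taxes the same household pays that year

# Portion of the transfer that simply round-trips back to the government.
overlap = min(transfers_received, taxes_paid_back)
netting_share = overlap / transfers_received
net_support = transfers_received - overlap

print(f"Netting share: {netting_share:.0%}")   # 20%
print(f"Net support:   ${net_support:,.0f}")   # $16,000

# Trimming both the transfer and the tax by the overlapping $4,000 would leave
# the household's net position and the government's budget unchanged, while
# shrinking the gross flows that must be collected and administered.
```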

Other challenges have arisen. U.S. average tariff rates have more than halved since the 1980s, despite sporadic trade conflicts, but one of our major trading partners today is also our greatest geopolitical foe: the People’s Republic of China. Meanwhile, union membership in the private sector has dwindled from about 20 percent in 1980 to 6 percent today, yet unions remain strong in the public sector, where they represent one-third of workers and account for half of all union members. Supply-side reformers need to respond to these and other new realities.

Although at the moment prominent members of both parties denigrate the supply-side movement, we cannot forget the movement’s achievements from the 1970s through the 1990s. With substantial bipartisan support, policymakers successfully addressed many seemingly insuperable problems, such as high inflation and high tax rates. The entrepreneurial dynamism that those reforms unleashed was swift and transformative. Americans still have the same entrepreneurial spirit. We just need to remove the new barriers holding it back.

Top Photo: Ronald Reagan ran for president as a tax-cutter because he had experienced how high taxes can dampen ambition. (Diana Walker/Getty Images)
