Is it really true that America is politically divided between conservative “Red” states in the southern and middle sections of the country and liberal “Blue” states on both coasts? Not exactly: a close look at the district-by-district voting patterns of the coastal states in the 2004 elections brings into crystal-clear focus the real nature of our political divisions. There’s really no such thing as a Blue state—only Blue metropolitan regions. Indeed, the electoral maps of some states that went for John Kerry in 2004 consist mostly of Red suburban and rural counties surrounding deep Blue cities.
What makes these cities so Blue is a multifaceted liberal coalition that ranges from old-style industrial unionists and culturally liberal intellectuals, journalists, and entertainers to tort lawyers, feminists, and even politically correct financiers. But within this coalition, one group stands out as increasingly powerful and not quite in step with the old politics of the Left: those who benefit from an expanding government, including public-sector employees, workers at organizations that survive off government money, and those who receive government benefits. In cities, especially, this group has seized power from the taxpayers, as the vast expansion of the public sector that has taken place since the beginning of the War on Poverty has finally reached a tipping point. In New York City, this coalition has helped roll back some of the reforms of the Giuliani years. In California cities and towns, it is thwarting the expansion of private businesses, Wal-Mart above all. In nearly 100 municipalities, it has imposed higher costs on tens of thousands of businesses by persuading city councils to pass “living-wage” laws.
This increasingly powerful public-sector movement results from the merging of two originally distinct forces. First are the government-employee unions, born in the 1950s and nowadays the 800-pound gorillas of policy debates in many statehouses and city councils. Today, public unions don’t merely use their power to win contract concessions for their members. They help elect sympathetic legislators and defeat proponents of smaller government; they lobby for higher taxes, especially on the rich and on businesses; and they oppose legislative efforts, such as privatization initiatives, aimed at making government smaller and more efficient.
For years, government employees had no right to organize, on the grounds that there was no competition in the delivery of essential government services and that therefore public unions could hold cities and states hostage by going on strike. Even some private-sector union leaders questioned the wisdom of letting public-sector workers organize and giving them the right to strike. But that began to change in the mid-1950s, when the American Federation of State, County, and Municipal Employees (AFSCME) began lobbying for the right of local workers to organize and bargain collectively. The organization scored its first major victory in 1958, when it persuaded New York City mayor Robert Wagner, looking to strengthen his union support, to give municipal workers collective bargaining rights. Over the next several years, other states and cities, especially those with strong union movements, also passed laws allowing public employees to unionize. Buoyed by these victories, AFSCME’s membership rose from 100,000 in 1955 to 250,000 by 1965 and to more than 1 million by 1985.
Other government-employee organizations followed AFSCME’s lead. In 1960 the American Federation of Teachers (AFT) set out to win collective bargaining rights for U.S. teachers, using Mayor Wagner and labor-friendly New York as a test case. Though New York’s first teachers’ walkout, in November 1960, had little public support, the union movement gained adherents among teachers nationwide, so that over the next five years there were 36 strikes against municipal school systems. In 1966 alone, another three dozen strikes occurred, as teacher militancy rose in places like Newark, Baltimore, and Youngstown, Ohio. Meanwhile, membership in the AFT more than doubled, to 136,000, between 1960 and 1966.
In retrospect, most of the warnings voiced in those tumultuous years proved accurate. Political leaders and labor experts predicted that government-employee unions would use their monopoly power over public services to win contracts with work rules far more generous and undemanding than in the private sector, and that without the restraints on salaries and benefits that the free marketplace imposes on private firms, unions would win increasingly meaty compensation and pension packages that would be impossible to roll back once enacted.
But what critics did not anticipate was how far public-employee unions would move beyond collective bargaining and inject themselves into the electoral and legislative processes. Today, the endorsement of a public-sector union is crucial to the election of many local candidates, and public unions now often spend far more on lobbying and political advertising on local issues than any business group does. Nor could the critics have envisioned a time when a shrinking private-sector union movement would forge alliances with the public sector, and when the lines between the two would increasingly blur, as formerly private-sector unions, like the Service Employees International Union (SEIU), would come to represent increasing numbers of health-care and other workers whose jobs depend on public money.
Reinforcing the public-employee unions in the powerful new coalition of tax eaters are the social-services groups spawned by the War on Poverty. Nominally private, they are sustained by and organized around public funding. Before the War on Poverty, most social-services agencies were privately funded and had little stake in government spending policies. Groups like Catholic Charities, for instance, received less than 10 percent of their support from government sources. But all that changed beginning in 1965, when federal spending on social services soared, increasing from $800 million to $2.2 billion between 1965 and 1970, and then rocketing to $13 billion by 1980. This geyser of money transformed many formerly private welfare organizations into government contractors, and their employees into quasi-public workers. It also spurred the creation of vast new networks of such organizations, as social-services entrepreneurs conjured into being a constellation of housing groups, subsidized day-care centers, employment-training programs, health clinics, and much more—all designed to tap into the new War on Poverty money.
This social-services funding vastly expanded the publicly supported workforce almost overnight. Before the 1970s, the government didn’t even count private social services as a sector, because it was so small. But in 1972, a Bureau of Labor Statistics employment census found 550,000 people working in the sector. By 1980, that number had more than doubled to 1.1 million. The sector’s upward arc has continued unabated since then, with especially fast growth during the 1990s. Today the field teems with some 3.3 million workers, most supported by government-funded programs. Whereas in the early 1970s private social services accounted for less than 1 percent of the American workforce, today it accounts for 3 percent of jobs.
Because the clientele for social services is concentrated in the big cities, much of the growth in social-services employment took place there, too. In New York, for example, social-services jobs increased from 52,000 to 183,000 between 1975 and 2000, so that by the end of the millennium more New Yorkers worked in social services than on Wall Street. In Philadelphia, social-services jobs more than doubled to nearly 30,000 from 1988 (the first year for which numbers are available) to 2000. In Boston, during the same period, these jobs increased by 67 percent to nearly 55,000, while in Chicago they grew by nearly 140 percent to 83,000. In all these cases, social-services employment grew much faster than the cities’ economies as a whole. And cities poured their own funds into such programs to augment the (much greater) federal spending, especially in the early 1980s, when the Reagan administration restrained the growth of federal social-services programs. New York City, for instance, increased its spending on programs for the homeless from $8 million in 1978, two years before Reagan beat Jimmy Carter, to $100 million annually by 1985. In the early 1980s, New York State increased its spending on alcohol and drug addiction programs alone by two-thirds to nearly $500 million.
Almost from the War on Poverty’s inception, these social-services employees and their clients began to show themselves to be a powerful political force, as when New York welfare workers mobilized recipients in the early 1970s to storm government offices demanding higher benefits. Some social-services agencies organized their employees and clients into grassroots political operations, parlaying huge empires built on government and foundation money into political power. Ramon Velez, for instance—whose Bronx network of government-funded health centers, alcoholism clinics, and other programs garnered over $300 million in government money over 25 years—engineered the election of several city council and state assembly members in the Bronx and himself served on New York’s city council.
At the same time that the War on Poverty was gearing up and federal spending on social services was beginning to soar, the Johnson administration created the two gigantic health-care programs, Medicaid and Medicare, providing care to the poor and the elderly, respectively. In the process, Washington vastly changed the economics of U.S. medical care, turning it increasingly into a government-funded industry. From the very start, Medicaid and Medicare, initiated within a year of each other, cost far more than anyone had expected, because they encouraged overuse of the health-care system, prompted overbilling by doctors and hospitals, and led to widespread fraud. In less than five years, the federal budget for Medicaid rose to $6 billion from just $1.2 billion in its first year, 1966, while expenditures by the states, which shared the program’s cost, also ballooned. With so much money pouring in, the country’s health-care industry mushroomed. In the entire decade before the federal programs began, U.S. health-care employment had increased by about 800,000 jobs, but in the first ten years of Medicaid and Medicare, the growth rate more than doubled, and the industry added more than 2 million new jobs, including more than 1 million in hospitals.
The new federal programs made many hospitals dependents of the state, especially in cities with the largest Medicaid populations. Within a few years, urban hospitals that had previously received very little federal money were living principally on Medicaid and Medicare. And once government had become a prime payer in the health-care system, hospitals with low occupancy rates or with services that duplicated those of other local institutions could survive on government dollars rather than being forced to close. Such institutions could gold-plate their treatment of patients to increase revenues, so that both hospitalizations and the length of hospital stays rose. As a result, by 1980—to take only one example—experts estimated that New York City had 5,000 more hospital beds than it really needed. Partly as a result, too, health-care jobs grew from 3.9 percent of the U.S. private workforce in 1965 to nearly 10 percent today. Shrinking the bloated system and stemming abuses became politically impossible: above all, what would become of all those who worked in hospitals that should be closed?
The gradual government takeover of health care—a process still continuing—has transformed the industry’s institutions, executives, and workers into lobbyists for ever-greater public monies and expanding programs, and tireless foes of efforts to restrain costs. Hospitals and health-care unions were the chief opponents of the Gingrich Congress’s efforts to balance the federal budget in the mid-1990s in part by cutting the growth of Medicaid and Medicare, and these special interests successfully derailed some of the steepest proposed cuts. At the state and local levels, especially in cities where the industry heavily depends on Medicaid and Medicare, hospitals and hospital workers have become two of the most influential power blocs. In New York State, for instance, a coalition of hospitals and unions spent $13 million in 1999, a record for Albany, lobbying to turn back cuts in the state’s huge Medicaid system. Dennis Rivera, the head of Local 1199, a New York City–based union of health-care workers, has become the most powerful union leader in the state, far more influential than the head of the state AFL-CIO.
The electoral activism of this New New Left coalition—public-employee unions, hospitals and health-care worker unions, and social-services agencies—has reshaped the politics of many cities. As the country’s national political scene has edged rightward, thwarting their ambitions in Washington, these groups have turned their attention to urban America, where they still have the power to influence public policy.
Increasingly in U.S. cities, the road to electoral success passes through the public-employee/health-care/social-services sector. In New York, for instance, more than two-thirds of city council members are former government employees or ex-workers in health care or social services. The first Latino speaker of the California State Assembly, Antonio Villaraigosa—who narrowly lost the 2001 election for mayor of Los Angeles and served as a national co-chair of John Kerry’s presidential campaign—is a former organizer for the Los Angeles teachers’ union. Jane Campbell, the current mayor of Cleveland, snapped up a $3,000 grant back in 1974 to start WomenSpace, a feminist advocacy group, and used her role as executive director of the organization to launch a 20-year career in elective office in Ohio. Kansas City mayor Kay Barnes entered public life working as a paid staffer in the 1960s for the Cross-Lines Cooperative Council, a local social-services network, and she later helped found and run the Women’s Resource Service, an advocacy center on the campus of the University of Missouri–Kansas City.
One reason that these politicians have succeeded electorally is that those who work in the public sector have voting priorities different from those of private-sector workers or business owners. A City Journal exit poll of the 2001 New York mayoral election found that private-sector workers heavily backed Michael Bloomberg, the businessman candidate who had been endorsed by Rudy Giuliani and had run on a pledge of no new taxes (a pledge he broke after his first year in office), while those who worked in the public/health-care/social-services sectors favored his Democratic opponent, who ran on a promise of raising taxes to fund further services. Bloomberg won among private-sector voters by 17 percentage points, while the Democrat won among public/nonprofit-sector workers by 15 points.
And of course public-sector workers, who know that they are going to the polls to elect their own bosses, turn out reliably. Though these workers account for about one-third of New York City’s workforce, public/nonprofit-sector voters made up 37 percent of the electorate in the 2001 mayoral race. Minority workers who earned their living in the public sector were dramatically more likely to vote than their private-sector counterparts.
With so much of their economic future at stake in elections, the tax eaters have emerged as the new infantry of political campaigns, replacing the ward captains and district leaders of the old-time political clubs. Today it’s the members of the New New Left, through their unions and community-based organizations, who are most likely to staff political phone banks, distribute campaign literature, and run get-out-the-vote efforts for their favored candidates. Indeed, when a member of the Local 1199 health-care workers union ran for New York’s city council in 2003, a local Democratic politician noted admiringly that the candidate, through her union colleagues, could field “a million foot soldiers.” And, as with the old Tammany Hall and other urban political machines, these efforts have sparked complaints: members of the New New Left advocacy group ACORN, which ran aggressive voter-registration drives in many cities during the 2004 elections, were accused of submitting fake or forged registrations in places such as Duluth, Cincinnati, and St. Petersburg, Florida.
Perhaps it’s not surprising that the urban Left has evolved into so narrow a movement, promoting little more than its own self-interest. The War on Poverty began as a romantic, if wrongheaded, idea, the child of idealists who really believed that a benevolent, paternalistic government could provide solutions for the poor that America’s private economy couldn’t. But the movement’s most cherished ideals and programs have turned out to be demonstrably wrong, and many Americans now reject them. Unlimited welfare proved an economic and social disaster, producing an underclass of perpetual recipients who, after years on the dole, felt incapable of functioning as productive citizens. Liberalization of the criminal laws and judicial leniency, part of a War-on-Poverty mind-set that saw criminals as victims of society, only led to soaring crime rates, which drove law-abiding citizens out of cities and condemned those who could not leave to lives of fear. Government-funded alcohol and drug rehabilitation programs that placed little emphasis on personal responsibility and individual redemption did nothing to stem the rise of addiction.
By the mid-1990s, Americans were eager for reform, and they got it. Changes in welfare law that imposed time limits on assistance and required recipients to work have turned out to be a great success, reducing public-assistance rolls and getting millions of people back to work, without raising the poverty rate. Tough, activist policing innovations have sharply reduced crime, freeing millions of Americans, especially those in inner cities, from fear.
In the face of such realities, the new urban Left has emerged as an increasingly cynical coalition, ever more focused on goals that benefit its members and their allies even as it retains the jargon of “social justice.” The living-wage movement is largely the work of unions more interested in laws that bolster union membership and derail privatization or productivity-boosting measures than in legislation that genuinely helps the poor. Many of the living-wage laws enacted around the country exempt unionized companies from the wage guidelines, giving firms an incentive to unionize in order to escape the mandate. Legislative bodies commandeered by these advocates have cynically enacted laws that benefit their allies but harm the cities themselves, as when the New York City Council passed a living-wage law that raised the wages of home health-care workers but cost the city and state millions of dollars in the midst of the city’s worst-ever budget crisis. In the same spirit, in municipalities throughout California, the New New Left coalition has successfully advocated laws that restrict consumers’ choices by making it difficult for Wal-Mart and other nonunion retailers to open in places where unionized stores predominate.
But by donning the mantle of “social justice” and invoking the liberation language of the 1960s—for example, in its campaigns to win domestic-partner benefits for municipal workers—the New New Left has managed to dupe a generation of celebrity liberals, idealistic young voters, and religious leaders into becoming its allies in the Blue-state coalition. Actor buddies Matt Damon and Ben Affleck have campaigned for living-wage laws in their home state of Massachusetts, while clergy hold pro-living-wage religious services, blissfully unaware of how unions have hijacked the movement for their own self-interested ends. The union representing television and radio actors urged its members to boycott California stores during the state’s 2003 supermarket-industry strike, and celebrities like actress Melissa Gilbert joined workers on the picket lines. Groups like ACORN rely in many of their campaigns on the volunteer labor of idealistic college students, who are unaware of the controversy the organization generates by refusing to pay its own workers the minimum wage and by using federal legislation like the Community Reinvestment Act to shake down banks.
Regardless of how transparent its aims now seem, this new coalition will remain formidable in the cities, because the tax-eater sector is now so large that it can easily thwart reforms aimed at undermining its programs. But the coalition is also becoming the real power in national campaigns, working both within the Democratic Party and outside it. AFSCME, the AFT, and SEIU were among the largest contributors to the Media Fund, a $65 million advertising effort aimed at defeating President Bush in 2004. Those groups, plus ACORN, also supplied much of the manpower for the national voter-registration effort aimed at defeating the president. A succession of Democratic presidential hopefuls traveled to New York to seek the blessing of 1199/SEIU union chief Dennis Rivera, who once held a seat on the Democratic National Committee, and when John Kerry picked his running mate, he immediately called SEIU boss Andrew Stern, a John Edwards supporter, to say, “I heard you.” About one in ten delegates to the 2004 Democratic National Convention was a member of a teachers’ union.
The tax-eaters’ party has seized control of many of America’s cities; now it is trying to make the next big leap.