AMERICA IS NOT A DEMOCRACY

How the United States lost the faith of its citizens—and what it can do to win them back
By Yascha Mounk
The Atlantic, March 2018

For years, the residents of Oxford, Massachusetts, seethed with anger at the company that controlled the local water supply. The company, locals complained, charged inflated prices and provided terrible service. But unless the town’s residents wanted to get by without running water, they had to pay up, again and again.

The people of Oxford resolved to buy the company out. At a town meeting in the local high-school auditorium, an overwhelming majority of residents voted to raise the millions of dollars that would be required for the purchase. It took years, but in May 2014, the deal was nearly done: One last vote stood between the small town and its long-awaited goal.

The company, however, was not going down without a fight. It mounted a campaign against the buyout. On the day of the crucial vote, the high-school auditorium swelled to capacity. Locals who had toiled on the issue for years noticed many newcomers—residents who hadn’t shown up to previous town meetings about the buyout. When the vote was called, the measure failed—the company, called Aquarion, would remain the town’s water supplier. Supporters of the buyout mounted a last-ditch effort to take a second vote, but before it could be organized, a lobbyist for Aquarion pulled a fire alarm. The building had to be evacuated, and the meeting adjourned. Aquarion retains control of Oxford’s water system to this day.

The company denied that the lobbyist was acting on its behalf when he pulled the alarm; it also denied that its rates were abnormally high or that it provided poor service. Some Oxford residents supported Aquarion, and others opposed the buyout because they feared the cost and complication of the town running its own water company. But many residents, liberal and conservative, were frustrated by the process. The vote, they felt, hadn’t taken place on a level playing field.

“It was a violation of the sanctity of our local government by big money,” Jen Caissie, a former chairman of the board of selectmen in Oxford, told me. “Their messiah is their bottom line, not the health of the local community. And I say that as a Republican, someone who is in favor of local business.”

A New England town meeting would seem to be one of the oldest and purest expressions of the American style of government. Yet even in this bastion of deliberation and direct democracy, a nasty suspicion had taken hold: that the levers of power are not controlled by the people.

It’s a suspicion stoked by the fact that, across a range of issues, public policy does not reflect the preferences of the majority of Americans. If it did, the country would look radically different: Marijuana would be legal and campaign contributions more tightly regulated; paid parental leave would be the law of the land and public colleges free; the minimum wage would be higher and gun control much stricter; abortions would be more accessible in the early stages of pregnancy and illegal in the third trimester.

The subversion of the people’s preferences in our supposedly democratic system was explored in a 2014 study by the political scientists Martin Gilens of Princeton and Benjamin I. Page of Northwestern. Four broad theories have long sought to answer a fundamental question about our government: Who rules? One theory, the one we teach our children in civics classes, holds that the views of average people are decisive. Another theory suggests that mass-based interest groups such as the AARP have the power. A third theory predicts that business groups such as the Independent Insurance Agents and Brokers of America and the National Beer Wholesalers Association carry the day. A fourth theory holds that policy reflects the views of the economic elite.

Gilens and Page tested those theories by tracking how well the preferences of various groups predicted the way that Congress and the executive branch would act on 1,779 policy issues over a span of two decades. The results were shocking. Economic elites and narrow interest groups were very influential: They succeeded in getting their favored policies adopted about half of the time, and in stopping legislation to which they were opposed nearly all of the time. Mass-based interest groups, meanwhile, had little effect on public policy. As for the views of ordinary citizens, they had virtually no independent effect at all. “When the preferences of economic elites and the stands of organized interest groups are controlled for, the preferences of the average American appear to have only a minuscule, near-zero, statistically non-significant impact upon public policy,” Gilens and Page wrote.
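The phrase “controlled for” carries much of the study’s weight, and a small simulation can make it concrete. Because the preferences of average citizens tend to track those of elites, a group can look influential on its own yet have no independent effect once the other groups’ preferences are held fixed. The sketch below, written in Python with simulated data and hypothetical variable names (it is not Gilens and Page’s actual dataset or code), illustrates how a multivariate regression separates those effects.

```python
# A minimal sketch of a Gilens-Page-style analysis. The data are
# simulated and the variable names hypothetical; this is only an
# illustration of the method, not the authors' dataset or model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 1779  # number of policy issues examined in the original study

# Simulated support (0 to 1) for each proposed policy among two groups.
# Average citizens' preferences are assumed to correlate with elite
# preferences, as the study found they do on most issues.
elites = rng.uniform(0, 1, n)
median_voter = np.clip(0.7 * elites + 0.3 * rng.uniform(0, 1, n), 0, 1)

# Generate outcomes in which adoption depends on elite support alone.
p_adopt = 1 / (1 + np.exp(-(-2.5 + 4.0 * elites)))
adopted = rng.binomial(1, p_adopt)

# Looked at alone, average citizens appear highly influential...
naive = sm.Logit(adopted, sm.add_constant(median_voter)).fit(disp=False)
print("median voter alone:", naive.params[1])

# ...but once elite preferences are controlled for, the median-voter
# coefficient collapses toward zero: each coefficient now measures one
# group's influence while the other group's preferences are held fixed.
X = sm.add_constant(np.column_stack([median_voter, elites]))
controlled = sm.Logit(adopted, X).fit(disp=False)
print("median voter, controlling for elites:", controlled.params[1])
print("elites:", controlled.params[2])
```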

Outlets from The Washington Post to Breitbart News cited this explosive finding as evidence of what overeager headline writers called American oligarchy. Subsequent studies critiqued some of the authors’ assumptions and questioned whether the political system is quite as insulated from the views of ordinary people as Gilens and Page found. The most breathless claims made on the basis of their study were clearly exaggerations. Yet their work is another serious indication of a creeping democratic deficit in the land of liberty.

To some degree, of course, the unresponsiveness of America’s political system is by design. The United States was founded as a republic, not a democracy. As Alexander Hamilton and James Madison made clear in the Federalist Papers, the essence of this republic would consist—their emphasis—“IN THE TOTAL EXCLUSION OF THE PEOPLE, IN THEIR COLLECTIVE CAPACITY, from any share” in the government. Instead, popular views would be translated into public policy through the election of representatives “whose wisdom may,” in Madison’s words, “best discern the true interest of their country.” That this radically curtailed the degree to which the people could directly influence the government was no accident.

Only over the course of the 19th century did a set of entrepreneurial thinkers begin to dress an ideologically self-conscious republic up in the unaccustomed robes of a democracy. Throughout America, the old social hierarchies were being upended by rapid industrialization, mass immigration, westward expansion, and civil war. Egalitarian sentiment was rising. The idea that the people should rule came to seem appealing and even natural. The same institutions that had once been designed to exclude the people from government were now commended for facilitating government “of the people, by the people, for the people.”

The shifting justification for our political system inspired important reforms. In 1913, the Seventeenth Amendment stipulated that senators had to be elected directly by the people, not by state legislatures. In 1920, the Nineteenth Amendment gave women the vote. In 1965, the Voting Rights Act, drawing on the Fifteenth Amendment, set out to protect the vote of black Americans. The once-peculiar claim that the United States was a democracy slowly came to have some basis in reality.

That basis is now crumbling, and the people have taken notice. In no small part that’s because the long era during which average Americans grew wealthier has come to a sputtering stop. People who are asked how well they are doing economically frequently compare their own standard of living with that of their parents. Until recently, this comparison was heartening. At the age of 30, more than nine in 10 Americans born in 1940 were earning more than their parents had at the same stage of their lives. But according to eye-popping research by the economist Raj Chetty and his co-authors, many Millennials do not share in this age-old American experience of improving fortunes. Among those Americans born in the early 1980s, only half earn more than their parents did at a similar age.

Americans have never loved their politicians or thought of Washington as a repository of moral virtue. But so long as the system worked for them—so long as they were wealthier than their parents had been and could expect that their kids would be better off than they were—people trusted that politicians were ultimately on their side. Not anymore.

The rise of digital media, meanwhile, has given ordinary Americans, especially younger ones, an instinctive feel for direct democracy. Whether they’re stuffing the electronic ballot boxes of The Voice and Dancing With the Stars, liking a post on Facebook, or up-voting a comment on Reddit, they are seeing what it looks like when their vote makes an immediate difference. Compared with these digital plebiscites, the work of the United States government seems sluggish, outmoded, and shockingly unresponsive.

As a result, average voters feel more alienated from traditional political institutions than perhaps ever before. When they look at decisions made by politicians, they don’t see their preferences reflected in them. For good reason, they are growing as disenchanted with democracy as the people of Oxford, Massachusetts.

The politician who best intuited this discontent—and most loudly promised to remedy it—is Donald Trump. The claim that he would channel the voice of the people to combat a corrupt and unresponsive elite was at the very core of his candidacy. “I am your voice,” Trump promised as he accepted his party’s nomination at the Republican National Convention. “Today, we are not merely transferring power from one administration to another or from one party to another,” he proclaimed in his inaugural address, “but we are transferring power from Washington, D.C., and giving it back to you, the people.”

Donald Trump won the presidency for many reasons, including racial animus, concerns over immigration, and a widening divide between urban and rural areas. But public-opinion data suggest that a deep feeling of powerlessness among voters was also important. I analyzed 2016 data from the American National Election Studies. Those who voted for Trump in the Republican primaries, more than those who supported his competition, said that they “don’t have any say about what the government does,” that “public officials don’t care much what people like me think,” and that “most politicians care only about the interests of the rich and powerful.”

Trump has no real intention of devolving power back to the people. He’s filled his administration with members of the same elite he disparaged on the campaign trail. His biggest legislative success, the tax bill, has handed gifts to corporations and the donor class. A little more than a year after America rebelled against political elites by electing a self-proclaimed champion of the people, its government is more deeply in the pockets of lobbyists and billionaires than ever before.

It would be easy to draw the wrong lesson from this: If the American electorate can be duped by a figure like Trump, it can’t be trusted with whatever power it does retain. To avoid further damage to the rule of law and the rights of the most-vulnerable Americans, traditional elites should appropriate even more power for themselves. But that response plays into the populist narrative: The political class dislikes Trump because he threatens to take its power away. It also refuses to recognize that the people have a point.

America does have a democracy problem. If we want to address the root causes of populism, we need to start by taking an honest accounting of the ways in which power has slipped out of the people’s hands, and then think clearly about the ways in which we can—and cannot—put the people back in control.

At the height of the Mexican–American War, Nicholas Trist traveled to Mexico and negotiated the Treaty of Guadalupe Hidalgo, which ended the hostilities between the two nations and helped delineate America’s southern border. Two decades later, the U.S. government still hadn’t paid him for his services. Too old and weak to travel to Washington to collect the money himself, Trist hired a prominent lawyer by the name of Linus Child to act on his behalf, promising him 25 percent of his recovered earnings.

Congress finally appropriated the money to settle its debt. But now it was Trist who refused to pay up, even after his lawyer sued for his share. Though the contract between Trist and Child hardly seems untoward by today’s standards, the Supreme Court refused to uphold it out of fear that it might provide a legal basis for the activities of lobbyists:

If any of the great corporations of the country were to hire adventurers who make market of themselves in this way, to procure the passage of a general law with a view to the promotion of their private interests, the moral sense of every right-minded man would instinctively denounce the employer and employed as steeped in corruption.

Extreme as this case may appear, it was far from idiosyncratic. In her book Corruption in America, the legal scholar Zephyr Teachout notes that the institutions of the United States were explicitly designed to counter the myriad ways in which people might seek to sway political decisions for their own personal gain. Many forms of lobbying were banned throughout the 19th century. In Georgia, the state constitution at one time read that “lobbying is declared to be a crime.” In California, it was a felony.

Over the course of the 20th century, lobbying gradually lost the stench of the illicit. But even once the activity became normalized, businesses remained reluctant to exert their influence. As late as the 1960s, major corporations did not lobby directly on their own behalf. Instead, they relied on collectives such as the U.S. Chamber of Commerce, which had a weaker voice in Washington than labor unions or public-interest groups. “As every business executive knows,” the future Supreme Court Justice Lewis F. Powell Jr. complained in 1971, “few elements of American society today have as little influence in government as the American businessman.”

All of this began to change in the early 1970s. Determined to fight rising wages and stricter labor and environmental standards, which would bring higher costs, CEOs of companies like General Electric and General Motors banded together to expand their power on Capitol Hill. At first, their activities were mostly defensive: The goal was to stop legislation that might harm their interests. But as the political influence of big corporations grew, and their profits soared, a new class of professional lobbyists managed to convince the nation’s CEOs that, in the words of Lee Drutman, the author of the 2015 book The Business of America Is Lobbying, their activity “was not just about keeping the government far away—it could also be about drawing government close.”

Today, corporations wield immense power in Washington: “For every dollar spent on lobbying by labor unions and public-interest groups,” Drutman shows, “large corporations and their associations now spend $34. Of the 100 organizations that spend the most on lobbying, 95 consistently represent business.”

The work of K Street lobbyists, and the violation of our government by big money, have fundamentally transformed the work—and the lives—of the people’s supposed representatives. Steve Israel, a Democratic congressman from Long Island, was a consummate moneyman. Over the course of his 16 years on Capitol Hill, he arranged 1,600 fund-raisers for himself, averaging one every four days. Israel cited fund-raising as one of the main reasons he decided to retire from Congress, in 2016: “I don’t think I can spend another day in another call room making another call begging for money,” he told The New York Times. “I always knew the system was dysfunctional. Now it is beyond broken.”

A model schedule for freshman members of Congress prepared a few years ago by the Democratic Congressional Campaign Committee instructs them to spend about four hours every day cold-calling donors for cash. The party encourages so many phone calls because the phone calls work. Total spending on American elections has grown to unprecedented levels. From 2000 to 2012, reported federal campaign spending doubled. It’s no surprise, then, that a majority of Americans now believe Congress to be corrupt, according to a 2015 Gallup poll. As Israel memorably put it to HBO’s John Oliver, the hours he had spent raising money had been “a form of torture—and the real victims of this torture have become the American people, because they believe that they don’t have a voice in this system.”

Big donors and large corporations use their largesse to sway political decisions. But their influence goes far beyond those instances in which legislators knowingly sacrifice their constituents’ interests to stay on the right side of their financial backers. The people we spend time with day in and day out shape our tastes, our assumptions, and our values. The imperative to raise so much money means that members of Congress log more time with donors and lobbyists and less time with their constituents. Often, when faced with a vote on a bill of concern to their well-heeled backers, legislators don’t have to compromise their ideals—because they spend so much of their lives around donors and lobbyists, they have long ago come to share their views.

The problem goes even deeper than that. In America’s imagined past, members of Congress had a strong sense of place. Democrats might have risen through the ranks of local trade unions or schoolhouses. Republicans might have been local business or community leaders. Members of both parties lived lives intertwined with those of their constituents. But spend some time reading the biographies of your representatives in Congress, and you’ll notice, as I did, that by the time they reach office, many politicians have already been socialized into a cultural, educational, and financial elite that sets them apart from average Americans. While some representatives do have strong roots in their district, for many others the connection is tenuous at best. Even for members who were born and raised in the part of the country they represent, that place is often no longer their true home. Educated at expensive colleges, likely on the coasts, they spend their 20s and 30s in the nation’s great metropolitan centers. After stints in law, business, or finance, or on Capitol Hill, they move to the hinterlands out of political ambition. Once they retire from Congress, even if they retain some kind of home in their district, few make it the center of their lives: They seem much more likely than their predecessors to pursue lucrative opportunities in cities such as New York, San Francisco, and, of course, Washington. By just about every metric—from life experience to education to net worth—these politicians are thoroughly disconnected from the rest of the population.

The massive influence that money yields in Washington is hardly a secret. But another, equally important development has gone largely unnoticed: More and more issues have simply been taken out of democratic contestation.

In many policy areas, the job of legislating has been supplanted by so-called independent agencies such as the Federal Communications Commission, the Securities and Exchange Commission, the Environmental Protection Agency, and the Consumer Financial Protection Bureau. Once they are founded by Congress, these organizations can formulate policy on their own. In fact, they are free from legislative oversight to a remarkable degree, even though they are often charged with settling issues that are not just technically complicated but politically controversial.

The range of crucial issues that these agencies have taken on testifies to their importance. From banning the use of the insecticide DDT to ensuring the quality of drinking water, for example, the EPA has been a key player in fights about environmental policy for almost 50 years; more recently, it has also made itself central to the American response to climate change, regulating pollutants and proposing limits on carbon-dioxide emissions from new power plants.

While independent agencies occasionally generate big headlines, they often wield their real power in more obscure policy areas. They are now responsible for the vast majority of new federal regulations. A 2008 article in the California Law Review noted that, during the previous year, Congress had enacted 138 public laws. In the same year, federal agencies had finalized 2,926 rules. Such rules run the gamut from technical stipulations that affect only a few specialized businesses to substantial reforms that have a direct impact on the lives of millions. In October 2017, for example, the Consumer Financial Protection Bureau passed a rule that would require providers of payday loans to determine whether customers would actually be able to pay them back—potentially saving millions of people from exploitative fees, but also making it more difficult for them to access cash in an emergency.

The rise of independent agencies such as the EPA is only a small piece of a larger trend in which government has grown less accountable to the people. In the latter half of the 20th century, the Federal Reserve won much greater independence from elected politicians and began to deploy far more powerful monetary tools. Trade treaties, from NAFTA to more-recent agreements with countries such as Australia, Morocco, and South Korea, have restricted Congress’s ability to set tariffs, subsidize domestic industries, and halt the inflow of certain categories of migrant workers. At one point I planned to count the number of treaties to which the United States is subject; I gave up when I realized that the State Department’s “List of Treaties and Other International Agreements of the United States” runs to 551 pages.

Most of these treaties and agreements offer real benefits or help us confront urgent challenges. Whatever your view of their merit, however, there is no denying that they curtail the power of Congress in ways that also disempower American voters. Trade treaties, for example, can include obscure provisions about “investor–state dispute settlements,” which give international arbitration courts the right to award huge sums of money to corporations if they are harmed by labor or environmental standards—potentially making it riskier for Congress to pass such measures.

This same tension between popular sovereignty and good governance is also evident in the debates over the power of the nine unelected justices of the Supreme Court. Since the early 1950s, the Supreme Court has ended legal segregation in schools and universities. It has ended and then reintroduced the death penalty. It has legalized abortion. It has limited censorship on television and the radio. It has decriminalized homosexuality and allowed same-sex marriage. It has struck down campaign-finance regulations and gun-control measures. It has determined whether millions of people get health insurance and whether millions of undocumented immigrants need to live in fear of being deported.

Whether you see judicial review as interpreting the law or usurping the people’s power probably depends on your view of the outcome. The American right has long railed against “activist judges” while the American left, which enjoyed a majority on the Court for a long stretch during the postwar era, has claimed that justices were merely doing their job. Now that the Court has started to lean further right, these views are rapidly reversing. But regardless of your politics, there’s no question that the justices frequently play an outsize role in settling major political conflicts—and that many of their decisions serve to amplify undemocratic elements of the system.

Take Citizens United. By overturning legislation that restricted campaign spending by corporations and other private groups, the Supreme Court issued a decision that was unpopular at the time and has remained unpopular since. (In a 2015 poll by Bloomberg, 78 percent of respondents disapproved of the ruling.) It also massively amplified the voice of moneyed interest groups, making it easier for the economic elite to override the preferences of the population for years to come.

Donald Trump is the first president in the history of the United States to have served in no public capacity before entering the White House. He belittles experts, seems to lack the most basic grasp of public policy, and loves to indulge the worst whims of his supporters. In all things, personal and political, Plato’s disdainful description of the “democratic man” fits the 45th president like a glove: Given to “false and braggart words and opinions,” he considers “insolence ‘good breeding,’ license ‘liberty,’ prodigality ‘magnificence,’ and shamelessness ‘manly spirit.’ ”

It is little wonder, then, that Plato’s haughty complaint about democracy—its primary ill, he claimed, consists in “assigning a kind of equality indiscriminately to equals and unequals alike”—has made a remarkable comeback. As early as 2003, the journalist Fareed Zakaria argued, “There can be such a thing as too much democracy.” In the years since, many scholars have built this case: The political scientist Larry Bartels painstakingly demonstrated just how irrational ordinary voters are; the political philosopher Jason Brennan turned the premise that irrational or partisan voters are terrible decision makers into a book titled Against Democracy; and Parag Khanna, an inveterate defender of globalization, argued for a technocracy in which many decisions are made by “committees of accountable experts.” Writing near the end of the 2016 primary season, when Trump’s ascent to the Republican nomination already looked unstoppable, Andrew Sullivan offered the most forceful distillation of this line of antidemocratic laments: “Democracies end when they are too democratic,” the headline of his essay announced. “And right now, America is a breeding ground for tyranny.”

The antidemocratic view gets at something real. What makes our political system uniquely legitimate, at least when it functions well, is that it manages to deliver on two key values at once: liberalism (the rule of law) and democracy (the rule of the people). With liberalism now under concerted attack from the Trump administration, which has declared war on independent institutions such as the FBI and has used the president’s pulpit to bully ethnic and religious minorities, it’s perhaps understandable that many thinkers are willing to give up a modicum of democracy to protect the rule of law and the country’s most vulnerable groups.

If only it were that easy. As we saw in 2016, the feeling that power is slipping out of their hands makes citizens more, not less, likely to entrust their fate to a strongman leader who promises to smash the system. And as the examples of Egypt, Thailand, and other countries have demonstrated again and again, a political elite with less and less backing from the people ultimately has to resort to more and more repressive steps to hold on to its power; in the end, any serious attempt to sacrifice democracy in order to safeguard liberty is likely to culminate in an end to the rule of law as well as the rule of the people.

The easy alternative is to lean in the other direction, to call for as much direct democracy as possible. The origins of the people’s displacement, the thinking goes, lie in a cynical power grab by financial and political elites. Large corporations and the superrich advocated independent central banks and business-friendly trade treaties to score big windfalls. Politicians, academics, and journalists favor a technocratic mode of governance because they think they know what’s best and don’t want the people to meddle. All of this selfishness is effectively cloaked in a pro-market ideology propagated by think tanks and research outfits that are funded by rich donors. Since the roots of the current situation are straightforwardly sinister, the solutions to it are equally simple: The people need to reclaim their power—and abolish technocratic institutions.

This antitechnocratic view has currency on both ends of the political spectrum. On the far left, the late political scientist Peter Mair, writing about Europe, lamented the decline in “popular” democracy, which he contrasted with a more top-down “constitutional” democracy. The English sociologist Colin Crouch has argued that even anarchy and violence can serve a useful purpose if they seek to vanquish what he calls “post-democracy.”

The far right puts more emphasis on nationalism, but otherwise agrees with this basic analysis. In the inaugural issue of the journal American Affairs, the self-styled intellectual home of the Trump movement, its founder Julius Krein decried “the existence of a transpartisan elite,” which sustains a pernicious “managerial consensus.” Steve Bannon, the former White House chief strategist, said his chief political objective was to return power to the people and advocated for the “deconstruction of the administrative state.”

Mair and Crouch, Krein and Bannon are right to recognize that the people have less and less hold over the political system, an insight that can point the way to genuine reforms that would make our political system both more democratic and better functioning. One of the reasons well-intentioned politicians are so easily swayed by lobbyists, for example, is that their staffs lack the skills and experience to draft legislation or to understand highly complex policy issues. This could be addressed by boosting the woefully inadequate funding of Congress: If representatives and senators were able to attract—and retain—more knowledgeable and experienced staffers, they might be less tempted to let K Street lobbyists write their bills for them.

Similarly, the rules that currently govern conflicts of interest are far too weak. There is no reason members of Congress should be allowed to lobby for the companies they were supposed to regulate so soon after stepping down from office. It is time to jam the revolving door between politics and industry.

Real change will also require an ambitious reform of campaign finance. Because of Citizens United, this is going to be extremely difficult. But the Supreme Court has had a change of heart in the past. As evidence that the current system threatens American democracy keeps piling up, the Court might finally recognize that stricter limits on campaign spending are desperately needed.

For all that the enemies of technocracy get right, though, their view is ultimately as simplistic as the antidemocratic one. The world we now inhabit is extremely complex. We need to monitor hurricanes and inspect power plants, reduce global carbon emissions and contain the spread of nuclear weapons, regulate banks and enforce consumer-safety standards. All of these tasks require a tremendous amount of expertise and a great degree of coordination. It’s unrealistic to think that ordinary voters or even their representatives in Congress might become experts in what makes for a safe power plant, or that the world could find an effective response to climate change without entering cumbersome international agreements. If we simply abolish technocratic institutions, the future for most Americans will look more rather than less dangerous, and less rather than more affluent.

It is true that to recover its citizens’ loyalty, our democracy needs to curb the power of unelected elites who seek only to pad their influence and line their pockets. But it is also true that to protect its citizens’ lives and promote their prosperity, our democracy needs institutions that are, by their nature, deeply elitist. This, to my mind, is the great dilemma that the United States—and other democracies around the world—will have to resolve if they wish to survive in the coming decades.

The choice is not between abolishing all technocratic institutions and merely preserving the ones that exist. We need to build a new set of political institutions that are both more responsive to the views and interests of ordinary people, and better able to solve the immense problems that our society will face in the decades to come.

Writing about the dawn of democracy in his native Italy, the great novelist Giuseppe Tomasi di Lampedusa has Tancredi, a young aristocrat, recognize that he will have to let go of some of his most cherished habits to rescue what is most valuable in the old order: “If everything is to stay the same,” Tancredi says, “everything has to change.” The United States is now at an inflection point of its own. If we rigidly hold on to the status quo, we will lose what is most valuable in the world we know, and find ourselves cast as bit players in the fading age of liberal democracy. Only by embarking on bold and imaginative reform can we recover a democracy worthy of the name.
