Where Do Bad Ideas Come From?

And why don't they go away?

By Stephen M. Walt, a columnist at Foreign Policy and the Robert and Renée Belfer professor of international relations at Harvard University.
Illustration by Aaron Goodman for FP

We would all like to think that humankind is getting smarter and wiser and that our past blunders won’t be repeated. Bookshelves are filled with such reassuring pronouncements, from the sage advice offered by Richard Neustadt and Ernest May in Thinking in Time: The Uses of History for Decision Makers to the rosy forecasts of Matt Ridley’s The Rational Optimist: How Prosperity Evolves, not to mention Francis Fukuyama’s famously premature claim that humanity had reached "the end of history." Encouraging forecasts such as these rest in part on the belief that we can learn the right lessons from the past and cast discredited ideas onto the ash heap of history, where they belong.

Those who think that humanity is making steady if fitful progress might point to the gradual spread of more representative forms of government, the largely successful campaign to eradicate slavery, the dramatic improvements in public health over the past two centuries, the broad consensus that market systems outperform centrally planned economies, or the growing recognition that action must be taken to address humanity’s impact on the environment. An optimist might also point to the gradual decline in global violence since the Cold War. In each case, one can plausibly argue that human welfare improved as new knowledge challenged and eventually overthrew popular dogmas, including cherished but wrongheaded ideas, from aristocracy to mercantilism, that had been around for centuries.

Yet this sadly turns out to be no universal law: There is no inexorable evolutionary march that replaces our bad, old ideas with smart, new ones. If anything, the story of the last few decades of international relations can just as easily be read as the maddening persistence of dubious thinking. Like crab grass and kudzu, misguided notions are frustratingly resilient, hard to stamp out no matter how much trouble they have caused in the past and no matter how many scholarly studies have undermined their basic claims.

Consider, for example, the infamous "domino theory," which has been kicking around in one form or another since President Dwight D. Eisenhower’s 1954 "falling dominoes" speech. During the Vietnam War, plenty of serious people argued that a U.S. withdrawal from Vietnam would undermine America’s credibility around the world and trigger a wave of pro-Soviet realignments. No significant dominoes fell after U.S. troops withdrew in 1975, however, and it was the Berlin Wall that eventually toppled instead. Various scholars examined the domino theory in detail and found little historical or contemporary evidence to support it.

Although the domino theory seemed to have been dealt a fatal blow in the wake of the Vietnam War, it has re-emerged, phoenix-like, in the current debate over Afghanistan. We are once again being told that if the United States withdraws from Afghanistan before achieving a clear victory, its credibility will be called into question, al Qaeda and Iran will be emboldened, Pakistan could be imperiled, and NATO’s unity and resolve might be fatally compromised. Back in 2008, Secretary of State Condoleezza Rice called Afghanistan an "important test of the credibility of NATO," and President Barack Obama made the same claim in late 2009 when he announced his decision to send 30,000 more troops there. Obama also justified his decision by claiming that a Taliban victory in Afghanistan would spread instability to Pakistan. Despite a dearth of evidence to support these alarmist predictions, it’s almost impossible to quash the fear that a single setback in a strategic backwater will unleash a cascade of falling dominoes.

There are other cases in which the lessons of the past — sadly unlearned — should have been even more obvious because they came in the form of truly devastating catastrophes. Germany’s defeat in World War I, for example, should seemingly have seared into Germans’ collective consciousness the lesson that trying to establish hegemony in Europe was almost certain to lead to disaster. Yet a mere 20 years later, Adolf Hitler led Germany into another world war to achieve that goal, only to suffer an even more devastating defeat.

Similarly, the French experience in Vietnam and Algeria should have taught American leaders to stay out of colonial independence struggles. In fact, French leaders warned Lyndon B. Johnson that the United States would lose in Vietnam, but the U.S. president ignored their advice and plunged into a losing war. The resulting disastrous experience in Vietnam presumably should have taught future presidents not to order the military to do "regime change" and "nation-building" in the developing world. Yet the United States has spent much of the past decade trying to do precisely that in Iraq and Afghanistan, at great cost and with scant success.

Why is it so hard for states to learn from history and, especially, from their own mistakes? And when they do learn, why are some of those lessons so easily forgotten? Moreover, why do discredited ideas come back into fashion when there is no good reason to resurrect them? Clearly, learning the right lessons — and remembering them over time — is a lot harder than it seems. But why?

THE LIMITS OF KNOWLEDGE

For starters, even smart people with good intentions have difficulty learning the right lessons from history because there are relatively few iron laws of foreign policy and the facts about each case are rarely incontrovertible.

And unfortunately, the theories that seek to explain what causes what are relatively crude. When a policy fails, reasonable people often disagree about why success proved elusive. Did the United States lose in Vietnam because the task was inherently too difficult, because it employed the wrong military strategy, or because media coverage undermined support back home? Interpreting an apparent success is no easier: Did violence in Iraq decline in 2007 because of the "surge" of U.S. troops, because al Qaeda affiliates there overplayed their hand, or because ethnic cleansing had created homogeneous neighborhoods that made it harder for Shiites and Sunnis to target each other? The implications for today depend on which of these interpretations you believe, which means that consensus about the "lessons" of these conflicts will be elusive and fragile.

What’s more, even when past failures have discredited a policy, those who want to resurrect it can argue that new knowledge, new technology, or a clever new strategy will allow them to succeed where their predecessors failed. For more than 20 years, for example, a combination of academic economists and influential figures in the finance industry convinced many people that we had overcome the laws of economic gravity — that sophisticated financial models and improved techniques of risk management like financial derivatives allowed governments to relax existing regulations on financial markets. This new knowledge, they argued, permitted a vast expansion of new credit with little risk of financial collapse. They were tragically wrong, of course, but a lot of smart people believed them.

Similarly, the Vietnam War did teach a generation of U.S. leaders to be wary of getting dragged into counterinsurgency wars. That cautious attitude was reflected in the so-called Powell doctrine, which dictated that the United States intervene only when its vital interests were at stake, rely on overwhelming force, and identify a clear exit strategy in advance. Yet after the U.S. military routed the Taliban in 2001, key figures in President George W. Bush’s administration became convinced that the innovative use of special forces, precision munitions, and high-tech information management (together dubbed a "revolution in military affairs") would enable the United States to overthrow enemy governments quickly and cheaply and avoid lengthy occupations, in sharp contrast to past experience. The caution that inspired the Powell doctrine was cast aside, and the result was the war in Iraq, which dragged on for almost eight years, and the war in Afghanistan, where the United States seems mired in an endless occupation.

STRONG BUT FOOLISH STATES

All countries have obvious incentives to learn from past mistakes, but those that have successfully risen to the status of great powers may be less inclined to adapt quickly in the future. When it comes to learning the right lessons, paradoxically, nothing fails like prior success.

At first glance, this makes little sense. After all, strong and wealthy states can afford to devote a lot of resources to analyzing important foreign-policy problems. But then again, when states are really powerful, the negative consequences of foolish behavior rarely prove fatal. Just as America’s "Big Three" automakers were so large and dominant they could resist reform and innovation despite ample signs that foreign competition was rapidly overtaking them, strong and wealthy states can keep misguided policies in place and still manage to limp along for many years.

The history of the Soviet Union offers an apt example of this phenomenon. Soviet-style communism was woefully inefficient and brutally inhumane, and its Marxist-Leninist ideology both alarmed the capitalist world and created bitter schisms within the international communist movement. Yet the Soviet Union survived for almost 70 years and was one of the world’s two superpowers for more than four decades. The United States has also suffered serious self-inflicted wounds on the foreign-policy front in recent decades, but the consequences have not been so severe as to compel a broader reassessment of the ideas and strategies that have underpinned many of these mistakes.

The tendency to cling to questionable ideas or failed practices will be particularly strong when a set of policy initiatives is bound up in a great power’s ruling ideology or political culture. Soviet leaders could never quite abandon the idea of world revolution, and defenders of British and French colonialism continued to see it as the "white man’s burden" or "la mission civilisatrice." Today, U.S. leaders remain stubbornly committed to the goals of nation-building and democracy promotion despite their discouraging track record with these endeavors.

Yet because the universal ideals of liberty and democracy are core American principles, it is hard for U.S. leaders to acknowledge that other societies cannot be readily remade in America’s image. Even when U.S. leaders recognize that they cannot create "some sort of Central Asian Valhalla," as Defense Secretary Robert Gates acknowledged in 2009, they continue to spend billions of dollars trying to build democracy in Afghanistan, a largely traditional society that has never had a strong central state, let alone a democratic one.

CLOSED SOCIETIES AND CLOSED MINDS

In theory, democracies like the United States should have a built-in advantage. When governments stifle debate and restrict the public’s access to information, bogus ideas and misguided policies are less likely to be exposed and either corrected or abandoned. In his masterful study of human-induced folly, Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed, James Scott argues that great man-made disasters arise when authoritarian governments pursue radical social transformations that are based on supposedly "scientific" principles such as Marxism-Leninism or Swiss architect Le Corbusier’s "urban modernism." Because such schemes epitomize a certain notion of "progress" and also enhance central control, ambitious political leaders are understandably drawn to them. But because authoritarian regimes routinely suppress dissent, these same leaders may not learn that their ambitious schemes are failing until it is too late to prevent catastrophe. In the same vein, Nobel-winning economist Amartya Sen famously argued that authoritarian regimes are more prone to mass famines because such regimes lack the accountability and feedback mechanisms that give rulers a strong incentive to identify and correct mistakes in a timely manner.

Yet democracies, though they conceal less information and hold leaders more accountable, are hardly immune to similar pathologies. We often assume that open discourse and democratic debate will winnow out foolish policies before they can do much damage, but the "marketplace of ideas" that is supposed to perform that function is far from perfect. In separate accounts of the Bush administration’s successful campaign to sell the invasion of Iraq, political scientist Chaim Kaufmann and New York Times columnist Frank Rich showed how easily democratic leaders can convince skeptical publics to go to war. Bush and his colleagues built support for the invasion by framing options in deliberately biased ways, manipulating a highly deferential media, and exploiting their control over classified information. The result was a nearly nonexistent debate about the wisdom of the war, with the deck heavily stacked in the president’s favor.

The same strategy works to shield leaders from accountability: By concealing information behind walls of secrecy and classification, democratic as well as nondemocratic governments can cover up embarrassing policy failures and make it difficult to learn the right lessons from past mistakes. If citizens and scholars do not know what government officials have done and what the subsequent consequences of their actions are, it is impossible for them to assess whether those hidden policies made sense. To take an obvious current example, it is impossible for outside observers to evaluate the merits of U.S. drone attacks on suspected al Qaeda leaders without detailed information about the actual success rate, including the number of missed targets and innocent civilians killed, as well as the effects of those deaths on terrorist recruitment and anti-American attitudes more generally. When Pentagon officials tell us that increased drone strikes are working, we just have to take them at their word. Maybe they’re right, but if they aren’t, we won’t know until long after the damage has been done.

The same problem can also arise when information is widely available but a subject is considered taboo and thus outside the boundaries of acceptable public discourse. John Mearsheimer and I argue that U.S. policy in the Middle East suffers from this problem because it is nearly impossible for American policymakers and politicians to question Washington’s "special relationship" with Israel or criticize Israeli policy without triggering a hostile reaction, including being smeared as an anti-Semite or a "self-hating Jew." Ironically, making it harder for U.S. officials to tell Israel when its actions are misguided is harmful to Israel; a more open discourse and a more normal relationship would be better for both countries.

A similar taboo seems to be emerging in the realm of civil-military relations. With the United States mired in two lengthy conflicts, American politicians feel a need to constantly reiterate their support for "the troops" and their respect for the generals who run our wars, especially media-savvy commanders like Gen. David Petraeus. Criticizing the military would invite others to question one’s patriotism and therefore is out of bounds. This trend is not healthy because civilians who are overly deferential to the military are unlikely to question military advice, even when it might be bad for the troops as well as the country. But generals are as fallible as the rest of us and should not receive a free pass from their civilian counterparts.

In short, whenever it becomes politically dangerous to challenge prevailing orthodoxies, misplaced policies are more likely to go unquestioned and uncorrected. Wouldn’t it have been better if more well-placed people had objected to the U.S. decision to build massive nuclear overkill (including 30,000-plus nuclear warheads) during the Cold War, questioned the enduring fears of "monolithic communism" and Soviet military superiority, or challenged the wisdom of three decades of financial deregulation? Some did express such qualms, of course, but doing so loudly and persistently was a good way to find oneself excluded from the political mainstream and certainly from the highest corridors of power.

CUI BONO? BAD IDEAS COME FROM SOMEWHERE

Perhaps the most obvious reason why foolish ideas persist is that someone has an interest in defending or promoting them. Although open debate is supposed to weed out dubious ideas and allow facts and logic to guide the policy process, it often doesn’t work that way. Self-interested actors who are deeply committed to a particular agenda can distort the marketplace of ideas.

A case in point is the long-standing U.S. embargo on Cuba, imposed in 1960 with the purpose of toppling Fidel Castro. It is hard to think of a better example of a failed policy that has remained in place for decades despite clear evidence that it is not just a failure, but actively counterproductive. If the embargo were going to bring Castro down, it surely would have happened by now, yet it is kept alive by the political influence of the Cuban-American lobby. Protectionist tariffs and farm subsidies illustrate the same problem. Every undergraduate economics major knows that these programs waste money and reduce overall consumer welfare; yet farmers, factory owners, and labor unions threatened by foreign competition routinely demand subsidies or protection, and they too often receive it. The same thing is true for costly initiatives like ballistic-missile defense, which has been assiduously promoted by aerospace and defense contractors with an obvious interest in getting the Pentagon to fill their coffers at public expense — never mind that it might not actually work.

Even in areas where there is a clear scientific consensus, like climate change, public discourse has been distorted by well-organized campaigns to discredit the evidence and deny that any problem exists. Not surprisingly, those whose economic interests would be hurt if we significantly reduced our reliance on fossil fuels have aggressively funded such campaigns.

In the United States, this problem of self-interested individuals and groups interfering in the policy process appears to be getting worse, in good part because of the growing number of think tanks and "research" organizations linked to special interests.

Organizations like the American Enterprise Institute, the Center for a New American Security, the Washington Institute for Near East Policy, and the Center for American Progress — to name but a few — are not politically neutral institutions, in that their ultimate purpose is to assemble and disseminate arguments that advance a particular worldview or a specific policy agenda. The people who work at these institutions no doubt see themselves as doing serious and objective analysis — and many probably are — but such organizations are unlikely to recruit or retain anyone whose research challenges the organization’s central aims. Their raison d’être, after all, is the promotion of policies favored by their founders and sponsors.

In addition to advocating bad ideas even after they have been found wanting, many of these institutions also make it harder to hold public officials accountable for major policy blunders. For example, one would think that the disastrous war in Iraq would have discredited and sidelined the neoconservatives who dreamed up the idea and promoted it so assiduously. Once out of office, however, they returned to friendly think tanks and other inside-the-Beltway sinecures and resumed their efforts to promote the discredited policies they had favored when they were in government. When a country’s foreign-policy elite is insulated from failure and hardly anyone is held accountable, it will be especially difficult to learn from the past and formulate wiser policies in the future.

The rise of the Internet and blogosphere may have facilitated more open and freewheeling public debate about controversial issues, and websites like YouTube and WikiLeaks have fostered greater transparency and made the marketplace of ideas somewhat more efficient. In the blogosphere, at least, it is no longer taboo to talk critically about the "special relationship" with Israel, even if politicians and mainstream media figures remain reticent.

Nevertheless, there is a downside to these encouraging developments. The proliferation of websites and cable news outlets encourages some people to consume only the news and analysis that reinforce their existing views. Thus, a 2010 survey by the Pew Research Center for the People and the Press found that 80 percent of those who regularly listen to radio host Rush Limbaugh or watch Fox News’s Sean Hannity are conservatives, even though conservatives make up only 36 percent of the U.S. population. Similarly, the audience for MSNBC’s Keith Olbermann and Rachel Maddow contains nearly twice the share of liberals found in the general public.

Moreover, competition between a growing number of news outlets seems to be fostering a media environment in which reasoned discourse matters less than entertainment value. Anyone who thinks that major issues of public policy should be dealt with on the basis of logic and evidence cannot help but be alarmed by the growing prominence of Glenn Beck and the know-nothing defiance of the Tea Party.

THE UNITED STATES OF AMNESIA

Last but not least, discredited ideas sometimes come back to life because societies simply forget important lessons about the past. Political psychologists generally agree that personal experiences have a disproportionate impact on our political beliefs, and lessons learned by older generations rarely resonate as strongly with their successors. Besides, as the years go by it becomes easier to argue that circumstances have changed and that "things are different now," encouraging the often wrong-headed view that earlier lessons about how to deal with particular problems no longer apply. Of course, sometimes those arguments will be correct — there are few timeless verities in political life — and even seemingly unassailable truths might someday be seriously challenged if not discredited. All this just further complicates the problem of learning and retaining the right lessons from the past.

Regrettably, there is no hope of ever making the learning process work smoothly and flawlessly — all the more reason to remain wary of firmly entrenched conventional wisdoms, whatever their source, and to relentlessly question our own judgments about the past as well.

For it just might turn out that a radically different version of events is the correct one, closer to the truth than our present reading of the past. Vigorous, unfettered, yet civil debate remains the most reliable mechanism for acquiring greater wisdom for the future. In the long run we are all dead, as John Maynard Keynes memorably quipped, but humanity could at least get something out of it.

