Bush thought his father lacked a grand doctrine. His greatest failures have come from trying to craft one.
There’s some support for the dynastic reading that George W. Bush intended to invade Iraq from the outset of his presidency to avenge his father. “After all, this is a guy that tried to kill my dad at one time,” Bush declared at a political fund-raiser in Houston in September 2002. Considerable doubt has since arisen around the incident Bush was referring to, a supposed plot by Saddam to blow up the former president with a car bomb on a visit to Kuwait in 1993. But there’s little doubt that Bush himself believed what intelligence officials told the family after that incident: that Saddam planned to murder not just George W.’s father, but the other family members visiting Kuwait with him: his mother, Barbara, his wife, Laura, and his two youngest brothers, Neil and Marvin. The incident cast a long shadow in the family. According to family intimates, the Bushes felt they were at risk so long as Saddam remained in power.
Yet of the top-level players in the administration, only Paul Wolfowitz directly advocated military action against Iraq before September 11. From the collective perspective of Bush’s foreign-policy team, Iraq fell into the category of big problems that weren’t urgent. His people were instinctually critical of Clinton’s proportionate responses to Saddam’s provocations and felt they might have to act more decisively at some point in the future. But the same category of problem also included North Korea and Pakistan’s nuclear programs, Russia’s growing authoritarianism, and China’s belligerence toward Taiwan. There were no preparations or significant planning for war in Iraq until September 2002 and no point-of-no-return buildup until January 2003.
In other words, George W. Bush did not arrive in the White House determined to invade Iraq. So why did he ultimately decide to do it? Bush’s struggle to vindicate his family and outdo his father predisposed him toward completing a job his dad left unfinished. But it was his broader attempt to develop a foreign policy different from his father’s that led him into his biggest mistake. Act One of the Bush Tragedy is the son’s struggle to be like his dad until the age of forty. Act Two is his growing success over the next fifteen years as he learned to be different. The botched search for a doctrine to clarify world affairs and the president’s progressive descent into messianism constitute the conclusive third act.
Bush Doctrine 1.0 was Unipolar Realism (3/7/99–9/10/01). Driven more by the refutation of Clinton’s liberal internationalism than of 41’s diplomatic realism, it challenged his father’s worldview only obliquely. Bush steered clear of his father’s men, former Secretary of State James Baker and former National Security Adviser Brent Scowcroft, but was advised by Colin Powell and Scowcroft’s protégée Condoleezza Rice, who was likewise grounded in classic “balance of power” realism.
He was a realist with a different list of things to do, a harder shell, and less use for the “smiles and scowls of diplomacy.” In his first eight months he showed how much less. Bush declared his intention to abrogate the ABM Treaty and move ahead with developing missile defense. Where his father was a Sinophile, the son saw a growing military threat. He talked tough when the Chinese forced down a U.S. surveillance plane they claimed had violated their airspace and detained its crew. He spoke ambiguously about whether he supported continuing the long-standing policy of “strategic ambiguity” with respect to Taiwan. He repudiated the Kyoto Accords on global warming. He spurned Yasir Arafat and stood by Ariel Sharon in Israel. He broke off negotiations with North Korea.
Unipolar realism survived its initial encounters with reality, but not with September 11. By the end of that day, the president had a new approach. Bush Doctrine 2.0 was With Us or Against Us (9/11/01–5/31/02). The new doctrine didn’t represent a repudiation of the first one so much as an elaboration of it to deal with the previously neglected problem of terrorism. It provided the justification for not just pursuing Al Qaeda, but for deposing the Taliban, its host in Afghanistan. If Rice first came up with the “no distinction” idea, it was [Vice President Dick] Cheney who first started calling it the “Bush Doctrine” in public. In a November 2001 speech to the U.S. Chamber of Commerce, Cheney offered this definition: “We will hold those who harbor terrorists, those who provide sanctuary to terrorists, responsible for their acts.”
But by the time Cheney spoke those words, a second wave of terrorism had already exposed the inadequacy of Doctrine 2.0. The anthrax attacks in New York and Washington created a sense of vulnerability in many respects greater than that left by the mass murder at the World Trade Center and Pentagon. Inside the administration, the October bioterror attacks had a larger impact than is generally appreciated. Without the anthrax attacks, Bush probably would not have invaded Iraq.
At that point, nearly everyone involved in national security assumed there would be another wave of terrorist attacks. The daily intelligence summary substantiated this panic; “chatter” was at record levels. In an effort to understand the potential threat, Cheney’s chief of staff I. Lewis “Scooter” Libby ordered up a briefing on a war game, known as “Dark Winter,” which modeled a smallpox outbreak in an American city in much the way “continuity of government” exercises Cheney had participated in during the 1980s simulated nuclear catastrophe.
According to a source close to Bush, Cheney swiftly reported back to the Oval Office with a sobering message: the United States was essentially defenseless against the most likely form of assault, a biological attack. “I sat through the most gruesome briefing in the Oval Office about anthrax, how it could spread, and how we had no defenses,” Bush’s first press secretary, Ari Fleischer, told me in the summer of 2007. “Dick Cheney was the strongest advocate of the possibility of attack and need to prepare for it.”
Then on October 4 the worst fears inside the White House were realized. Bush choked up as he thanked government workers in a morning speech at the State Department. Ari Fleischer reports that he had “never before and never since seen the president look as tired and as troubled as he did that morning.” When they returned to the White House, Bush called Fleischer into his office and explained the reason: he had just learned that a Florida man had been stricken with anthrax. Bush feared it was the dreaded second wave.
Another anthrax letter, never recovered (or at least never disclosed), was apparently sent to the White House. On October 22, anthrax was found on an automated slitter used to open letters at a Secret Service facility in an undisclosed location some miles away. This meant the White House was a target of biological terrorism. “I think the seminal event of the Bush administration was the anthrax attacks,” someone close to the president told me. “It was the thing that changed everything. It was the hard stare into the abyss.”
Cheney and Libby began spending time at the Department of Health and Human Services, which was leading the confused response to the anthrax attacks and making preparations for the possibility of something much worse. The greatest fear of officials there was an attack involving smallpox. The smallpox virus killed an estimated 300 million people in the 20th century. It was still taking 2 million lives a year as late as 1967, when the World Health Organization began the massive campaign that wiped out the disease a decade later. After smallpox was eradicated in 1977, only the United States and Russia were permitted to retain research samples of the virus, under closely monitored, secure conditions. But an intelligence review ordered by Cheney determined that Iraq, North Korea, and Russia were all likely to possess undeclared stocks.
Cheney and Libby believed that Iraq’s potential to produce a smallpox weapon necessitated universal vaccination of the general population, something that hadn’t happened in the United States since 1972. On the other side of the argument was Donald Henderson, the heroic epidemiologist who led the WHO smallpox eradication program and later became Bush 41’s science adviser. After the anthrax attacks, HHS brought Henderson in as a consultant to help develop emergency plans.
When I visited him at his office at the Center for Biosecurity in Baltimore, Henderson recounted a surprise, unpublicized visit he paid to the Centers for Disease Control in Atlanta with Cheney and Libby on July 18, 2002. Henderson flew down with them on Air Force Two and spent most of the trip explaining to the vice president and his chief of staff why he and other epidemiologists thought a massive vaccination program would be a terrible idea. Even medical professionals were horrified when they saw the range of normal reactions to a vaccination: grotesque scabs, lesions, and pustules. Henderson showed me a pamphlet that HHS distributed to hospitals to document the abnormal reactions: blackened limbs, uncontrolled swelling, and a reaction called progressive vaccinia, in which sores cover the body from head to toe.
Worse than the panic these reactions would cause would be the predictable casualties. According to Henderson, adverse reactions to the vaccine were estimated to kill between one and two out of every million people inoculated. The question of legal liability would be a nightmare. Henderson said that Cheney and Libby didn’t seem to disagree with his arguments, which he reviewed with them on the return flight. “I thought, Thank God they’ve finally gotten the message. Finally we’ve been able to get it through to them that this just does not make sense,” Henderson said.
When he reached his home in Baltimore two hours later, Henderson’s wife was waiting with an urgent message to call the office. “They were going to have a press release the next morning announcing that they were going to vaccinate the entire country immediately,” Henderson said. “I couldn’t believe it.” But after girding for battle and taking a 5:00 a.m. train to HHS the next morning, Henderson was relieved to be told that the vaccination plan was off after all. Bush had overruled Cheney. Bush eventually announced a compromise: mandatory vaccination of 500,000 military personnel, and voluntary vaccination for the same number of health-care workers or “first responders.” But by the time the vaccine was ready for use, in early 2004, the panic was over. Saddam didn’t have a smallpox weapon after all. Bush was vaccinated at the White House, but decided that members of his family and the White House staff didn’t need to run the risk. Cheney himself chose not to be vaccinated.
Those who believe the vice president operates in bad faith—that he concocted evidence of Iraqi WMD to justify a war—should consider his stance on universal smallpox vaccination. By most estimates, even a safe vaccine would have killed a few hundred Americans and made thousands seriously ill. Cheney’s readiness to sacrifice hundreds of civilian lives may make him sound like Dr. Strangelove. But if the idea was mad, it was sincerely mad, testifying to how seriously he took the possibility that Saddam had biological weapons and might use them, or give them to terrorists to use, against the United States.
Cheney accepted without reservation that Saddam was a “state sponsor of terrorism.” Libby and Deputy Secretary of Defense Paul Wolfowitz had long been interested in their friend Laurie Mylroie’s unified field theory of terrorism. Mylroie argued that Saddam was behind every major terrorist attack against Americans in the 1990s, including the first attack on the World Trade Center in 1993 and the Oklahoma City bombing in 1995. Mylroie’s book “Study of Revenge: Saddam Hussein’s Unfinished War Against America” was published by the American Enterprise Institute, where she was a fellow. On the back cover are glowing blurbs from Libby, Wolfowitz, and Richard Perle. Cheney followed these men into the tortured pathways of Mylroie’s conspiracy theory, including her seizing on reports that the 9/11 ringleader Mohammed Atta had met with Iraqi intelligence officers in Prague.
In another administration, there would have been various checks on this kind of collective delusion. A Kennedy, a Nixon, a Clinton, and a George H. W. Bush would all have weighed the evidence to some degree. But once Bush’s mind was made up that Saddam was building biological and nuclear weapons, it closed to alternative explanations. He thought picking through evidence was beneath him. In 43’s White House, as his communications director Dan Bartlett put it in an anonymous background briefing, “The President of the United States is not a fact checker.” If the Director of the CIA told him the case for Saddam’s WMD was a “slam dunk,” that was all Bush needed to hear.
The problem with the earlier idea of With Us or Against Us was that it offered no strategy for protecting the United States in an age of biotechnology, miniaturization, nonstate actors, and porous borders. To make the country more secure, we’d have to find a way of cutting off these threats at the root, not just by taking on hosts, but by disabling known and potential WMD proliferators. This was Bush Doctrine 3.0, Preemption (6/1/02–11/5/03). Where Doctrine 2.0 justified the war in Afghanistan, which was harboring Al Qaeda, Doctrine 3.0 would provide a basis for invading Iraq, which might assist Al Qaeda in the future.
The neoconservatives had a different motivation for going to war with Iraq. They were less focused on preventing what Saddam might do to the United States than on what getting rid of him could do for the United States. The neocons thought pulling the plug on his toxic regime would transform the sick political culture of the Arab Middle East.
Many neocons believed that turning secularized Iraq into a third pro-Western democracy in the region would cause other authoritarian regimes to topple. As it liberalized, the Middle East would cease to provide a breeding ground for terrorism. Arabs would also come to accept the presence of Israel, something the mostly Jewish neoconservatives cared about especially. Wolfowitz has often been described as the “architect” of war in Iraq. The war could have used an architect—someone responsible for planning what would happen during the occupation. In reality, he was more like the war’s theologian, coming up with a variety of theorems, arguments, and justifications for his abiding faith that the political nature of the Arab world could be transformed from without.
Wolfowitz and his protégé Scooter Libby, the other most influential neoconservative inside the administration, were driven by a particular notion about how to transform the sick political culture of the Middle East. The big thinker behind their theory was the Middle East scholar Bernard Lewis, a professor emeritus at Princeton. The originator of the phrase “the clash of civilizations,” Lewis believed Muslims had been engaged in a “cosmic struggle for world domination” since the time of Muhammad. Centuries of defeat, subjugation, and misrule, to which the United States contributed by supporting corrupt and incompetent dictators, prepared the way for Islamist terrorism. Cheney met Lewis while serving as Secretary of Defense, and the two became friends. After September 11, Cheney became interested in Lewis’s argument about what had gone wrong in the Arab world.
Over a series of lunches at the vice president’s residence in 2002, Lewis laid out his case for using American military power to change the regime in Iraq. Years of “anxious propitiation” had left the Muslim world convinced of our weakness. Force was what Arabs respected. A conclusive show of strength could catalyze a change in the opposite direction. The neoconservatives have a weakness for historical analogies—and for one analogy in particular. “Anxious propitiation” was a fancy name for appeasement, compromising with an enemy that needed confronting. In this analogy, Saddam was Hitler, who grew in strength as the West postponed challenging him. Or, if not Nazi Germany, Iraq was a Soviet-style totalitarian state, vulnerable to a combination of American moral and military pressure.
By mid-2002, Cheney had become a down-the-line ally of the neoconservatives. But that does not mean he had turned into some sort of democratic idealist. He never cited Bernard Lewis’s theory in any of his public advocacy for the war. For the congenitally pessimistic vice president, transforming the political culture of the Middle East can’t have been more than a castle in the sky, a long-shot best-case scenario. But the vice president surely recognized that the grandiosity of the neocon vision of a new Arab world would resonate with the president. For Bush, boldness had a constant allure. Remaking the Middle East via Iraq was just the kind of game-changing idea he went for.
After the invasion, as the WMD mirage melted away, Bush’s retrospective case for the war shifted, and his theory of foreign policy along with it. Bush Doctrine 4.0 became Democracy in the Middle East (11/6/03–1/19/05). Bush’s November 6, 2003, speech at the National Endowment for Democracy framed a new theory of international relations around the way he now hoped to justify his war. The United States, he announced, “has adopted a new policy,” which he described as “a forward strategy for freedom in the Middle East.” Bush argued that excusing and accommodating tyranny over the previous sixty years hadn’t made Americans safe “because in the long run stability cannot be purchased at the expense of liberty.” Stability was one of Scowcroft’s watchwords. Bush called liberty “the design of nature” and “the direction of history.”
Here finally was the grand vision Bush had been looking for. Democratizing the Arab world was a clear, moral goal, the ambitious work of a consequential presidency. Like compassionate conservatism, it was a form of social evangelism, a mission inspired by faith but secular in application. Bush’s new formulation had the added advantage of extending the term of evaluation. If we were witnessing what Rice called “the birth pangs of a new Middle East,” the first report card wouldn’t be in for some time.
But Bush’s stirring words underscored the difficulty with his ever-changing foreign policy. The problem wasn’t that he wanted to spread democracy and human rights—a goal that in other contexts unites liberal hawks and doves with many conservatives—but his relentless drift into abstraction, incompetent execution, and glaring inconsistency. Had he been someone capable of acknowledging error, Bush’s misjudgment in invading Iraq might have been mitigated by skillful improvisation. How might such a person have reacted? He would have told his Secretary of Defense that the spectacle of looters stripping government buildings down to their concrete skeletons wasn’t the kind of untidy freedom the United States could tolerate. As the Pentagon failed to create viable structures, he might have shifted control to the State Department and devolved power to the United Nations, instead of trying to fend it off.
He could have acknowledged the emergence of an insurgency, and adopted a different strategy to combat it before 2007. He should have blocked, reversed, or at least understood the significance of Paul Bremer’s first two and most disastrous orders, to disband the Iraqi army and bar those with Ba’ath Party connections from serving in the government. (Bush later told author Robert Draper that disbanding the army wasn’t his policy, and that he wasn’t sure why it had happened.) He would have fired Rumsfeld after Abu Ghraib, if not sooner. He would have taken steps to dismantle the echo chamber around him, instead of adding layers of insulation. None of that would have ensured a better outcome, but it surely would have diminished the harm from his original mistake.
Why couldn’t Bush respond in a more supple fashion, even after his reelection? Partly, his inability to adjust reflects his limitations as an executive. Despite his MBA training, Bush emphasizes leadership and decision-making to the exclusion of administration and management. He delegates manfully, but doesn’t solicit feedback, evaluate results, or hold people accountable, except in extraordinary circumstances. Unlike his father, he isn’t comfortable entertaining inconclusive debate. Bush sees reconsidering decisions or openly changing course as evidence of weak leadership. This stubbornness was born of a success that came from not giving in to his parents’ doubts about him and not listening to their advice.
At a temperamental level, the president has almost no ability to accept blame or learn from mistakes. Disagreement, whether from critics or allies, sounds like his mother’s nagging and his father’s disappointment. Thus criticism has the opposite of its intended effect on him. Disapproval hardens Bush’s conviction that he must be right and reinforces his refusal to surrender. Believing he earned his position in life through willpower, he feels he shouldn’t have to ask anyone for permission. This obstinacy has been evident in his personnel practices as well as policy choices. The more the media demanded Bush yield up a head—CIA Director George Tenet, Rumsfeld, Karl Rove, Attorney General Alberto Gonzales—the longer that person was likely to be staying around.
Bush’s inflexibility is rooted in the old family drama. It reflects not just a personality forged in opposition to his father, but an idea of leadership developed in conscious contrast to him. Where George H. W. Bush weighed options, W. sizes you up and decides. Where 41 saw shades of gray, 43 finds moral clarity. “The son prides himself on being the guy who cuts through it all, who is decisive, not wishy-washy,” Brent Scowcroft told me in November 2007. “The subtleties, partly because of his inexperience, don’t seem to matter as much. His father, with the background he has, knows that at best you’re operating forty-nine/fifty-one—and you’d better be sure that the fifty-one is on your side and not the forty-nine.”
Bush makes a point of saying, whenever it comes up, that he doesn’t get advice from his father about the conduct of the war. Judging from his father’s roundabout efforts to influence him, this seems likely to be true. In his book, “Rumsfeld: His Rise, Fall, and Catastrophic Legacy,” Andrew Cockburn reports a visit 43 paid to Kennebunkport during the summer of 2004. His father gave him a memo that Scowcroft had asked him to pass along about Iraq. The president glanced at it before throwing it aside, telling his dad, “I’m sick and tired of getting papers from Brent Scowcroft telling me what to do, and I never want to see another one again.” With that, 43 stalked out of the room and slammed the door behind him.
The collapse of his preemption justification for the war (terrorism + WMD = intolerable threat) sent Bush not into any reexamination of his decision, but toward grander and grander justification. Shortly before the election in 2004, Bush’s friend and former Texas Rangers partner Tom Bernstein gave him the galley proofs of “The Case for Democracy” by the former Soviet refusenik and right-wing Israeli politician Natan Sharansky. Sharansky’s book portrays Bush in a heroic light, comparing the war against terrorism to the struggle against the Nazis and the Soviets. Sharansky draws a contrast to Bush’s father’s “notorious” Chicken Kiev speech in 1991 telling the Ukrainians to avoid “suicidal nationalism,” which he calls “an unmitigated disaster.”
Sharanskyism, the exfoliated version of the Freedom Agenda, became Bush Doctrine 5.0, Freedom Everywhere (1/20/05–11/7/06). Bush unveiled his newest foreign policy in his second inaugural address, which announced the goal of abolishing oppression on planet Earth: “It is the policy of the United States to seek and support the growth of democratic movements and institutions in every nation and culture, with the ultimate goal of ending tyranny in our world.” Democracy is God’s gift to humanity, Bush declared, and the United States would help extend its blessings.
It is hard to believe that anyone other than Bush and his speechwriters, who seemed increasingly to be making his foreign policy, thought about the issue of democracy promotion in such shallow, utopian terms. Though his inaugural address sounded religious, there is no theological basis for democracy as God’s chosen system of government. The Old Testament favors monarchy, the New Testament, a kind of socialism. It was as if Bush now simply identified his democratic crusade with the will of God.
Bush Doctrine 5.0 flopped in practice faster than any of its predecessors. Within a year, no one in the administration other than Rice wanted to talk about the Freedom Agenda. This idea did the impossible: it caused Dick Cheney and the State Department bureaucracy to agree about something, namely that the president’s policy was a pipe dream. The dissonance between Bush’s message and his cavalier attitude toward civil liberties discredited him as a moral messenger. While pressing for divinely ordained liberty in the Middle East, Bush was still taking Dick Cheney’s advice on keeping Guantánamo open, allowing torture, and listening in on phone conversations by American citizens. Thus did Bush’s universal call for democracy not only become an exercise in futility but in many places actually prove counterproductive. From Russia to Venezuela, associating democratic opponents with Bush’s foreign policy became a pretext for taking rights away. In Iran, the Nobel Peace Prize–winning human-rights lawyer Shirin Ebadi complained that Bush’s advocacy was setting her cause back. In the end, the fifth Bush Doctrine receded into what the president called, in a phrase from his second inaugural, the “work of generations.”
Bush’s final foreign policy (11/8/06 to date) was the absence of any functioning doctrine at all. After the Republican loss of both houses of Congress, his administration cobbled together an enfeebled hybrid based on the collapse of the previous five: a retreat from unipolarity, a moratorium on the application of preemption (though bombing Iran remained under discussion), and a tacit consensus to regard the Freedom Agenda as presidential hot air. Bush and his speechwriters have not acknowledged his final doctrine’s demise. He has said that he will make democracy promotion his major post-presidential project, and that he intends to set up a freedom institute as part of the presidential library to be built at Southern Methodist University in Dallas.
The final irony of Bush’s foreign-policy crackup was the way it vindicated his father’s choices. Not “finishing the job” and taking ownership of Iraq in 1991 now looked like an act of wisdom. Not making a triumphal speech when the Berlin Wall came down appeared as shrewd management of a dicey situation, which advanced the practical cause of freedom more than a provocative speech would have. Appreciating the value of stability sounded like maturity. Avoiding needlessly bellicose rhetoric seemed like common sense. As the historian Timothy Naftali writes in his generally admiring 2007 biography of George H. W. Bush, “As the younger Bush’s own presidency limped to an end, many missed the elder Bush’s realism, his diplomacy, his political modesty, and, yes, even his prudence.” The more the son’s faults glared, the more his father’s reputation grew.