
Bush vs. Textbooks

During the final segment of his 1977 interview with Richard Nixon, British TV host David Frost pressed the disgraced 37th president one last time on the issue of his “mistakes.” Nixon’s face appeared twisted and labored as he answered, in part: “I let down my friends. I let down the country. I let down our system of government and the dreams of all those young people that ought to get into government but think that it’s all too corrupt.” The interview, dramatized in Peter Morgan’s 2006 stage play Frost/Nixon and in Ron Howard’s new film adaptation of the same name, shows a man beaten and on the cusp of admitting defeat, if not absolute guilt. Another recent Hollywood film, Oliver Stone’s W., depicts the current president’s answer to a similar question, albeit in a less historically accurate context, when, during a 2004 press conference, Time’s John Dickerson asked George W. Bush what his biggest mistake was following 9/11 and what lessons he had learned from it. Bush couldn’t think of one.

It took the world three years to coax a pseudo-confession from the lips of Tricky Dick, and while it’s unclear what kind of hindsight Bush might be granted in that amount of time, what is apparent is that the level of self-awareness and pathetic self-deprecation portrayed in Frank Langella’s Nixon is absent in Bush and those who have surrounded him during the last eight years. One need look no further than the administration’s Legacy Tour, which sounds more like some geriatric rock act’s nostalgic traveling stage show than an attempt at an overhaul of the president’s political image. The administration has consistently deferred to as-yet-unborn high school textbook writers to determine whether its actions were good or bad, but that hasn’t stopped Bush and his cronies from embarking on a whirlwind publicity tour in an attempt to shape that historical determination.

~

In order to fully grasp the extent to which the administration fails to comprehend—or willfully obscures—its mistakes, it’s necessary to recognize just how early in Bush’s presidency those mistakes began. I remember being glued to the television in a friend’s dorm room on election night in 2000. It was the first time I had participated in our democracy, and a small group of us stayed up into the wee hours of the morning as, one by one, the networks—led by Fox News, whose Election Analysis Division’s John Ellis called the statistically too-close-to-call Florida, and thus the election, for his cousin George—declared that our new president would not be Al Gore after all. It would be weeks before all the recounts were completed (or not completed, as was the case) and the Supreme Court handed the presidency to the man who, even sans a proper tally, lost the popular vote by over half a million votes.

The electoral college was designed over two hundred years ago by founding fathers who believed the office should seek the man and not the other way around—men who still feared British political influence and who aimed to protect the Union from the encroaching powers of the biggest of its then-13 states. It was conceived at a time when not everyone could see a candidate up close and personal or quickly gain access to copious amounts of information about the men running for public office at the click of a button. Times have changed, though, and the failure of that system eight years ago had consequences far greater than even the biggest cynic could have imagined.

Legitimate or not, Bush’s election was the first profoundly and thoroughly squandered opportunity of his administration. Any other presidential candidate might have been humbled or even embarrassed by the lengths and depths to which he or she had to fight for the office; a more lucid politician might have recognized that a nation divided was not one on which a partisan agenda should be thrust. He or she might have made concessions to the left and reached out in compromise. Instead, Bush defined bipartisanship as the willingness of the opposition to support legislation that bolstered his neoconservative policies.

Bush’s biggest missed opportunity, however, came just a few short months later. After the administration ignored warnings that Islamic extremists were intent on using commercial airliners to attack the United States within its own borders, and after the extremists did just that, newspapers across the globe declared, “We Are All Americans!” Out of great tragedy came great opportunity, and for a moment in time, even Democrats rallied around the president. But Bush abused the goodwill he was given and, with the aid of Dick Cheney, Paul Wolfowitz, and the other chief architects of the Iraq invasion, he exploited the events of 9/11 in order to execute a plan that had been in the works for years: removing Saddam Hussein in the quest of creating a larger footprint in the region. The opportunity that the administration saw in the tragedy of the terrorist attacks was not unification or peace but the acquisition of power via the steady and deliberate dismantling of the country’s very founding principles.

Out of great tragedy also comes great responsibility. Bush’s cabinet appointments alone, from Alberto Gonzales (who presided over the most corrupt, ineffective, politicized, and discriminatory Department of Justice in U.S. history) all the way down to the Occupational Safety and Health Administration’s Edwin G. Foulke Jr. (who is, according to R. Jeffrey Smith at The Washington Post, a lawyer and former Bush fundraiser who used to defend companies cited by OSHA for safety and health violations), would tarnish even the most noble of American presidents’ legacies, to say nothing of the appointments he attempted, but failed, to make. But it was Michael Brown, who was appointed as director of FEMA despite having little to no experience, who shouldered much of the blame for the administration’s biggest domestic blunder: the federal response to Hurricane Katrina in 2005. A scapegoat for the administration’s failures, Brown would later claim that he warned Bush of the imminent dangers of a levee breach but that those warnings were dismissed and that the decision about whether or not to federalize the region was viewed as a political opportunity by those close to the president.

~

This history, of course, has been so well documented and accepted by the American people, finally, that repeating it here serves merely as context for what is, perhaps, the Bush administration’s most audacious enterprise to date: the rewriting of that history as orchestrated by Karl Rove via a series of “exit interviews.” “I think I was unprepared for war,” Bush said when asked last month by ABC News’s Charlie Gibson what he was most unprepared for during his tenure in the White House. It’s a stunning, Nixon-sized admission coming from the man who once proclaimed, “I am a war president. I make decisions here in the Oval Office in foreign policy matters with war on my mind.” When asked if he would have gone to war with Iraq had the intelligence shown that Saddam Hussein did not possess weapons of mass destruction, he said he was unsure.

Rove himself launched his Bush Legacy Project by telling a New York audience that the U.S. would not have invaded Iraq had it known there were no WMDs. But he, like Condoleezza Rice, still stubbornly defends the decision to enter into the elective war, even if the reasons continue to be as disparate as the religious, political, and ethnic factions that comprise Iraq’s population. Rice thinks it was good for America: “[Hussein] was an implacable enemy of the United States,” she reasoned in a recent interview with Tavis Smiley. What’s good for America, then, is evidently good for the world, right? In 2005, at the height of the violence in Iraq, Pentagon advisor Richard Perle told The Pittsburgh Tribune-Review that the Bush Doctrine of preemptive war was intended to promote democracy throughout the world: “This doesn’t mean imposing democracy by force. We can’t do that, and we know we can’t do that. But sometimes the obstacles to democracy can only be removed by force.” To quote Michael Knight from his piece “Empire America – Spreading Freedom, Democracy, Terrorism”: “Darling, I would never rape you. I am just tearing your clothes off so we can make love.”

This myopic view of the world is manifest in everyone in and surrounding the administration—no surprise considering that its namesake is seemingly incapable of looking inward or backward. Dick Cheney is perhaps the only one not involved in some daft attempt at political revisionism, proudly telling ABC’s Jonathan Karl in early December that he did indeed authorize the use of torture, though he refused to use the word, and generously expressing astonishment, on behalf of all of us who witnessed the attacks of 9/11, that there hasn’t been another one since. The implication is, naturally, that the administration is due credit for subsequently preventing an attack like the one it failed to prevent in 2001.

“There can be no debate about the results in keeping America safe,” Bush told the U.S. Army War College, ostensibly the only audience he could find that would be unlikely to call him out on his rhetorical challenge. “We’ll never know how many lives have been saved,” he continued, citing failed attempts to bomb fuel tanks at JFK Airport, a plot to blow up international jets, and a plan to attack a Chicago-area shopping mall—effectively giving himself a hypothetical pat on the back for the hypothetical prevention of attacks that were essentially hypothetical (that is, merely aspirational and not operational). It’s like Osama Bin Laden expressing a desire to bomb Smurf Village, realizing he’s not an animated cartoon character, and then Papa Smurf taking credit for preventing the attack.

For an even flimsier logic than Bush’s, look no further than a recent piece by Peggy Noonan (the title of which, “At Least Bush Kept Us Safe,” speaks volumes in and of itself): “It is unknown, and perhaps can’t be known, whether [the lack of another domestic terrorist attack] was fully due to the government’s efforts, or the luck of the draw, or a combination of luck and effort. And it not only can’t be fully known by the public, it can hardly be fully known by the players at all levels of government. They can’t know, for instance, of a potential terrorist cell that didn’t come together because of their efforts.” (The Wall Street Journal apparently now pays writers to talk in circles.)

Three weeks ago, White House Press Secretary Dana Perino released a statement in response to a New York Times article which placed the blame for the financial meltdown of 2008 squarely on Bush’s soldiers: “The Times’ ‘reporting’ in this story amounted to finding selected quotes to support a story the reporters fully intended to write from the onset, while disregarding anything that didn’t fit their point of view,” she said. Ignoring for a moment both the veracity of the Times piece and the thanklessness of Perino’s job, one can’t help but notice the blatant hypocrisy of which the White House statement smacks. It’s reminiscent of Bush’s own countless missives, like his second inaugural speech, which was littered with hypocrisies about the “ideologies that feed hatred,” the “pretensions of tyrants,” and the “force of human freedom,” historical inaccuracies about the founding of the republic, and propaganda that summoned all of the most ignoble parts of our nation’s history. He was the tyrant of which he spoke.

And, at least starting in 2004, he became a demagogue, obtaining power by appealing to the fears of the people and then claiming it was absolute, first by dubbing himself “the decider” and then by laying claim to a “mandate” after winning reelection. “Now it is the urgent requirement of our nation’s security, and the calling of our time,” Bush said during that second inaugural, apparently unaware that his oath of office requires him to “preserve, protect and defend the Constitution of the United States,” not the American people—the pretense under which the administration has waged its wars on sovereign nations and its own citizens’ civil liberties.

Addressing an audience at a Holocaust Museum last month without, miraculously, strapping himself to a board and pouring water down his own throat afterward, Attorney General and latest Bush lapdog Michael Mukasey said: “[L]aw without conscience is no guarantee of freedom; that even the seemingly most advanced of nations can be led down the path of evil.” Agents of the outgoing administration—both major and minor, direct and tangential—appear utterly oblivious to the self-damning hypocrisies that are falling from their mouths in their attempts at salvaging their legacy. In a recent DOJ court filing in which the U.S. is seeking a 147-year prison sentence for the son of former Liberian president Charles Taylor for torturing people in his own country, Assistant U.S. Attorney Caroline Heck Miller wrote that torture “undermines respect for and trust in authority, government and a rule of law,” exposing the tragic comedy behind a U.S. court prosecuting torture in other countries while the administration continues to retroactively redefine the word to mask its own crimes. It is the very definition of hubris, the product of a nation whose government has unequivocally become morally, ethically, and intellectually bankrupt on every level and in every branch. There isn’t a textbook big enough to record—nor a cynical political advisor savvy enough to conceal—a legacy as damning as that.

This blog entry was originally published on Slant Magazine on the date above.