
Rethinking American History
in a Post-9/11 World
from History News Network -- September 6, 2004
In 1948, Roy F. Nichols, a distinguished scholar of the Civil War era, published
a short essay about the Second World War’s likely impact on American
historians. Nichols predicted a sweeping “reorientation of historical
thinking.” “Any great disturbance in the world of action or
intellect,” he wrote, “produces very noticeable effects upon
the methods and controlling thought patterns of historians. It is probable
that the recent war will prove no exception.”
We have recently lived through our own “great disturbance.”
September 11 was not -- at least, not yet -- as transformative an event
as World War II. Yet it undoubtedly will lead historians to rethink how
we study and teach the American past. This, indeed, is as it should be.
All history, the saying goes, is contemporary history. The past forty
years have demonstrated how people instinctively turn to the past to help
understand the present and how events draw our attention to previously
neglected historical subjects. The “second wave” of feminism
gave birth to a flourishing subfield of women’s history. The Reagan
Revolution inspired a cottage industry in the history of American conservatism.
These and other such developments have enriched our understanding of American
history and expanded the cast of characters who occupy the historical
stage.
The owl of Minerva takes flight at dusk -- historians, that is, prefer
to wait until events have concluded before subjecting them to historical
analysis. Nichols’s essay itself demonstrates that it is difficult
while caught up in momentous events to predict how they will shape historical
understanding. He anticipated that the uncertainties and anxieties produced
by World War II, compounded by the nuclear sword of Damocles it left suspended
over the collective head of mankind, would lead American historians to
abandon their traditional “optimism” in favor of a stance
of “wary disillusionment.” Quite the opposite, in fact, transpired.
As the Cold War came to dominate the country’s thought and culture,
leading historians were drawn to an account of our past that celebrated
American “exceptionalism” and downplayed instances of inequality
and social conflict in the nation’s history.
Historians are still uncertain how September 11 will affect their craft.
The clearest blueprint for new directions in historical education, indeed,
has come from outside the academy, in a spate of statements by conservative
commentators. In a speech less than a month after the tragedy, Lynne Cheney,
wife of the vice president and former head of the National Endowment for
the Humanities, insisted that calls for more intensive study of the rest
of the world amounted to blaming America’s “failure to understand
Islam” for the attack on the World Trade Center and Pentagon. A
letter distributed by the American Council of Trustees and Alumni, which
she once chaired, chastised professors who fail to teach the “truth”
that civilization itself “is best exemplified in the West and indeed
in America.”
Then, Dinesh D’Souza weighed in with What’s So Great About
America (note that there is no question mark in the title), a book that
sought to rally the American people by contending that principles like
freedom and religious toleration are uniquely “Western” beliefs.
For D’Souza, the only reason to study other parts of the world is
to point out our superiority to them. The publisher’s ad for his
book identified those who hold alternative views as “people who
provide a rationale for terrorism.” William Bennett, in his book
Why We Fight, claimed that scholars with whom he disagrees “sow
widespread and debilitating confusion” and “weaken the country’s
resolve.”
Like all momentous events, September 11 is a remarkable teaching opportunity.
But only if we use it to open rather than to close debate. Critical intellectual
analysis is our responsibility -- to ourselves and to our students. Explanation
is not a justification for murder, criticism is not equivalent to treason,
and offering a historical analysis of evil is not the same thing as consorting
with evil.
The philosopher Friedrich Nietzsche identified three approaches to history
– the monumental, antiquarian, and critical. Recent calls to narrow
the range of acceptable discussion to what Nietzsche called monumental,
or celebratory, history themselves have a long lineage. In every country,
versions of the past provide the raw material for nationalist and patriotic
sentiments. In this country, such calls have mounted at times of nation-building
(such as the first half of the nineteenth century), perceived national
fragmentation (such as the 1890s and 1990s, both decades of widespread
concern over mass immigration and cultural disunity), and during wars.
In World War I, distinguished scholars produced pamphlets to government
specifications explaining, for example, the “common principles”
shared by Jean-Jacques Rousseau, Oliver Cromwell, and Thomas Jefferson,
to illustrate the historical basis of the Franco-British-American alliance.
During the Cold War leading historians celebrated the solution of major
social problems, the “end of ideology” and the triumph of
a liberal “consensus” in which all Americans, except malcontents
and fanatics, shared the same mainstream values.
Walter Lippmann once wrote that the function of good journalism is to
ensure that people are not surprised. The same can be said of good history.
The past historians portray must be one out of which the present can plausibly
have grown. The problem with the consensus history of the 1950s, for example,
was not simply that it was incomplete but that it left students utterly
unprepared to understand American reality. The civil rights revolution,
divisions over Vietnam, Watergate – these seemed to spring from
nowhere, without discernible roots in the American past. The self-absorbed,
super-celebratory history promoted in the aftermath of September 11 --
a history lacking in nuance and complexity -- will not enable students
to make sense of our increasingly interconnected world. We need a historical
framework that eschews pronouncements about our own superiority and prompts
greater self-consciousness among Americans and greater knowledge of those
arrayed against us.
Of course, there is nothing inherently wrong with young people taking
pride in their nation’s accomplishments. Lippmann’s point,
however, is that the role of the journalist or the historian is neither
to celebrate nor to condemn but to explain. September 11 rudely placed
certain issues on the historical agenda. Let me consider briefly three
of them and their implications for how we think about the American past:
the invocation of freedom as an all-purpose explanation for the attacks
and a justification for the ensuing war on terrorism and invasion of Iraq;
widespread acquiescence in significant infringements on civil liberties;
and a sudden awareness of considerable distrust abroad of American actions
and motives. The first step in thinking about these “surprises”
is to historicize them -- to understand that they all have histories.
No idea is more quintessentially American than freedom. And throughout
our history, in moments of crisis, the question of freedom -- what it
is, why it is worth defending, who should enjoy it -- seems to come to
the fore. Many commentators, nonetheless, were surprised by how quickly,
in the aftermath of September 11, freedom became an all-purpose explanation
for both the attack and the ensuing war against “terrorism.”
“Freedom itself is under attack,” President Bush announced
in his speech to Congress of September 21, and he gave the title Enduring
Freedom to the war in Afghanistan. Our antagonists, he went on, “hate
our freedoms, our freedom of religion, our freedom of speech, our freedom
to assemble and disagree with each other.” A year later, in calling
for increased attention to the teaching of American history so that schoolchildren
can understand “why we fight,” Bush observed, “ours
is a history of freedom, ... freedom for everybody.”
The 2002 National Security Strategy, the document that announced the
doctrine of preemptive war, opens not with a discussion of global politics
but with an invocation of freedom, defined as political democracy, freedom
of expression, religious toleration, and free enterprise. These,
the document proclaims, “are right and true for every person, in
every society.” There is no sense that this constellation of values
is the product of a particular moment and a specific historical experience,
or that other people might have given thought to the question of freedom
and arrived at somewhat different definitions. Naturally, the invasion
of Iraq was given the name Operation Iraqi Freedom. And in April 2004, in
explaining the continuing resistance to the occupation, the president
declared: “We love freedom and they hate freedom – that’s
where the clash occurs.” Freedom, he added, was not simply an American
idea; “it is God’s gift to the world.”
There is nothing unusual in the invocation of freedom as an American
rallying cry, or in the idea that American policymakers are implementing
God’s will. The Revolution gave birth to a definition of American
nationhood and national mission that persists to this day, in which the
new nation defined itself as a unique embodiment of liberty in a world
overrun with oppression. The Civil War and emancipation reinforced the
identification of the United States with the progress of freedom. In the
twentieth century, the discourse of a world sharply divided into opposing
camps, one representing freedom and the other its antithesis, was reinvigorated
in the worldwide struggles against Nazism and communism. The sense of
American uniqueness, of the United States as an example to the rest of
the world of the superiority of free institutions, remains very much alive
as a central element of our political culture.
As I suggested in The Story of American Freedom, a book published in
1998, groups from the abolitionists to modern-day conservatives have realized
that to "capture" a word like freedom is to acquire a formidable
position of strength in political conflicts. Freedom is the trump card
of political discourse, invoked as often to silence debate as to invigorate
it. The very ubiquity today of the language of freedom suggests that we
need to equip students to understand the many meanings freedom has had
and the many uses to which it has been put over the course of our history.
We need to teach how freedom has been, in the words of the political theorist
Nikolas Rose, both a “formula of power” (as it is today) and
a “formula of resistance.”
The dominant meanings of freedom for the past generation have tended
to center on political democracy, free markets, low taxes, limited government,
and individual self-determination in private matters ranging from dress
and leisure activities to sexual orientation. These definitions are promoted
as both quintessentially American and universally applicable. Yet the
meaning of freedom and the definition of who is entitled to enjoy it have
changed many times in our past. Rather than a single fixed category inherited
from the founding fathers, freedom has always been an evolving, multifaceted,
and contested idea. Calling our past a history of freedom for everybody
makes it impossible to discuss seriously the numerous instances when groups
of Americans have been denied freedom, or the ways in which some Americans
today enjoy a great deal more freedom than others. It makes it impossible
to appreciate how battles at freedom’s boundaries – the efforts
of racial minorities, women, and other groups to secure freedom as they
understood it -- have both deepened and transformed the meaning of freedom.
The modern idea that freedom is equally an entitlement of all Americans
regardless of race, for example, owes as much to slaves and abolitionists
who insisted that liberty is a truly human ideal as to the founders,
who spoke of freedom as a universal entitlement but established a slaveholding
republic. The modern extension of freedom into private life was pioneered
by generations of feminists who insisted that the idea is applicable to
the most intimate personal relationships.
Today, if one asks the man or woman in the street to define freedom,
they will soon mention the liberties enshrined in the Bill of Rights --
freedom of expression, of the press, etc. Yet all patriotic upsurges run
the risk of degenerating into a coercive drawing of boundaries between
“loyal” Americans and those stigmatized as aliens and traitors.
Like other wars, the “war on terrorism” has raised troubling
questions concerning civil liberties in wartime, the rights of noncitizens,
and the ethnic boundaries of American freedom. It is not difficult to
list the numerous and disturbing infringements on civil liberties that
followed in the wake of September 11. Legal protections such as habeas
corpus, trial by impartial jury, the right to legal representation, and
equality before the law regardless of race or national origin were curtailed.
At least 5,000 foreigners with Middle Eastern connections were quickly
rounded up and more than 1,500 arrested and held for long periods of time
without charge or even public acknowledgment of their fate. To date,
not a single one has been charged with involvement in the events of 9/11.
(Zacarias Moussaoui, the so-called twentieth hijacker, was already in
custody on that day.) An executive order authorized the holding of secret
military tribunals for noncitizens deemed to have assisted terrorism,
and the Justice Department has argued in court that even American citizens
could be held indefinitely and not allowed to see a lawyer, once the government
designates them “enemy combatants.”
One “surprise” of the post-September 11 period has been how
willing the majority of Americans are to accept restraints on time-honored
liberties, especially when they seem to apply primarily to a single ethnically identified
segment of our population. Like other results of September 11, this surprise
needs to be understood in its historical context. That history suggests
that strong protections for civil liberties are not a constant feature
of our “civilization” but a recent and still fragile historical
achievement. Our civil liberties are neither self-enforcing nor self-correcting.
Especially in times of crisis, the price of freedom is eternal vigilance.
America, of course, has a long tradition of vigorous political debate
and dissent, an essential part of our democratic tradition. Less familiar
is the fact that until well into the twentieth century, the social and
legal defenses of free expression were extremely fragile. A broad rhetorical
commitment to this ideal coexisted with stringent restrictions on speech
deemed radical or obscene. Labor activists, socialists, advocates of birth
control, campaigners for racial equality and others faced numerous legal
and extra-legal obstacles to their ability to publicize their views, hold
meetings, picket, and distribute literature. Not until the late 1930s
did civil liberties assume a central place in liberal definitions of freedom.
Not until the 1960s did the modern jurisprudence of civil liberties become
fixed in the law. Equality before the law regardless of race is a very
new principle in American life. For most of our history, Asians were prohibited
from becoming naturalized citizens, and blacks were denied many of the
basic rights of other Americans. Only in the last few years did racial
and ethnic profiling by public authorities come to be seen as illegitimate
– a position apparently reversed in the aftermath of September 11.
Civil liberties have been severely abridged during previous moments of
crisis, from the Alien and Sedition Acts in 1798 to the jailing and deportation
of socialists, labor leaders, and critics of American involvement during
and immediately after World War I, to the internment of tens of thousands
of Japanese-Americans, most of them American citizens, during World War
II, and McCarthyism during the Cold War. Historians generally view these
past episodes as shameful anomalies. But we are now living through another
such experience, and there is a remarkable absence of public outcry.
Although the Supreme Court recently moved to curtail the government’s
power to arrest individuals without charge and throw away the key, history
does not suggest that the Supreme Court is likely to offer a vigorous
defense of civil liberties against governmental infringement so long as
a war exists. In the famous Milligan case, arising out of the use of military
tribunals to try civilians during the Civil War, the Court issued the
stirring comment that the Constitution is not suspended in wartime: “it
is a law for rulers and people, equally in time of war and peace.”
But this decision was issued in 1866, after the crisis had passed, just
as the Court upheld restrictions on free speech during World War I, only
to begin to defend freedom of expression during the 1920s. In once obscure
decisions now deserving of classroom attention -- Fong Yue Ting (1893),
the Insular Cases of the early twentieth century, Korematsu during World
War II -- the Court allowed the government a virtual carte blanche in
dealing with aliens and in suspending the rights of specific groups of
citizens on grounds of military necessity. We should not forget the ringing
dissents in these cases. In Fong Yue Ting, which authorized the deportation
of Chinese immigrants without due process, Justice Brewer warned that
the power was now directed against a people many Americans found “obnoxious,”
but “who shall say it will not be exercised tomorrow against other
classes and other people?” In Korematsu, which upheld Japanese-American
internment, Justice Robert Jackson wrote that the decision “lies
about like a loaded weapon ready for the hand of any authority that can
bring forward a plausible claim to an urgent need.”
This history does not offer simple lessons or a single easy answer to
current concerns about the proper balance between liberty and security.
But it does suggest that like other aspects of freedom, the right to criticize
the government, equality before the law, and legal protections against
the unfettered exercise of police powers by the state are not part of
a straight-line trajectory of continual progress with a few temporary
interruptions that are soon self-corrected. They are the inheritance of
a long history of struggles in which victories often prove temporary and
retrogression often follows progress. As the abolitionist Thomas Wentworth
Higginson remarked at the end of the Civil War, “revolutions may
go backwards.” Recent infringements on civil liberties do not compare
with the massive suppression of dissent during World War I or the internment
of Japanese-Americans. But recent events do mark a significant shift in
public policy after several decades of expanding liberty.
September 11 will also undoubtedly lead historians to examine more closely
the history of the country’s relationship with the larger world.
We are constantly being reminded that the world we inhabit is becoming
smaller and more integrated, and that formerly autonomous nations are bound
ever more tightly by a complex web of economic and cultural connections.
The popular shorthand term for these processes is globalization.
Our heightened awareness of globalization – however the term is
delimited and defined – should challenge historians to become more
cognizant of how our past, like our present, is embedded in a history
larger than our own. The institutions, processes, and values that have
shaped American history -- from capitalism to political democracy, slavery,
and consumer culture -- arose out of global processes and can only be
understood in an international context. This, of course, is hardly a new
insight. Back in the 1930s, Herbert E. Bolton warned that by treating
the American past in isolation, historians were helping to raise up a
“nation of chauvinists” – a danger worth remembering
when considering the drumbeat of calls for a self-absorbed patriotic history.
A year and a half before September 11, in my presidential address to
the American Historical Association, I called on scholars to deprovincialize
the study of American history. Internationalizing our history does not
mean abandoning or homogenizing the particular experience of the United
States. International dynamics operate in different ways in different
countries. In internationalizing American history we must also be careful
not to reproduce traditional American exceptionalism on a global scale
-- such as in the statements quoted above equating civilization with “the
West” and “the West” with the United States. This is
a special temptation in the wake of September 11, which has produced a
spate of historical commentary influenced by Samuel P. Huntington’s
mid-1990s book, The Clash of Civilizations. It is all too easy to explain
September 11 as a confrontation between Western and Islamic civilizations
(a position oddly reminiscent of that of Osama bin Laden).
But the notion of a “clash of civilizations” is monolithic,
static, and essentialist. It reduces politics and culture to a single
characteristic -- race, religion, or geography -- that remains forever
unchanged, divorced from historical development. It denies the global
exchange of ideas and the interpenetration of cultures that has been a
feature of the modern world for centuries. It also makes it impossible
to discuss divisions within these purported civilizations. The construct
of “Islam,” for example, lumps over one billion people into
a single “civilization,” and makes it difficult to explain
why Iran and Iraq went to war. The idea that the West has exclusive access
to reason, liberty, and tolerance ignores both the relative recency of
the triumph of such values within the West and the debates over Creationism,
abortion rights, and other issues that suggest that commitment to such
values is hardly unanimous. Many self-proclaimed defenders of the superiority
of Western civilization fail to notice that the Western tradition of their
imagination is highly selective -- it includes the Enlightenment but not
the Inquisition, liberalism but not the Holocaust, Charles Darwin but
not the Salem witch trials. The difference between positing civilizations
with unchanging essences and analyzing change within and interaction between
various societies is the difference between thinking mythically and thinking
historically.
It certainly seems to be true that the various ideas of freedom with
which we are familiar have not sunk deep roots in Islamic societies. But
like everything else, terror itself has a history. To explain terrorism
as the inevitable outcome of the innate pathologies of Islamic civilization
ignores the fact that many societies, including our own, have spawned
terrorists. The Ku Klux Klan during Reconstruction murdered more innocent
Americans than Osama bin Laden. In the first two decades of the twentieth
century, Americans experienced a wave of terrorist attacks and bombings
-- the assassination of President McKinley by an anarchist in 1901, the
1910 explosion at the Los Angeles Times that killed twenty persons, the
Wall Street bombing of 1920 that took thirty-eight lives. The Oklahoma
City bombing of 1995 and the post-9/11 circulation of anthrax through
the mails were both initially attributed to foreign terrorists, yet both
appear to have been home grown. The point is not to deny the unprecedented
scale of the September 11 attacks or to denigrate the achievements of
American and Western societies, but to underscore that terrorism springs
from specific historical causes and can emerge in many times and places.
Its roots require historical analysis.
Ironically, September 11 highlighted not only our vulnerability but our
overwhelming power. Never, perhaps, since the days of the Roman empire
has one state so totally eclipsed the others. In every index of power
-- military, economic, cultural, scientific -- the United States far exceeds
any other country. It accounts for just under one-third of the world’s
gross domestic product, 36 percent of all military spending (more than
the next several powers combined), and 40 percent of world spending on
scientific research. It is not surprising in such circumstances that many
Americans feel that the country can establish rules of international conduct
for others, while operating as it sees fit. Since September 11, the word
“empire” has come back into unembarrassed use in American
political discourse. The need to shoulder the burdens of empire is a common
theme in discussion among the foreign policy elite, and in a number of
popular books. Even “imperialism,” once a term of opprobrium,
is now in common use.
Like other responses to September 11, the idea of the United States as
an empire has a long history, one linked to the belief that the country
– by example, force, or a combination of the two – can and
should remake the world in its own image. Jefferson spoke of the United
States as an “empire of liberty.” When the nation stepped
onto the world stage as an imperial power in the Spanish-American War,
President McKinley insisted that ours was a “benevolent imperialism,”
and that our governance of the Philippines ought not to be compared to
the territorial conquests of European powers. Woodrow Wilson insisted
that only the United States possessed the combination of military power
and moral righteousness to make the world safe for democracy. In 1941,
Henry Luce, the publisher of Time and Life magazines, called for the United
States to assume the role of “dominant power in the world”
in what he famously called “The American Century.”
The history of the idea and practice of empire might help Americans understand
why other countries sometimes resent our tendency to pursue our own interests
as a world power while proclaiming that we embody universal values and
goals. A recent Gallup poll revealed that few Americans have any knowledge
of other countries’ grievances against the United States. But the
benevolence of benevolent imperialism lies in the eye of the beholder.
Indians and Mexicans did not desire to surrender their lands to the onward
march of Jefferson’s empire of liberty. Many Filipinos did not share
President McKinley’s judgment that they would be better off under
American rule than as an independent nation. A study of the history of
our relationship with the rest of the world might enable us to find it
less surprising that despite the wave of sympathy for the United States
that followed September 11, there is widespread fear outside our borders,
including among longtime allies in Europe, that the war on terrorism is
motivated in part by the desire to impose a Pax Americana in a grossly
unequal world.
Local situations and complex motives throughout the world cannot be subsumed
into a single either/or dichotomy of friends and enemies of freedom or
terrorists and their opponents. At a time when half the college history
departments in the country lack a faculty member capable of teaching the
history of the Middle East, it is worth remembering that anti-Americanism
in that part of the world is a recent phenomenon, not primordial hatred,
and that it is not confined to Islamic fundamentalists but can be found
among secular nationalists and democratic reformers. It is based primarily
on American policies -- toward Israel, the Palestinians, oil supplies,
the region’s corrupt and authoritarian regimes, and, most recently,
Iraq. It is not simply American freedom, but American power and its uses,
that arouses international suspicion.
At the height of the Cold War, in his brilliant and sardonic survey of
American political thought, The Liberal Tradition in America, Louis Hartz
observed that despite its deepened worldwide involvement, the United States
was becoming more isolated intellectually from other cultures. A few years
ago, another prominent historian, Daniel Rodgers, contrasted the Progressive
era, when American reformers scoured Europe for examples of social policy
that could be adopted in the United States, with the 1990s, when Americans
seemed to be convinced that they had nothing to learn from the rest of
the world. September 11 has produced an odd combination of cosmopolitanism
and myopia – a recognition that we exist as part of a wider world,
and demands that we once again emphasize what sets us apart from the rest
of mankind.
When Alexis de Tocqueville visited the United States in the 1830s, he
was struck by Americans’ conviction that “they are the only
religious, enlightened, and free people,” and “form a species
apart from the rest of the human race.” Yet American independence
was proclaimed by men anxious to demonstrate “a decent respect to
the opinions of mankind.” It is not the role of historians to instruct
our fellow citizens on how they should think about our turbulent world.
But it is our task to insist that the study of history should transcend
boundaries rather than reinforce or reproduce them. In the wake of
September 11, it is all the more imperative that the history we teach
must be a candid appraisal of our own society’s strengths and weaknesses,
not simply an exercise in self-celebration – a conversation with
the entire world, not a complacent dialogue with ourselves. If September
11 makes us think historically -- not mythically -- about our nation and
its role in the world, then perhaps some good will have come out of that
tragic event.