“The Shame Is Not Ours”: Roots, Episode 1


Erica Armstrong Dunbar is the Blue and Gold Professor of Black Studies and History at the University of Delaware, and she directs the program in African American history at the Library Company of Philadelphia. She is also an OAH Distinguished Lecturer. Her forthcoming book, Never Caught: Ona Judge, the Washingtons’ Runaway Slave, will be published early next year.

Hollywood is onto something.

Historians, including myself, don’t usually make this kind of comment. Typically we tear Hollywood apart, calling out the historical inaccuracies of television shows and railing against overdramatized films. But tonight, America was reintroduced to a television phenomenon that restored a bit of my faith in the film industry. On May 30, a new version of the epic television series Roots began airing on A&E’s History channel, and those of us who study the institution of slavery in America, well, we were captivated.

At the recent conference “The Future of the African American Past,” the renowned historian and legal scholar Annette Gordon-Reed prompted everyone in the room to think about the influence of recent events on the way we write history. The killings in places like Sanford, Florida; Ferguson, Missouri; and Charleston, South Carolina, have had a profound impact on historians, writers, and, so it seems, directors and producers. Older narratives about African American history are no longer acceptable to a younger generation of viewers and readers. Fortunately for them, we have more than forty years of groundbreaking scholarship that has changed everything. A new era of film and television now presents shows like Underground, Roots, and soon The Birth of a Nation. The terms have been reset for how we interpret and represent the lives of the enslaved.


This Week on Process: Roots

As you may know, A&E’s History channel has produced a new miniseries based on Alex Haley’s classic novel, Roots: The Saga of an American Family, originally published in 1976 and made into an award-winning miniseries that aired in 1977.

We are pleased that Erica Armstrong Dunbar has agreed to watch and comment on the series for Process. Her posts—one for each of the four episodes—will appear following each East Coast broadcast, beginning on Memorial Day, May 30.

Dunbar and colleagues Kellie Carter Jackson, Daina Berry, and Jessica Millward will also participate in an AMA (“Ask Me Anything”) at Reddit’s AskHistorians forum on Friday, June 3, where they will field questions about the history of American slavery, the slave trade, and the representations of slavery.

Watch this space!


Sands of War: Patton’s Desert Training Center on Film


Matt C. Bischoff is currently a state historian with California State Parks and has worked in a variety of settings throughout the West, including with several consulting firms, on a U.S. Air Force base, as an expert witness, and now as a production advisor and on-screen “talent.”


Once it became clear that Americans would be fighting in the deserts of North Africa during World War II, a suitable place to train them was essential. The celebrated armor commander General George S. Patton Jr. was called upon to identify and develop such a place. He found it in the Colorado and Mojave Deserts of southern California. Largely uninhabited, with extremely rugged terrain and a punishing climate, this was the perfect place to “harden” American troops for what they would face when confronting the more experienced Axis troops. The Desert Training Center (DTC) opened in spring 1942 and expanded far beyond its original borders and scope before it closed in 1944. It covered approximately 18,000 square miles, including much of the southern California deserts, the southern tip of Nevada, and a significant portion of western Arizona. Even after the successful end of the North African campaign in May 1943, training in the DTC continued to mimic wartime conditions, with none of the comforts offered by established military bases. Soldiers lived in temporary tent camps and were subject to strict rationing of water and other supplies. Large-unit training maneuvers covered vast swaths of the desert.

Today, most of the land encompassing the training grounds is overseen by the U.S. Bureau of Land Management (BLM). Surprisingly, given the facility’s temporary and ephemeral nature, many remains of the massive installation survive. Camp alignments remain clear in many places; unit symbols spelled out with rocks are still visible; airstrips can be spotted; and mock battlefields mark the landscape in key strategic locations.

Over the past ten years, the push for more “green” energy has greatly accelerated the rush to use the desert for large-scale solar power generation. These solar power projects have had enormous impacts on natural and cultural resources, including the remains of Patton’s training center. In this case, a mitigation agreement worked out between regulators and solar energy companies required that a film be produced to document this little-known historical site. Full Frame Productions of San Francisco, a firm whose repertoire includes a broad array of educational documentaries along with other production experience, won the contract, and the short film it created, entitled Sands of War, will air on PBS affiliates across the country during Memorial Day weekend. It is also available for viewing online.


Podcast: History and Presence


In the May 2016 installment of the Journal of American History podcast, executive editor Ed Linenthal speaks with Robert Orsi, the first holder of the Grace Craddock Nagle Chair in Catholic Studies at Northwestern University. In this episode they discuss Professor Orsi’s new book, History and Presence. (40 minutes)

Subscribe to the podcast through iTunes.


Money Makes History: Harriet Tubman and the Face Value of U.S. Currency


Josh Lauer is an associate professor of media studies at the University of New Hampshire. His forthcoming book, a history of consumer credit surveillance in the United States, will be published by Columbia University Press.

“Who is Harriet Tubman?” This was the most popular Google search on April 20, 2016, the day the U.S. Treasury announced a series of new designs for the nation’s currency. The revamped notes will include significant changes to the backs of the $5 and $10 bills. The faces of Marian Anderson, Eleanor Roosevelt, and Martin Luther King, Jr., will appear opposite Abraham Lincoln on the $5, and marching suffragists and five feminist heroines will appear opposite Alexander Hamilton on the $10. However, all of these flipside modifications were completely overshadowed by a frontstage coup: Andrew Jackson will be replaced by Harriet Tubman on the face of the $20. It was this news that sent thousands googling.

Tubman, of course, is the former slave and abolitionist whose heroism as a conductor of the Underground Railroad and as a Union spy is legendary. Her ascent to the $20, and Jackson’s demotion to the back of the bill, has been hailed by many as a step toward rectifying historical wrongs, sexism in particular. Since the first greenbacks came off the presses in 1861, the faces of U.S. currency have been overwhelmingly male. Indeed, the nation’s paper money is a veritable pantheon of scowling, hirsute men: founding fathers, presidents, generals, senators, cabinet members, inventors (Robert Fulton and Samuel Morse), and even an Italian mercenary (Christopher Columbus). It was the absence of women that animated grassroots efforts to change the Treasury’s long-standing designs.

To say that women have never appeared on U.S. paper currency is not quite correct. Martha Washington graced the front of a late-nineteenth-century silver certificate, and Pocahontas appeared in vignettes on some of our earliest legal tender. These exceptions notwithstanding, it is more accurate, and also more insulting, to note that real women have been missing from the nation’s currency. Images of allegorical women—Freedom, Liberty, Victory, and Columbia—regularly appeared on U.S. paper money prior to its standardization in 1929. These idealized nymphs, neither individuals nor even human, can be seen as a double exclusion of actual women. In this light, the redesigned $20 is even more remarkable. Tubman will be the first non-imaginary woman to appear on the nation’s currency in more than a century.


Tubman is not the first woman on the $20. An allegorical Liberty appears on the $20 denomination of the Demand Notes of 1861, the inaugural issue of U.S. national currency. Source: Wikipedia (public domain), attributed to National Museum of American History (Image by Godot13).

Though public reaction to the redesigned notes has been positive, not everyone is happy. Tubman, after all, is not just a woman; she is an African-American woman. At a moment in American history when racial discord has resurfaced with fresh intensity, this is no small detail. Defending the $20 status quo, Republican presidential candidate Donald J. Trump condemned the Tubman redesign as “pure political correctness.” Going further, Fox News analyst Greta Van Susteren accused the Obama administration of “dividing the country” with its “dumb” decision to replace Jackson. For all of Old Hickory’s military prowess and common-man appeal, his support for slavery and his role in the genocidal deportation of Native peoples merely underscore Tubman’s significance. What began as an effort to make women visible is now also about the race of our money.

Still, looking past the historical justifications and emotion evoked by the new designs, we might ask a more pointed question. Why do we even bother putting portraits on our paper money? Everyone knows that our currency is a fiction. It is not backed by gold or silver or any earthly commodity. The printed slips are inherently worthless – as in, they are literally worth nothing as material objects. Yet drop a $20 bill on the sidewalk and it will magically disappear. The real value is not in the paper or the visages that adorn it. It is in the official promise printed directly on all denominations of the thing itself: “This note is legal tender for all debts, public and private.” Reduced to bare essentials, paper money is nothing more than a contract bearing the facsimile signatures of government bureaucrats.


This Muslim American Life


Moustafa Bayoumi, a professor of English at Brooklyn College CUNY, is the author of How Does It Feel To Be a Problem?: Being Young and Arab in America (Penguin), which won an American Book Award and the Arab American Book Award for Nonfiction. His latest book, This Muslim American Life: Dispatches from the War on Terror, was recently published by NYU Press. Photo by Neville Elder

Check out Keith Feldman’s review of This Muslim American Life in the May 2016 issue of The American Historian.

Could you briefly describe This Muslim American Life?

My book examines the multiple ways that the War on Terror has affected Muslim Americans. The War on Terror is really producing a culture of its own, much in the same way that there used to be a Cold War culture in this country. The Cold War transformed our politics, our laws, our military, our television shows and our movies, and all of those elements fed continuously off of each other. War on Terror culture does something similar, but with very specific consequences for Muslim Americans. Since 2001, Muslim Americans have been subject to mass arrests and deportations, elevated levels of vigilante violence, bouts of national hysteria around our houses of worship, and much more. In the meantime, Muslim American history is almost completely forgotten while Muslim Americans themselves are caricatured in our media and our politics and are always associated with terrorism. This Muslim American Life is an attempt to analyze this War on Terror culture from the perspective of Muslim Americans. The book is divided up into four sections—history, theory, politics, and culture—with the prevailing idea being that this War on Terror culture is created by the amalgamation of these various elements.

Why did you decide to write this book? Is it intended for a particular audience?

The War on Terror is discussed every day. It fills our newspapers and television screens. And yet, we still don’t have a good sense of how it structures our lives or the kind of political ecology it continues to create. What’s at stake is no less than our professed values of equal treatment under the law. So I felt it was necessary to write about these things and to bring the vantage point of Muslim Americans to general readers. But I think it’s important to underscore that I’m not entering these discussions as a victim but as a member of American society. We all lose if the rights of Muslim Americans can easily be trampled upon. One has to be able to talk about the consequences of politics. I do so not to evoke sympathy from a reader but to call out that which I think is wrong. I wrote this book, in other words, not solely out of concern for Muslim Americans but primarily because we should all care when a community is singled out collectively for suspicion and blame. This is a problem that should be faced squarely by all Americans.

In your book, you are interested in dissecting the worldview that imagines all Muslims to be “potential” terrorists. What kinds of historical comparisons seem useful (or not useful) to you for exploring the effects of this racial and religious-based cultural fiction?

To some degree, all bigotries are premised on fictions of a potential threat, but there are moments when the threat is seen as imminent and other times when it is understood as potentially happening at some unknown time in the future. In the lead-up to Japanese internment during World War II, we can find a disturbing similarity. The known fact that there had been zero acts of sabotage on the part of Japanese Americans following the attack on Pearl Harbor was used as evidence that they must therefore be readying an assault for the future. On February 21, 1942, Earl Warren, the attorney general of California at the time, testified in front of a congressional committee that

many of our people and some of our authorities and, I am afraid, many of our people in other parts of the country are of the opinion that because we have had no sabotage and no fifth column activities in this State since the beginning of the war, that means that none have been planned for us. But I take the view that that is the most ominous sign in our whole situation. […] I believe that we are just being lulled into a false sense of security and that the only reason we haven’t had disaster in California is because it has been timed for a different date. […] Our day of reckoning is bound to come in that regard. When, nobody knows, of course, but we are approaching an invisible deadline.

As the historian Roger Daniels writes in Prisoners Without Trial, “as foolish as this argument sounds, it convinced many Americans” (p. 37). Today, Muslim Americans are often discussed similarly. Policies, such as the NYPD’s blanket surveillance program on Muslims in the New York area, are premised on the idea that Muslims will, at some vague point in the future, become terrorists, so the NYPD ought not be lulled into a false sense of security and must stop Muslims from actions they haven’t even yet contemplated. In testimony delivered to a Senate committee in 2007, Lawrence Sanchez, the architect of the NYPD surveillance program, said:  “Rather than just protecting New York City from terrorists, the New York Police Department believes that part of its mission is to protect New York City citizens from turning into terrorists.”

But we should also look at our contemporary moment and not only back in history to understand how pervasive and pernicious this way of thinking is. We live in an age premised more than ever on this kind of preemptive and predictive modeling. From the Bush Administration’s doctrine of preemptive war in Iraq to the belief that massive data collection can algorithmically predict individual human behavior, we are slowly strangling the possibilities of life and presumptions of innocence by the misguided belief that eliminating uncertainty will make our futures more secure.


Black Lives (and) Matter: George Zimmerman’s Gun and Artifacts of Racial Violence in American History


Cameron B. Strang is an assistant professor of history at the University of Nevada, Reno, and will be a scholar in residence at the Institute for Advanced Study in 2016–2017. His articles include prize-winning pieces in The William and Mary Quarterly and The Journal of American History, and he is co-editor of a special issue of Early American Studies on the environment in early America.

On May 11, 2016, Florida man George Zimmerman put the handgun he used to kill the unarmed black teenager Trayvon Martin up for auction, advertising the weapon as “a piece of American History.”[1] There has been fierce debate over whether Zimmerman should have been forgiven for killing Martin in 2012 (he was acquitted under Florida’s Stand Your Ground law), but Zimmerman can perhaps be forgiven for believing that his weapon is just the sort of thing that would fascinate history buffs for generations to come. This is because stories inhere in things—including particular weapons and the remains of those they kill—and Americans have a habit of collecting and displaying things that seem to embody stories they want to tell about themselves and their country. And it makes sense that many of these objects and stories are about racial violence, because racial violence is a central thread of the American story.

Zimmerman and his gun have aggravated an old tension in America’s national story, a tension between casting America as a melting pot where ethnic identities blend to create something new and better, and an America in which individuals and groups take pride in maintaining boundaries that define them as an ethnically or racially distinct people. Trayvon Martin’s killing—along with the deaths of several other black men at the hands of white police officers over the last few years—has helped spark the Black Lives Matter movement, in which a sense of racial and socioeconomic difference has fueled political action. This movement has led some Americans to take up the melting-pot narrative and declare that “all lives matter,” a claim that many activists for black justice believe misses the point. Zimmerman, for his part, pledges to use a portion of the proceeds from the sale of his gun to “fight BLM [Black Lives Matter] violence against Law Enforcement officers,” reassigning the blame for racial violence away from white cops and (back) onto blacks themselves.[2]

Americans have a long history of using objects to tell stories about racial violence as a way to emphasize their own identities. This is particularly evident in Zimmerman’s home state of Florida. During the Second Seminole War (1835–1842), a conflict in which Anglo-Americans aimed to remove Indians from Florida to make the peninsula safe for plantation slavery, whites and Indians collected and circulated each other’s remains and used them to tell stories that defined Florida’s Natives as a distinct ethnic group of “Seminoles.” For Natives, the circulation and display of Anglo scalps helped tell a story about their pride in resisting white domination that, combined with the ritual uses of scalps, contributed to their ethnogenesis. Anglos, for their part, collected Indian skulls as war trophies and, sometimes, circulated these remains to Euro-American craniologists and phrenologists. As these white experts analyzed Florida crania, they defined Seminoles as a group whose cranial dimensions seemed to tell a story of predestination, that Seminole brains were inherently prone to violence. As much as we’d like to believe that interactions among peoples lead to deeper understanding, multiethnic encounters in America have just as often convinced various groups of their irreconcilable differences.[3]


The Historical Foundations of Abundance

Donald Worster is the Hall Distinguished Professor Emeritus of History at the University of Kansas and Distinguished Foreign Expert in the School of History at Renmin University of China, where he teaches world and environmental history. He is past president of the American Society for Environmental History and the Western History Association, a member of the American Academy of Arts and Sciences, and the author of nine other books of history, including the Bancroft Prize-winning Dust Bowl: The Southern Plains in the 1930s.

Historians are so obsessed with discerning or making new “turns” these days that it may seem our understanding of the past is running in circles. We recycle old ideas and old debates in an endless quest for the truth. One of the latest of these shifts in historical interpretation is the “material turn,” which brings us back to older arguments that cultural reasoning or abstract ideas do not determine everything in the human story. Material conditions matter. But this time we are discovering something new: old notions of materiality, focused on technology and production, class interests, debtors and creditors, do not exhaust the possibilities. In fact they may be only superficial manifestations of deeper material conditions. Such powerful forces as climate change, soil fertility, microorganisms, and indeed the evolving complexity of the ecosphere within which all societies, ancient and modern, have existed may be the most fundamental engines of history.  We cannot separate human aspirations and failures from the natural environment or the laws of matter and energy.

Perhaps the single most compelling idea behind the history of the United States, as David Potter told us more than a half-century ago in his book People of Plenty: Economic Abundance and the American Character (1954), has been the notion of abundance and of the infinite economic growth such abundance long seemed to promise. It has been at the core of American thinking for centuries, and now it has gone global, driving almost all societies as much as it ever did Americans. Economists may debate whether “growth” can continue indefinitely into the future, but typically they agree with Potter that the idea has been a defining one for Americans. Yet the economists, like Potter, fall silent when they confront the question: where does the idea of abundance or growth come from? How, where, and why did it become so persuasive to so many people? And if it had material foundations, are those foundations beginning to disappear in our time?

Recently, economists have been talking a lot about the decline of American growth rates since around 1970. Robert J. Gordon, for example, in his widely admired new book The Rise and Fall of American Growth: The U.S. Standard of Living since the Civil War (Princeton University Press, 2016), warns that a hundred-year spurt of growth is coming to an end and there is nothing that policy makers or social scientists can do to stop the process. He has compiled the all-time greatest catalog of “advances made” in energy consumption, clothing, housing, communications, and transportation over a century of American “progress.” But why that century ever occurred, or why it is now running out of steam, eludes his grasp. Somehow, he maintains, it was the mysterious force of “technology” that once produced the miracle of growth, and now that technology is faltering, giving smaller and smaller returns, leading us back into an older pattern of far slower, more inconsequential, and more inconspicuous change.

Gordon is a “materialist,” make no mistake, but I would argue that he is not material enough. What is missing from his narrowly conceived history of innovation is any awareness of how ecological conditions in America were at least as important as the fabled inventors of railroads or steel mills in making us a “people of plenty.” And today it is changing ecological conditions that explain—better than a loss of American inventiveness—that winding down of growth.


Skill Building through Game Building in a Public History Class

Phillip Payne is a professor of history and department chair at St. Bonaventure University where he teaches a variety of courses in United States History. He also teaches undergraduate courses in public and digital history. He is the author of Dead Last: The Public Memory of Warren G. Harding’s Scandalous Legacy (2009) and Crash! How the Economic Boom and Bust of the 1920s Worked (2015).

In recent years, defenders of the humanities and proponents of studying history often stress that it improves a student’s reading, writing, and analysis. While this is true, professors often teach these skills only implicitly, failing to communicate explicitly to their students that a key value of studying history is learning how to learn—the ability to identify and adapt to the rapid change that is the hallmark of the twenty-first century. If these skills remain vague and implicit, how does a young history major explain them to potential employers? This past semester we experimented with giving students marketable skills and the language to explain them.

Eddie Keen explored a classic board game theme: railroads. Players could explore the logistics of moving troops and supplies during the war. Eddie provided a great deal of historical information to inform players’ decisions.


For a number of years I’ve been teaching undergraduate public and digital history classes with Dennis Frank, University Archivist. The public history class, in particular, has always contained discussion of career options related to history and some public history projects based on archival work. But last year, we decided to overhaul the class to place greater emphasis on how the assignments related to students’ marketability. In overhauling the class we sacrificed some discussion of types of public history in favor of a large project that incorporated iterative design, information organization, group work, presentations, and different types of writing. To get there, we decided to build a game design assignment. This allowed us to look at gamification and the increasing role of games in a wide variety of occupations, including education. To underscore our point, we gamified the assignment itself, having students pursue levels and goals rather than traditional grades (although we did have to translate them to grades ultimately). Students would design a game while participating in a gamified class. On the assignment, I spelled out how “the game design project will develop directly marketable skills while building your game(s), including: group work, peer review, innovation, problem solving, design & prototyping, and project management.” Most of these are recognizable, transferable skills we expect history majors to develop. Here we spelled them out.

Focusing on game design also allowed us to address how learning and commerce work on the internet. People play a lot of games online, sometimes without realizing it. Gamification is a twenty-first-century phenomenon that one doesn’t often associate with history courses. Indeed, the word gamification does not have much of a history, having come into existence only a few years ago. However, gamification and game design presented some challenges. An obvious hook for students is the lure of video games, but expecting students to know how to code or asking them to master game engine software in one semester was unrealistic. Instead, we decided to have them build board games. Board games would allow them to explore the underlying ideas of gaming and the importance of gaming in the information age without mastering complicated software. Perhaps ironically, the internet has sparked a renaissance in board gaming. The rise of Eurogames (games built on mechanics such as indirect competition and player cooperation popular in Germany) in the United States opens the door to a richer and more diverse board gaming experience. Game design allowed us to retain many of the important features of other assignments—integrating content, archival research, considering audience, display, iterative design—and added a few cool elements. Game design also lent itself to a discussion of the psychology of learning and decision making. Players have to make meaningful choices; otherwise, the game isn’t interesting. Player choice lent itself to a discussion of causation in history—that people’s actions had consequences.

Game design has a rich, robust body of literature across many disciplines that addresses using games to teach problem solving and design. Bringing games into the classroom is not new. The History Teacher has published several articles on teaching with games, and The Chronicle of Higher Education has run recent articles on gamification in education. Oregon Trail is a classic, and Assassin’s Creed is showing up in classrooms. The popularity of video games has driven the recent rise of gamification. Gamification provides students a skill they can list on LinkedIn that might open doors in a variety of careers.


Mike Wesolowski combined elements of a traditional board game with a role playing game to give players more of a sense of the decisions members of the regiments faced.

As it turns out, another advantage is that game designers like to put useful material online. When preparing for the class we took the Coursera MOOC “Gamification” from the Wharton School of Business and listened to podcasts, most notably the Game Design Roundtable. For the class, the edX MOOC “Introduction to Game Design” served as the primary text, supplemented with additional materials. In that MOOC, MIT professors focused on the development of educational games. These are just a few of a growing list of resources that place game design within a broader context of business, education, and social change.

We kept coming back to the idea that the twenty-first century belongs to the lifelong learner, a notion reinforced by our exploration of MOOCs and podcasts. As we designed the class we kept in mind widely circulated lists of skills and attributes that employers are looking for. When writing the syllabus, rather than using history jargon, we used terms appropriate for LinkedIn. “Preparing drafts” evolved into “collaborative iterative design.” “Organizing research” evolved into “information architecture.” I wrote on the syllabus: “We live in a world awash in information, most of it digital or electronic but not all. Often now the issue is not finding the information, but organizing, sorting, synthesizing, and verifying information. These are skills that history majors often intuitively develop. In this class we will make them explicit as we move between archives, secondary literature, and design process.”


Between Memory and History: U.S.-Cuba Rapprochement in a Time of Remembrance


President Barack Obama and Cuban President Raúl Castro at a welcoming ceremony for Obama in Havana, Cuba, on March 21, 2016.

[From the May issue of The American Historian]

Louis A. Pérez Jr. is the J. Carlyle Sitterson Professor of History and the Director of the Institute for the Study of the Americas at the University of North Carolina at Chapel Hill. His most recent book is The Structure of Cuban History: Purpose and Meanings of the Past (2013).

The premonition of a historic moment often contemplates the imminence of change, a sense of crossing thresholds of before and after, even if the “before” is unknown and the “after” is unknowable. December 17, 2014, was one such moment, the date on which President Barack Obama announced the resumption of U.S.-Cuba relations. A “historic announcement,” pundits and policymakers uniformly agreed, a “historical change in U.S. relations with Cuba” promising “a historic new era.” This was the United States pushing history forward, the United States “making historic changes” and initiating a “historic overhaul of relations” among President Obama’s “trophy legacies” based on a “bold decision to chart a new course in U.S.-Cuban relations.” By traveling to Cuba in March 2016, President Obama “stepped into history” and “made history,” seizing a “historic opportunity” during his “historic three-day visit” to Cuba, where he delivered “a historic speech to the Cuban people.”

Traces of hubris, to be sure, and evidence of the ways people often live hermetically sealed within their own history. But self-congratulation over an American “historic achievement” implied more than the performance of a self-contained history. The narratives also served to lift aloft a larger history, informed with a point of view as a matter of discursive intent and into which was inscribed the plausibility of the American purpose, that is, a politics. The narratives anticipated the history through which to problematize U.S.-Cuba relations, a way to merge myth and memory into a usable past, thereupon to situate the United States as subject and Cuba as object, the Americans as actors and the Cubans as acted upon. In the United States, the past fifty-five years of U.S.-Cuba relations are remembered as a breezy blur of some type of discord, recalled variously as a time of “chilly relations” (CNN), or “decades of frigid relations” (ABC), or “years of unfriendly relations” (Yahoo). The Americans remember the past fifty-five years through their intentions. “The United States has supported democracy and human rights in Cuba through these five decades,” President Obama recounted, a policy “rooted in the best of intentions.” The past half-century was an “aberration,” the President insisted, all in all, a mere “chapter in a longer story of family and friendship.” [i]

The Cubans were at the receiving end of American intentions. A half-century of “chilly relations” was experienced in Cuba as a condition of prolonged siege, of single-minded American resolve at regime change: one armed invasion, scores of assassination plots, years of covert operations, and decades of punitive economic sanctions. An embargo—“harsher than on any other countries in the world,” acknowledged Assistant Secretary of State Roberta Jacobson in 2015—designed with malice aforethought to inflict adversity upon the Cuban people and to deepen Cuban discontent through economic privation, in the hope that hardship would serve to bestir the Cuban people to rise up and overthrow the Cuban government.

For Cuba the conflict with the United States has less to do with the Cold War than with the long history of U.S.-Cuba relations.

Conflicting memories of the past loom large in the process of rapprochement. Indeed, the United States and Cuba renew relations by way of profoundly different understandings of the history from which the historical present has emerged. The United States envisages normalization as ending antagonisms born of East-West tensions, the restoration of relations ruptured as a result of the Cold War. The time had come, President Obama exhorted, “to leave behind the ideological battles of the past.” The “policy of isolation,” the President insisted, was “long past its expiration date,” a policy “designed for the Cold War [that] made little sense in the 21st century.” Obama was determined to dispose of “the last vestige” of the Cold War. “I have come here,” the President proclaimed upon his arrival in Havana in March 2016, “to bury the last remnant of the Cold War in the Americas.”

The Cold War as temporal framework through which to historicize the U.S.-Cuba conflict is not without far-reaching policy implications. It serves to challenge the legitimacy of the Cuban revolution as something of a Cold War anachronism, a way to diminish the moral authority of a government presiding over a system deemed bereft of hope and discredited by history, thereupon to invite the inference that the Cuban regime was also “long past its expiration date.” The Cold War as cause and context of ruptured U.S.-Cuba relations, moreover, serves to efface all the history that preceded it, to foreclose the relevance of the larger history from which U.S.-Cuba relations entered the Cold War phase in the first place. “The Cold War has been over for a long time,” President Obama pronounced at the Summit of the Americas in Panama in April 2015. “And I’m not interested in having battles that, frankly, started before I was born.” Nor was the President interested in the history of U.S.-Cuba relations recounted by President Raúl Castro at the Summit. “If you listened to President Castro’s comments earlier this morning,” Obama remarked later in a press conference, “a lot of the points he made referenced actions that took place before I was born, and part of my message here is the Cold War is over.” The “actions that took place” before the President was born are thus consigned to the dustbin. “It is time, now, for us to leave the past behind,” Obama exhorted.
