American Quarterly. Johns Hopkins University Press. 46.2 (1994): 123-138. Reprinted with permission.
This talk is dedicated to the memory
of two past presidents of the American Studies Association,
Carl Bode and Russel B. Nye, both of whom passed away in 1993.
We must remember that Carl and Russ were both young men--in
their thirties or early forties--when they began ASA and yet,
until their deaths, they exemplified the populist,
multidisciplinary, experimental, adventurous, eclectic,
politically progressive, and antidiscriminatory principles that
characterize the best of ASA.
Change: it was the byword of a recent presidential campaign. It is the theme song of our current convention. It is what is happening to the American Studies Association right now as our membership approaches 4,500. A decade ago, we numbered only 1,800.
The theme of our convention, "Cultural Transformations/Countering Traditions," emphasizes the changing nature of our disciplines, but those who put together the program had to deal with still other aspects of change. Thadious Davis and her committee received more than 650 panel, paper, and performance proposals from which they assembled the largest program in ASA history. With over 800 people participating, we can have papers on archives, Melville, the James family, Du Bois, Boston Brahmins, the Tennessee Valley Authority, and slave narratives--subjects we have long recognized as part of "American studies." We can also hear presentations on Turkish pants and Tiffany hoods; Japanese Elvis impersonators; Shamu the Whale; the lesbian subject in Arab culture; Irish and Cambodian connections in Lowell, Massachusetts; the "white" problem in American studies; Pueblo figurative ceramics; low-rider cars in northern New Mexico; Haitian Rara celebrations; American Orisa worship; Spiderwoman Theatre; female fandom and professional wrestling; and even (proving that cutting edge scholarship can have a sense of humor) a presentation on "The Boyz in the Hoods: Academy Bashing as 'Popular' Culture" and another on "Discourse and Dat Course: Postcoloniality and Afrocentricity."
[Cathy N. Davidson is professor of English at Duke University and president of the American Studies Association.]
Still, with all of this abundance and despite the committee's extraordinary efforts to make a program as inclusive as it is adventurous, the reality of receiving twice the number of proposals that were submitted for our last convention means that this year more members received rejections than ever before. We have heard from people who wonder if they are destined to never give a paper at ASA. We have heard from longtime ASA regulars, turned down for the first time, who worry that they are being pushed aside by younger, hipper critics, while some younger, hipper critics whose proposals did not get accepted are sure that ASA is too stodgy for them and that this is not an organization where their voices will be heard. Growing pains are one of the less appealing aspects of change.
It is not surprising that presently the American Studies Association is experiencing some growing pains. In a recent membership survey, nearly 50 percent of the respondents noted that they had been ASA members for fewer than six years. That is a remarkable statistic that suggests the potential for tremendous dynamism right now. Yet, lest we wax too euphoric over our growth and our youth, it is also important to acknowledge that higher education in general is in one of its worst crises in American history. It is precisely in the complexity of this situation--growth and decline--that the American Studies Association faces some of its most significant challenges to date.
These are the issues I began to focus on after July 1, when I officially became ASA's twenty-seventh President and began to look ahead to tonight's address. What changes and challenges are we facing as American studies? How can we think about change both theoretically, as an intellectual and social process, and practically, as it relates to our association now, in 1993? All summer long, I kept trying to conceptualize models of change that might be useful for us at this juncture. But it was a strange time--in the Midwest there were terrible floods; in the Southeast, where I live, we were choked by drought. It was hard to conceptualize change as the temperature soared to 105. So, it was at that point that I did what everyone else did this summer. I went to the movies....
It is bigger than a paradigm shift.
Darwinian models of change as evolutionary progress imply academic fights to the death, scholarship red in tooth and claw. By this model, natural selection prevails, and those at the top dominate until they are succeeded by their betters. "Progress" can thereby be seen as both pervasive and irreversible--as, according to Georg Simmel, simply "the laws of history." The result is the arrogance of the new, the easy omnipotence that comes with believing oneself to be the (current) top of the intellectual food chain.
However, and as Stephen Jay Gould among others has pointed out, the social Darwinian model of change is dubious on two counts. First, claims of embodying the new order are usually self-deceived. To think of ourselves as autonomous, outside any tradition at all, is to disavow the conditions that allowed and inspired our present existence. Second and concomitantly, claims to being totally new are inherently self-defeating since they demean the tradition that will allow our own work to live beyond the next critical mutation that threatens to render our ideas extinct. Yet, current scholarly fashion privileges a Darwinian rhetoric. How many times have we had to read those dismissive labels used to fix that which is "retrograde" simply by naming it: a "typical fifties New Critic," a "typical sixties social historian," a "typical seventies feminist," a "typical eighties multiculturalist"--all such glib labeling of the adversary is, of course, a typical nineties critical move. Yet what is the point? No one--no one--wants to be rendered obsolete. If we embrace a model of change by which ideas are discarded as rapidly as, say, computer software, then we have only ourselves to blame for our imminent obsolescence.
The pitfalls of the Darwinian model of change are precisely, if unintentionally, foregrounded in the movie Jurassic Park when the culminating effort of humanity's most brilliant scientists is the recreation of the dinosaurs. These former kings of the earth are brought back as an entertainment that can be safely (and profitably) theme parked. What could be more demeaning? No wonder the dinosaurs are soon on a rampage. Briefly unextinct, they are determined not only to survive but also to dominate and obliterate their upstart adversaries. If these born-again behemoths were literate, they would be publishing a Jurassic equivalent of the Dartmouth Review. Being only dinosaurs, they must content themselves with hunting humans, in the latter's kitchens and dining rooms even. That the humans are finally, happily saved by the fortuitous entrance of a still-larger predator only warns how perilously survival hangs when it depends on Tyrannosaurus Rex being more interested in attacking something other than you.
This brings me to the main import of the Jurassic Park model of change for our profession right now: Be careful who you call a dinosaur. Velociraptors are still the best hunters. They are back--and hungrier than ever. (Then again, one of the movies previewed with Jurassic Park was Weekend at Bernie's, Part II--with its memorable motto: "Bernie's Back.... And He's Still Dead.")
In some ways this is the inverse of the Jurassic Park model. In the Line of Fire is also about a "dinosaur"--or so one of the younger Secret Service men terms Frank Horrigan, the old agent played by Clint Eastwood. Frank more sardonically calls himself a "living legend," since he was there the day Kennedy was assassinated and is thus the last surviving agent to let a president of the United States die on his watch.
He also puts down a brash young counterpart with a sagely reflective quip: "You know, there was a time around here when I was almost as arrogant as you." Now Frank knows himself better. I am a "white male piano-playing heterosexual over the age of fifty," he confesses to another Secret Service officer who is young, mouthy, capable, and even female; he thus acknowledges his identity before she can level her charge.
The generational allegory I took away from this movie comprises two parts. First, much current academic controversy seen as ideological is really about aging. The implication is that the old days really were "good"; nothing in the present measures up; the future is doomed unless the old troopers strap on their guns once more and wisely intervene. That, after all, is what it means to be an old trooper--and I wish all of us the good fortune to live long enough to become one.
Second, there is something comforting in the suggestion of the ongoing circularity of human experience. As opposed to Jurassic Park's metaphors of extinction, In the Line of Fire gives us metaphors of second chances. Thus Frank can still be, in part, an unreconstructed sexist of the fifties, but he is also a man who has mastered more than a few of the lessons of feminism. He cries; he turns to a woman for help; he acknowledges that he has "vowed to never again let my career come between me and a woman"--a distinctly contemporary version of "a man's gotta do what a man's gotta do."
In this same context, I cannot resist the nostalgic ploy of remembering my first ASA convention. Russel Nye, already a friend and mentor whose position I had been hired to fill at Michigan State (an impossible task), introduced me to Carl Bode. It was wonderful to spend part of an afternoon with these two genial, wise, generous "Grand Old Men" of American studies. It was also fun to listen to two past presidents express their considerable amusement (even glee) that, in training at least, I represented something new but also something remarkably familiar. I was a multidisciplinary archivist interested in the interconnections between literary texts and their historical contexts. As they explained (and this is not an official history of our association by any means), ASA had been founded by Carl and others partly as a refuge for historicist literary critics professionally marginalized by the ascendancy of New Criticism. They were pleased that they had stuck around long enough to witness the stirrings of historicism all over again. Interestingly, our membership is swelling just as we are seeing New Criticism coming around again, this time touted as something called the "New Aestheticism."
In the Line of Fire offers us another allegory of change as well. Frank's antagonist, a would-be presidential assassin who styles himself "Booth" (after John Wilkes), maintains that "there's no cause left worth fighting for--all that's left is the game." The president himself is little better. He is a poll-obsessed politician surrounded by yes-men and image makers. He believes only in whatever it might take to get reelected. Performativity looms large elsewhere too--Secret Service men running alongside the car not because it is safer but because it is telegenic. It is all self-reflexivity. "They're gonna write a book about us, Frank," Booth says at one point.
The payoff for all this self-reflexivity is personal, not professional. Frank hangs in there as an agent partly because he believes in himself, partly because he is after a second chance. Using an intriguing combination of new skills and old intuitions, Frank redeems himself by protecting his president. He may be a dinosaur, but he is a dinosaur who has adapted brilliantly to the nineties. He ends up a hero in a cynical age that not-so-secretly yearns for heroism--and he gets the beautiful agent too.
The movie's metafilmic musings on how the media shape reality, as well as on how "reality" is staged for the media, also apply to our present professional life. Since the publication of the Pentagon Papers, it has been hard not to be a paranoid. Right now, one particularly wonders who is manipulating the public's perceptions of academia and to what end. When before, in human history, has one become a folk hero as a result of calling someone a "water buffalo"? These are strange times, where "free speech" gets touted as a virtue by precisely the same people who are trying to curb it. George Will, to my knowledge, still has not written a column protesting "free speech" violations in the firing of Cecelia Konchar Farr, the young Americanist at Brigham Young University who lost her job after participating in a prochoice rally. The infamous "left-leaning press" has been highly selective (i.e., conservative) in the speech it champions. No wonder so many professors are currently feeling caught in the line of fire.
This one is mostly for the graduate students in the audience, starting, as it does, with a Harvard Law School graduate's dream of a first job--a salary just under six figures, a house and a Mercedes thrown in for good measure. I hate to be the one to break the bad news, but this is not what you will be offered at MLA or OAH.
Then again, it is a nightmare of a job too. They work you like a dog. They keep you under constant surveillance (the house and the Mercedes are bugged). Your peers look alarmingly alike and spend a lot of their time talking about how great they think the firm really is. When you find out it is all run by and for the mafia, you are warned not to rock the boat and told, if you object, you might not get out alive--all of which is beginning to sound a little bit more like your average, everyday English or history department.
In the barely buried subtext of this movie is a commentary on the culture of competition, on the way we teach students that the only way to score big is by wiping out the opposition. Read the average book review, for example. The author gets the knife, for that is how the reviewer proves how smart he or she is. When the hero of The Firm, Mitch McDeere, meets the big client down in the Grand Caymans, naturally he goes for the jugular. He did not get to the top of his Harvard Law class by being a choir boy.
The Firm has another possible relevance for the American Studies Association as well. You probably should not take a job at any university where everyone is a married, white, heterosexual male--even if you, too (like Mitch), are a married, white, heterosexual male.
Yes, there are three panels on food culture at this year's convention. And Like Water for Chocolate--the summer's multilingual, multinational, multicultural hit--signals mass culture's growing awareness of what ASA members increasingly argue: that "American studies" requires reference beyond national borders and that America is not--and has never been--monolingual. This delightful movie reminds us that such issues perennial to American studies as "national character," "exceptionalism," "isolationism," "imperialism," and "the frontier" are all challenged by a borderlands definition of "America." Much of what has been claimed as "American" is really a new world, colonial, and postcolonial phenomenon. American exceptionalism is often exceptionally ignorant of anything other than the Puritan tradition founded on Plymouth Rock. The South? The Spanish? The French? The Southwest? Canada? The Caribbean? Everything south of the (current) Mexican border? The map of the New World has been reconfigured many times--and often at enormous human cost--yet the definition of "America" within the academy has been remarkably rigid, unitary, and exclusionary for reasons that are institutionally explicable but theoretically indefensible. To generalize about America on the basis of Puritanism and those white scholars (mostly at the Ivy League schools) who pioneered Puritan studies is to promulgate exactly the racism endemic to the United States's current hysteria over immigration, the demographic shift to non-English-speaking and nonwhite populations. In short, the instantiation of an exclusionary regional, linguistic, ethnic model of "America" continues to the present but has roots in the earliest definition of what counts as America. American studies cannot afford to wear such blinders any longer. Postcolonialism is the theory; inter-American studies is the practice.
Can a hip young black guy really learn fealty to a tradition-minded old white guy thanks to the Japanese? That is the crucial question in this flick. Far from being a "Japan-bashing" film, Rising Sun reverentially describes and deploys such traditional Japanese values as personal loyalty (particularly the sempai-kohai relationship of senior to junior, teacher to student, mentor to acolyte) and giri (the practice of reciprocal responsibility, the idea that honor requires one to pay back what one has received). In Japan, every gift is also, implicitly, an obligation; every present presupposes a return present. Structured reciprocity--a socially regulated equalizing of relationships--is one feature of Japan that, I confess, I find extremely appealing.
But applied to academe in 1993, the sempai-kohai tradition also illuminates an important rupture. Before affirmative action reforms mandated national searches, open applications, and at least the appearance of fairness, university hiring often involved some individualist, American version of sempai-kohai. A senior American scholar with a good dissertation student could call another such scholar with a job and a hiring was accomplished. The giri of the situation of course meant that the first scholar would then, if occasion arose, hire the second scholar's student, and both students were expected to remain loyal to both professors throughout their working years. Fealty, under this system, was not hard to come by. It was a product of power. However, few of us like to admit that we are dealt with according to our power; instead, we prefer to perceive loyalty, honor, and respect as our due, as forms of approval, admiration-- dare I say it?--love.
I hasten to add that this old-boy model of the profession is partly mythical. I have heard enough stories from my elders about tits-for-tats that did not add up. But never mind. We all know myths are often more powerful than statistics or facts. Still, it is safe to say that affirmative action legislation and the so-called "democratization" of university politics did dramatically change hiring practices in the seventies and eighties. One can still exert power, certainly, but it requires a whole lot more work--coalition building, trading favors ("I'll vote for your candidate if you vote for mine"), working through committees. Equally relevant, a more diffused process breeds more diffused gratitude. Does one owe one's job to a mentor-sempai or to a hiring committee moved mainly by the intrinsic merits set forth on one's vita? It is always tempting to claim full and sole responsibility for your own success, to not owe anything to anybody.
Under this new dispensation of academic credit, older members of the profession especially can feel deprived, cheated even. Adoring young scholars seldom sit at their feet. The loyalty, admiration, and respect that were supposed to crown their declining years are in remarkably short supply. They are regularly represented as the worst possible Other: the not-quite-dead, white, male--racist, sexist, homophobic, reactionary.
There is something fundamentally unfair in all of this, even when the charges are substantially true. In the good old days, everyone knew who the bastards were, but, power being what it was, you also knew when and how to keep your personal opinion to yourself. In a system where power is diffuse, there is no need to disguise contempt, and there is often a certain credit in claiming it. Think about it: the same person who never dared accuse his elders of being tyrants can now be so accused by everyone--regardless of whether he is or is not. No wonder he cherishes memories of times past and joins conservative organizations that promise to turn back the academic clock--and thereby elicits still more jeers from his juniors.
At the same time, many of those juniors also think wistfully back to a time when one's professional life was not consumed by both endless committee work and increased pressure towards professionalization. Many universities now require not one but two books for tenure, and a third for full professorship.
With economic retrenchment, retirements far outnumber hirings, so everyone has more and more students. Furthermore, Americanists in the United States, and especially in the fields of literature and history, tend to be the workhorses of their departments, with disproportionately more students than colleagues in other areas or in other fields. This results in envy on all sides; an Americanist with twenty dissertation students feels beleaguered and overworked while the colleague in medieval studies with no dissertation students objects to the Americanist's purported departmental "power" and wants to restructure the graduate requirements (typically to require more courses in medieval studies).
And yet the point remains: we are exhausted. Even were one so inclined, who has time for sitting at the feet of the masters? Many of the would-be sempai look at the younger generation and see arrogant and self-promoting ideologues. Meanwhile, the erstwhile kohai, burnt out before they reach forty, are wondering what in the world ever happened to that ivory tower where you could read great books and think great thoughts. At the same time, state legislators, spurred on by the right wing, complain that we teach only two, three, or four courses a term and wonder why we should be paid so much for working so little. Of course, teaching is the real point of the whole profession. But how do you devote your energies to that task when the tenure committee is waiting for your second book, when you are serving on twenty dissertation committees, and also have weekly commitments to the women's studies program, the African-American program, the sexual harassment committee, the affirmative action committee, the Black Students Association--not to mention such normal duties as departmental meetings, hiring committees, curriculum committees, graduate admissions committees, and on and on and on? Sempai. Kohai. We are all in a pretty thick stew right now.
Now this one is really about change, a change that is particularly striking in the American Studies Association. I am talking about gender. Like Orlando, we have made a rather dramatic transition from male to female. Although only six out of twenty-seven ASA Presidents have been female, all six of these women (including yours truly) have been elected in the past seven years. But, because this shift is so recent, there is considerable uncertainty about what gender switching itself might signify. Is American studies being dangerously feminized, as was the secretarial profession a century ago, and can we all look forward to increasingly lower salaries and status? Or has American studies finally become a little fairer--even to the point of reverse-reverse discrimination? Coincidentally, a few hours ago, I happened upon an informal caucus of women scholars who were trying to come up with ways to give men more visibility in their organization. It was quite touching, really.
Yet let us not become too besotted with symbolic victories. I also regularly hear the bitterness expressed by many women of my generation or older who have made tremendous personal sacrifices to establish themselves in a profession whose tenure timetable took no account of their biological clocks. A male peer can have a much more successful midlife crisis, divorce the wife of many years who remembers him as a lowly assistant professor, marry an initially adoring graduate student, and start a second family--beyond the sexual peak perhaps but still firmly on the career one. Sometimes, Orlando notwithstanding, anatomy does seem like destiny, especially when the rewards of the academy, supposedly based on the universal of "merit," turn out to be freighted with an implicit gender system differentially apportioning support and sacrifice.
There are other nuances to recent gender "transformations" as well. A friend, an African-American female professor, remembers how when she took a new position a few years ago, everyone was perfectly pleasant, but, during her whole first year, she was invited to coffee or lunch only some four or five times by her white colleagues. Now, her department has just hired an African-American male professor. Two weeks into the term and not wanting him to experience what she endured, she called him up to suggest an impromptu lunch and was bemused (or maybe furious is the better word) to find that they could not meet for almost a month. He was being wined and dined virtually every day. Yes, Virginia, there still is sexism in the academy.
A Shakespearean title for the canon debates.
Regularly the media tell us that what we are doing now is "trivial" because (they say) we are no longer studying the classics, the real stuff, instead. "Alice Walker is being taught more often than Shakespeare," is the canonical battle cry, despite the numerous surveys that show, indisputably, that this statement is groundless. Shakespeare is emphatically still being taught. Stuff versus fluff? What the canon debates really mean is that the department's "Samuel Daniel man" (or his equivalent) may have been replaced by a multicultural, multilingual, multidisciplinary generalist who has students write border-crossing personal journals instead of essays on obscure sonnet sequences. Not necessarily a bad change, if you ask me.
But sometimes it is a costly change. What no one points out in the canon debates is how easy the Samuel Daniel man had it. He could specialize in and make a respectable career writing exclusively on a lesser Elizabethan poet (or the equivalent in other disciplines). I remember well the satisfactions of thinking, one day in 1985, that I probably knew more about novel reading, writing, and publishing in 1790s Massachusetts than just about anyone else on the planet. It felt good. I could teach a graduate seminar on early American fiction at the drop of a hat--and, if I wanted, I could have taught the same seminar for the next forty years.
It has been almost a decade now since I could comfortably assume that I knew more about the subject at hand than any graduate student in my doctoral seminars. I offered a course, recently, on technology and literary form. My students all wanted to talk about cyberpunk fiction and had to provide me with titles, homework assignments even. At Duke, we have recently begun an exciting cross-disciplinary, multinational, multi-lingual "Seminar on the Americas" program that has me reading everything from Belizean fiction to political analyses of Cree land and resource rights in Quebec. Next term, I will teach a seminar on the history and metaphysics of photography. Someone in that class is going to know what an f-stop is and how it works--and it will not be the professor.
In short, taking on new subjects is time consuming as well as risky. Your limitations as a teacher are regularly exposed. Yet, it is also exhilarating and fosters a new kind of collaborative pedagogy in which students must take responsibility for much of the research, reading, and class discussion. The teacher is not the sole arbiter of taste or truth.
But back to Shakespeare and Much Ado. Remember that Beatrice and Benedick at least had fun as they argued and insulted their way toward marriage. The canon debates have not been fun. They have been mind-numbingly dreary, largely because they have been posed in such simplistic terms. The canon crisis should be leading us to exciting explorations of enormously interesting questions--about periodization, nationalism, identity politics and about how we define and organize knowledge within the contemporary academy. Instead, what passes for curriculum debate in most universities today is scorched-earth guerilla warfare--obliterate everything that is part of any field other than your own. The model is not "political correctness"-- whatever that silly term really means. It is political stupidity. The general public reads about the so-called culture wars and wishes a plague on all our houses.
To compare the divisions in our profession to those in American cities would trivialize our nation's most significant and tragic social problem--racism, with its attendant poverty. Menace II Society reminds us, painfully, of hopes and dreams dashed, of the consequent violence and desperation that make the inner cities of the world's most affluent nation into places as bleak and dangerous as any on earth. Yet, we can still draw a moral or a model from this disturbing movie. When people are largely powerless, locked into what they do not want, deemed disposable, they often turn their violence on one another instead of on the forces that perpetuate their condition.
I am not predicting graduate-student drive-by shootings or the torching of convention sites. Our graduate students, however, do face one of the very worst job markets since World War II. On every level, from the shoestring community college to the prestige research university, there are cutbacks and retrenchments. The end of a mandatory retirement age also portends a bleak future for the graduate students of 1993. It is heartbreaking.
And what are we, especially those of us already established in our profession, doing? Are we effectively advocating increased public and private support for education? Are we making clear to the general public just what we do and how well American education actually works? Are we trying to find ways to curtail some of the soaring costs so that higher education remains a possibility for most of America's citizens?
Mostly we are not. Instead we are engaged in senseless internal squabbling. With so many real crises, it is pathetic to think of so much brain power being wasted on the political correctness debate.
It is all a shell game. While we are bickering over which hand conceals the pea, the barker is robbing us blind. And I mean us. Look at the way the media portray higher education--all of higher education, conservative and liberal, it does not matter. We used to be just a bunch of harmless and irrelevant eggheads. Now we are perceived as vituperative and mean spirited, more concerned with exploiting some square inch of intellectual turf than with preserving education.
Admittedly, the right-wing attacks on academe are debilitating. But I am not willing to say that we are powerless to change our situation now. I must admit that I used to take some solace from those two profound expressions of anthropomorphic wisdom, "never get in a pissing match with a skunk" and, its variant, "never wrassle with a pig in mud. You only get dirty--and the pig likes it." True enough! But the skunks and the pigs are setting the agenda these days. We all, every one of us, need to be fighting back on every level from the most individual to the institutional. It is tedious, but no more tedious than the futile angst of perceived powerlessness.
Even on a micro level, there are things we can all--every one of us--do. A few years ago, several of us at Duke found ourselves the focus of obsessive right-wing attacks, not only publicly and vituperatively in the media but also in the form of personal hate mail and even thinly concealed threats of violence. My initial impulse was depression, a desire to run away, but then someone decided we should form a pizza-and-beer group at which we could pen rebuttals to some of the more outrageous lies being written about us. Not all of our beery responses were published, but the collective act of self-defense and community solidarity got us through a bleak time and reminded us of how much we liked each other and why we got in this profession in the first place.
This brings me, in fact, to the ninth, or the Fugitive, model of change: it is the best allegory I know of what happens when you stop running away--and start fighting back.
Happily, I need not conclude with a jeremiad. The last film I saw this summer was Robert Townsend's zany Meteor Man, a movie about a mild-mannered, inner-city teacher who is hit by a meteorite and takes on a new identity. Suddenly endowed with Schwarzeneggeresque muscles; x-ray vision; and the ability to fly, to catch bullets, even to talk to dogs; Meteor Man uses his new comic-book-hero talents to rid his Washington ghetto of its most vicious street gang. Oh, what a lovely fantasy!
What is intriguing to me is that, even in a designedly silly movie such as Meteor Man, the most serious issues of our society are recognized. I decided to go to the movies this summer partly as a way to avoid working on this presidential address. Of course, what I discovered is that America at large is thinking about the same issues, the same crises that we are facing as an organization. Canon wars? Affirmative action? Generational conflict? Class, race, and gender resentment? As any good student of American studies would have guessed, all of these issues are out there in mass culture. Indeed, it is odd that universities are condemned for their modest moves towards so-called political correctness when, throughout the culture, various groups are working mightily to keep America's recurrent demons at bay by aiming toward PC--by which I mean political civility.
As I have tried to suggest tonight, the consequences of change are rarely easy--yet, given the inevitability of change, failure to adapt to a changing world is as disastrous as failing to recognize what our world owes (for good or ill) to the past. We can theorize the changes taking place in our association and in American academic culture in general, but it seems wrong to either celebrate change or deplore it since the one thing we can count on is that there will be further changes to come. Change itself changes. That, of course, is the implicit message of our convention topic "Cultural Transformations/Countering Traditions." The uneasy but ideally productive interaction between transformation and tradition, culture and counterculture is what makes life--including intellectual life--exciting. And the best members of this association--and I think of Russ Nye and Carl Bode, of course--have always been able to embrace this kind of change with exuberant grace.
Rather than an evolutionary, devolutionary, or even revolutionary model of change--one that defines change monolithically as a cause for celebration or mourning--I would like to end tonight by positing an ideal of "loose change"-- inconsistent, multivalent, uneven, unstable, indeterminate. Change of this kind is not positive or negative in an absolute sense but is always and necessarily good for some and bad for others or, even more accurately, is good for some people some of the time and bad for others at other times (and it is often unclear which is which except in retrospect).
Speaking personally, I am comfortable with a concept of inconclusive change, change as variable, mobile. And I am proud that the American Studies Association has always been a place where dissent and difference lead to change, not one in which change leads only to dissent and difference. That is, I have no desire for an organization in which there is a bland consensus, in which opposition is harmonized away. But equally important, I find the culture of permanent complaint boring, counterproductive, and ultimately supportive of a tunnel vision of the "good old days" that is nothing more than chimerical and nostalgic.
At the end of the movie version of Much Ado About Nothing, all of the false accusations, mistaken identities, unwarranted violence, and petty feuds have been settled and put to rest. The real villains end up behind bars, and small disagreements among the other players can be put aside temporarily so that everyone can rejoice for the few mythical days that make up the final feast in a comedy or, for that matter, an annual convention: "Come, come," as Benedick says, "we are friends."
Short of friendship, let us at least aspire to dialogue and civility.
As the bard also sayeth: "We'll have dancing afterward."
I would like to acknowledge the generosity of the Rockefeller Foundation, which awarded me a residency at the Bellagio Study and Conference Center, where most of this address was written. Colleagues at Villa Serbelloni (particularly Robert Solomon and Harald Atmanspacher) offered invaluable comments on a draft of this talk, as did Alice Kaplan; Emory Elliott; Marianna Torgovnick; and, most especially, Arnold E. Davidson.