Sunday, November 15, 2015

a gathering

by J J Cohen


Sitting in a coffee shop while daughter is at Hebrew school. I'm just back from a few days in Dallas, where I was welcomed into a rare kind of intellectual community: a gathering of undergraduates, grad students, and professors into difficult conversation, the humanities at their best. This vibrant collective owes everything to Jeremy DuQuesnay Adams and Bonnie Wheeler, and the open doors of their home.

Bonnie and Jeremy are the sweetest couple. Being with them for two intense days made me feel a part of the kind of family I dreamt about as a child, where love is palpable and intellect and humor are hard to tell apart. I always thought college would be like this, but had quietly given up on that kind of gathering until I visited SMU. I hope Bonnie and Jeremy won't mind my sharing this picture from my time with them. The image is blurred because I could not resist breathing awwww as I took the shot, and thereby accidentally moved my hand.

In another photo of Bonnie, displayed in her hallway, she stands in a white dress with long flowing hair and amazing white glasses. Both her hands are upraised: she fills the frame and makes it clear no one will pass her. She is wearing a white shawl and appears as a beautiful yet absolutely commanding Flower Child. I love the image for what it conveys about Bonnie's spirit. Here is what I once wrote about Bonnie:
Few have had such a positive impact upon Medieval Studies as Bonnie Wheeler. Her special mission throughout her career has been the cultivation of young scholars, fostering their intellectual growth through mentoring and their professional possibilities through guiding their research. She is and always has been an inspiration, a catalyst for change, and a fairy godmother for those working on nontraditional projects. 
The photo was taken at a time in the academy when people like Bonnie were so necessary ... and yet persecuted for not conforming to a severely limited academic type (especially for women). Bonnie has searing stories to narrate about medieval studies when she entered the field, but I can summarize their moral: the field is a far better place for women -- for everyone -- because of the shit she challenged and refused to repeat. The feminist fierceness of Bonnie's generation must never be forgotten (even as its focus, assumptions and aims have been, of necessity, widened). Nor should the hostility of the academy towards those who pushed against its status quo. That panic when the academic status quo is threatened continues.

During an afternoon tea that gathered the community in Jeremy and Bonnie's home, we had a wide-ranging conversation about micro-aggressions and the "feel" of college campuses. Some people did not know what micro-aggressions are, others saw their invocation as a threat to free speech, but everyone -- even when anger bubbled -- listened with patience and took time to think through uncomfortable realizations. Bonnie spoke eloquently about the dangers of the forced choice of thinking that college is either a space of challenge or a safe home (it is both). She also spoke about how second-wave feminism accomplished its activism, emphasizing the importance of consciousness raising and the face-to-face encounter.

The previous day I gave a talk to an energetic audience about Noah's ark and the stories we tell about climate change. The first question was about Levinas and the faces of the dead animals in the manuscript illustrations I showed. The question took a little while to be formulated, and some people in the audience became impatient, but it can take time to get things right, to be clear and complex at once. The humanities are not so great at providing answers, at least not easy answers. Yet they excel at teaching us to ask better questions. That takes some fumbling, some time, some willingness to stay with someone as they figure things out.

I left the tea briefly to chat with a graduate student about her thesis project. When I came back, everyone had gone upstairs to watch the news and behold in helpless fascination the attacks in Paris. A concert, a soccer game, restaurants and cafés. The terrorism had targeted the young.

Earlier in the day I'd visited the Texas School Book Depository where Lee Harvey Oswald built his sniper's nest. The exhibit there is good, and thorough, emphasizing what JFK accomplished -- as well as how many groups hated him for it (one reason conspiracy theories flourished immediately is that so many wanted him dead). The Peace Corps, the NEH, the NEA, the space program, expansion of voting rights, integration of public schools, and the formulation of a comprehensive federal civil rights bill are all his: art with science, humanities with being humane, an emphasis throughout on making the nation better for all who follow even if that means triggering massive discomfort and challenging the status quo. It's sad to think that 50 years later so much of the social justice movement put into place by Kennedy -- and his valuing of the arts and humanities -- have been eroded rather than intensified. A pillage economy has replaced a push for shared equality, gun rights over civil rights, plush retirement accounts for some rather than economic justice for all. It is very difficult to be young right now. When I look at the future imagined in 1963 and the one we live with at the moment, I wonder how we learned to be OK with its attenuation.

Except not everyone in that "we" is OK with constrictions of possibility. I place great hope in the student protests at Missouri, Georgetown, Yale, and elsewhere, which are getting things right. The generations that preceded them convinced themselves to perpetuate an economy, a justice system, a world of arts and humanities and science that gets so many things wrong -- that passes along a small future, modes of thriving reserved for a privileged few, instead of opening doors wider, increasing access, imagining the world otherwise.

This nation values money rather than people, increasing what some few have rather than increasing access to that having.

How did anyone come to believe that a valid assessment for any particular course of college study (say, in English or History versus Computer Science) is how much money those holding a credential earn in the course of a career? According to this metric, embraced by both US parties, the more money you earn, the more effective your degree is judged to be. It's that blunt. Screw such appraisal and the assumptions that come with it. Any philosophy major can tell you that the correlation between lifetime salary and lifetime happiness is not one-to-one. Beyond a certain minimum threshold that enables a person to obtain the food, shelter and comforts of a modest life [a minimum that ALL people merit], more money does not yield more satisfaction with one's life. Nor is a life with tremendous cash flow and a well-stocked retirement portfolio a life better lived. I hope more universities and colleges pressure those who would "run a university like a tech company" (as Timothy M. Wolfe disastrously promised to do at the University of Missouri) to realize that they need to sign up for a remedial course of education in the liberal arts, where they might learn a little better what institutions of higher education at their best accomplish.

What if a reply to terror were art? What if an answer to violence were acts of imagining a world otherwise? What if in response to hatred we refused the invitation to hate?

Here's an old piece I wrote about Paris and complicated histories of race and violence. I was thinking about that blog post recently because it's included as an excursus in my most recent book and I was asked to comment on it during a graduate seminar at SMU where the students had read Stone. I declined. Oddly I cannot talk about the piece without tearing up, even after all this time, and I just did not want to have to wipe at my eyes during that conversation. The violence in Paris happened the next day and I sat with the same people glued to a TV set in Dallas watching images in horror and welling up.

I believe that the Zombie Apocalypse arrived long ago, in the form of a small number of well educated, mainly white and affluent Americans who consumed as much as they could grab and gave futurity little thought. They negated or reversed so much of what seemed possible in 1963 and 1968, so much of that vision of a more just future. Their gated communities (a version of Noah's ark) now exclude those who are not thriving. They have seized control of many universities (among so many other things). They do not ask: how can we expand access? How can we open doors in welcome, rather than construct such massive walls? They did not follow the path that Jeremy and Bonnie, social activists both, insisted was the groundwork of more capacious and more equitable community. They turned away from the utopian visions many of them had embraced when they were younger.

Utopia does not vanish when it is no longer by one generation dreamt. Young people -- including my students, current and former -- are much smarter than some of these ark-dwellers suppose. Today's young will not long remain complacent or compliant. I believe they will refuse the legacies of racism, misogyny, rampant militarism, economic disparity, environmental injustice and homophobia being handed them as if natural or inevitable or the way things simply are. 

I revere the visions of a more just, more equal nation that are a rightful inheritance, a world for which so many have for so long labored.  I have great faith in the future a new generation of young activists will make. 


Friday, November 13, 2015

DISPATCH from the Front Lines of BABEL: MLA Subconference, Steering Comm. Elections & More!

Dear BABEL-ers and Friends of BABEL,

This is a quick message from your friendly neighborhood BABEL Steering Committee about three important items. (If the phrase “BABEL Steering Committee” isn’t ringing any bells, check out our self-introduction from about a year ago, available here).

FIRST, we want again to thank everyone who helped make October’s BABEL conference at the University of Toronto such a delight. If you were there, we encourage you (please!) to take a few minutes to fill out our follow-up survey so that we can build an even more magnificent conference next time.

SECOND, we want to draw your attention to an event being supported by BABEL, punctum books, and Studium (a co-disciplinary space for the arts and humanities in east Austin, Texas). This event is the Third Annual MLA Subconference, which takes place just prior to the enormous Modern Language Association conference each January, and which seeks “to confront the (labor) crisis in the humanities head-on” and support organizing among “those impacted most: adjuncts, graduate students, university food and service workers, labor organizers, and activists working within communities affected by university-driven gentrification.” The Subconference is currently gathering donations to help defray travel expenses. As the organizers write:  
We'll [have] as many as 200 participants this January, but we need your help in getting the Subcon organizers and presenters (primarily grad students, non-grad adjuncts, and community organizers) there. 

We've received about $1000 in institutional contributions so far, but with roundtrip airfare running between $225 and $500 from most points in the U.S. and Canada -- and with seven main organizers, and as many as 24 presenters, all coming specifically for the Subcon -- we need your support. Help us expand the Subcon network and build power and autonomy among contingent workers and allies in higher ed. 

Your contribution will go toward:
  • Roundtrip fares for plane, train, and bus travel; 
  • Mileage reimbursement for those driving; 
  • A limited number of hotel rooms (we plan to provide home-stays for most participants); and/or 
  • Inter-Austin public transportation.
We encourage EVERYone to make donations as you are able, and you can do so here. At this present moment, when we witness the power of campus activism all around us, donating to the Subconference is a way to support those already fighting for a more just future for the academy. We’re proud to say that BABEL has made a substantial donation of $500 to the Subconference, using a portion of the funds left over from October’s conference. Go, Subconference, go!

THIRD and finally, the BABEL Steering Committee will soon be seeking nominations for NEW COMMITTEE MEMBERS. As four of our twelve members rotate off, four more will be elected by the corporate body of BABEL (i.e., you! and anyone who wants to be you! that's everyone!), from a pool of nominations (also generated by *you*). More information is on its way within the next month about the nomination and election process, but in the meantime, we want to encourage you to start imagining which BABEL-associated wunderkinds -- whether academics, non-academics, para-academics, artists, activists, grad students, adjuncts, lecturers, performers, editors, poets, peripatetics, or what-have-you -- might help steer the BABEL ship. More about that anon!

Otherwise, the BABEL Steering Committee hopes you are enjoying either an incandescent or else a most riotous November.

The BABEL Steering Committee

Monday, November 09, 2015

“Shakestime” (On Method)

by Julian Yates

[Julian presented this piece recently at the Folger. I've been collaborating with him on a forthcoming punctum book, Object Oriented Environs: you may read our introduction here. We're honored to share this wonderful meditation on method here at ITM -- JJC]

The following post reproduces the paper I gave in the penultimate session at the highly stimulating Fall Weekend Symposium devoted to “Periodization 2.0” (November 5-7, 2015) at the Folger Shakespeare Library’s Institute, graciously and expertly organized and hosted by Professor Kristen Poole of the University of Delaware and Owen Williams of the Folger Institute.

I arrived at the symposium with one paper, but the conversation over the two days provoked me to write a new one that reflected on my own training. This trip down one of memory’s lanes led me to articulate the underlying methodology of much of my work as a scholar thus far in my career. I am deeply grateful for the occasion, the papers by fellow speakers, the genial conversation, and the provocation.

I am grateful to ITM for the invitation to post the paper here—I hope that you find it interesting!

“Shakestime” (On Method)

I thought it might be useful at this point in our conversations if I prefaced our entrance to the peculiar temporal or spatial variety that shall have been Shakestime by trying to Polly parrot back to you some of the things I think I have heard over the last two days. I do not have time to do this responsibly, to name names and give credit where credit is due. I apologize in advance as I mischaracterize your contributions—they come back to you from someone who in 1993 or so (that’s the year 5753, of course, in another calendar, also coincidentally the year the Toronto Blue Jays won the World Series) embarked on a reading of the philosopher of science Michel Serres with the result that all writers became my contemporaries.[1] This didn’t mean very much because all the word “contemporary” meant was that, like you and me, these writers were or had at one time been finite beings; had to deal with time as they experienced it and as it was mediated for them by their object world or ecological milieu. Just like us, just like King Lear, they were not “weatherproof” and so were buffeted by the eventfulness of this world; subject to physical distress, joy, loss, affective or emotional turbulence. Just like us they created material and discursive shelters, external memory devices, objects (which include texts), some of which remain today in various states of disrepair; some of which survive at the expense of others or do so by the occluded and now invisible labor of a host of animal subjects, human and otherwise. Periods are shelters. They provide shelter against the irreversible time of physics.[2]

This contemporaneity—or shared exposure to time (le temps) and to the weather (le temps)—meant, so I learned, that in order not to do violence to these writers and their objects, to misconstrue them by marshaling them to the latest interpretive schema, I had to seek “to know them from the position of the known.”[3] I had to work inductively. I had to read without a meta-language; do without symptomatic reading strategies; and avoid synecdoche like the plague. Sociology, anthropology or any explanatory set of terms and paradigms knew no more and no less than the texts they set out to study. Marcel Mauss and Lévi-Strauss were very smart. But they knew no more than did Molière, who was also very smart. Theater was already a social laboratory that would become, courtesy of the Royal Society, a theater of scientific demonstration.[4] Writing was projective, experimental, and future-oriented. Different media interpenetrated and anticipated each other. Moses was a switchboard operator, one in a long line of telephonic intermediaries.

Periodization, to the extent that it was an issue, became a problem of grammar or syntax. Where do you like to end your sentences and begin others? How were you going to emplot texts you declared past so as to produce certain kinds of time effects that would do rhetorical work in what you took to be your present? To periodize was to calibrate, to become the writer of a new text or the producer of a new object, and so to refold the remains of the past to particular ends. To periodize was to delimit. It was violent and so was in some shape or form to be resisted. To declare, for example, that Periodization 2.0 marked the beginning or the end of a Renaissance at the Folger or was a Restoration of sorts was essentially to declare war on a competing narrative or settlement. This declaration enabled you to mark a break with an intervening middle time of benighted bathypelagic sensory deprivation (ages you made “dark”) under the sign of your rebirthed continuity with or reclamation of glories past re-clothed or re-embodied in the likes of, surprise-surprise, oh look it’s us.

Epistemic breaks were right out. They might be posited but only as heuristics or propositions and they were not particularly interesting because they were about reducing the complexity of the noise of things (past, present, and future). They were dangerous because they might all too easily turn into fixed points and reveal themselves to be on ramps to all too familiar intellectual superhighways.

Synchrony versus diachrony was a false choice because time was multiple, discontinuous, and not linear. It does not flow. It percolates.[5] Different configurations of matter, different objects (the First Folio, a ruff, the Holy Cross Guild chapel, a musical phrase or style) obey different chronologies. They are asynchronous but might also synch up. Things do change. But nothing disappeared. Secularism, for example, was merely a differently religious way of re-tying the knot of belief and belonging. The word religion was to be understood according to a strict Latinism that reminded you that re-ligare means to re-tie. Religion therefore designated all the various processes or routines by which we declare ourselves “fit to be tied,” the narratives and communities to which you claimed to belong, thereby authorizing your otherwise irrational, even unforgivable, decision to cut yourself off from other creatures.[6]

Language was about vectoring. Nouns and verbs were less important than prepositions (the pre-placing or positioning of things) or deictic markers. There was no better word or place to be than “between” or, if you like, in the middle.[7] The program, then, was to try to craft something on the order of a “general theory of relations” or poetics of translation or metaphor, a new kinematic aesthetic—how do ideas and things move; where do they go when they seem to vanish; where and how do they come back?[8] Nominalist or inductive forms of historical epistemology; tracing words and gestures; postures; signal tracking, traveling by the turns of a trope, by the force or blow to a figure that impresses itself, were the way to go, but they were not ends in themselves. They were merely translation tools, means of transport, that enabled you to learn how to refold texts that seemed to be separated by vast chronological distances so as to see their similarities, their proximal relations, proximity, and yes, on occasion, their isomorphism.

Time was an effect of space—best metaphorized as a giant hankie that could be unfolded to maximize the distance between two points or folded over and crumpled in order to make two points coincide.[9] Time did not exist without objects; and those objects served as translational relays, crossroads, anchoring points, convocations, which folded together the differently timed remains of persons, animals, plants, and all the various entities that make up our built worlds.[10] They might be palimpsests as Jonathan Gil Harris has argued, but the word overlay seemed more neutral. Objects accreted their uses, abuses, accidents, and decay.[11] Certain objects seemed especially stable—the Eucharist, money, the cell form of the commodity, Shakespeare (sort of), which is to say they spliced together matter, signs, and flesh in ways that could direct or route traffic. All that traffic (repetition as difference) kept them stable.[12]

Agency became annoyingly easy to talk about. Objects or “quasi-objects” didn’t have it; but neither did “quasi-subjects.” Instead they both participated in its production. Subject and object were grammatical positions first and foremost and could be occupied, from moment to moment, by different entities, human and otherwise. This was all best explained by watching rugby, but baseball would work too. For to any unbiased observer the ball was obviously the true subject, giving agency to whoever had it or made it do what was required.[13] The whole subject / object problem was best left well alone and handled by speaking of ties, ligatures, and the way our worlds stack animal and vegetable labor to produce forms of life that confuse all these categories strategically. The only thing you could ultimately say about people is that they were parasites. And, maybe with a lot of hard work we might be able to achieve something on the order of a mutually sustaining relation to our world. Perhaps the parasitic relation that kills might be stabilized or transformed into a symbiosis. But, let’s face it, the Holocene (entirely recent time) if not yet the Anthropocene, hadn’t gone very well so far.[14]

As you can imagine, as a graduate student back in 1993, this all came as a bit of a shock. New historicism (already flagging) looked really strange. Its synchronic slicing up of things past and bewitching use of synecdoche let you know that it was a powerful mode of translation, a powerful topological operation. Best to steer well clear. So, as a first step, I decided to write a dissertation that, as I would be frequently reminded, had no real literature in it. Its title, “Cunning Conveyaunce: Space, Narrative and Material Culture in Renaissance England,” let you know that I was very modestly just out to track the peculiar lexical flexibility of the words “cunning conveyaunce” and their compeers “curious contrivance” in describing certain contemporaneous quasi-technological devices: portrait miniatures, relics, flush toilets, the printed page, and priest-holes (secret hiding places for books, massing stuff, or priests, a technology that essentially enabled houses to forget). The texts I chose derived from my signal tracking, which was performed, in part, by way of a chronological Short Title Catalogue search with a card catalogue, and then by reading within the disciplines that claimed expertise for the texts I tried to read. My “period” ran from 1570 (the founding of the Jesuit Mission to reconvert a reformed England) to 1606 (the aftermath of the Gunpowder Plot). Antiquarian labor of the early to mid-twentieth century or “fetish labor,” as I like to think of it (and positively so), proved crucial to this endeavor because the labor we have to do now to approach texts and objects always proves reciprocal to the labor they did back then to make and use whatever text or object you are out to understand.[15]

More peculiarly still, as I reckoned with where my reading of Serres and then the sociologist Bruno Latour had taken me, I discovered that I had not been turned into a philosopher or a historical sociologist but had been re-territorialized in questions of media, form, genre, and trope, understood as ways of trying to understand the messy business we call poiesis, making things and the status of the things that we have made and that make us. This was a happy outcome because deep down I was trained as a formalist, had always thought that formal analysis, close reading, narrating the scene of encounter with an object, and aesthetics understood as an account of perception were pretty darned inductive—a laboratory of sorts. It also meant that I could finally understand that the lesson of Derridean deconstruction was not a cautionary tale on the irresponsibility of a maximum entropy formalism but a radical empiricism that sought to stop the noumenal or heuristic positing of categories becoming realist by exposing it to the noise that it had sought to filter out but without which there would be no signal to track or to name. That noise was potentially always a set of signals from another differently timed object, echoes of excluded voices, forgotten or invisible labor, human and otherwise. Close reading, deconstructive reading, attempted to hold open the bounded period of the sentence, “the structure of the sentence to the saying,” so that it may be said differently.[16]

What is Shakestime? Who or what, for that matter, is Shakespeare and how is it that he may defy periodization? I have a couple of answers. You could, for example, describe Shakespeare as “a proliferating knot of times and places, a translational node or quasi-object.” “Shakespeare” is an assemblage or activity, a chain of making, whose performance produces an evolving collective of texts, readers, readings, persons, performances, and audiences.[17] More contentiously, if you are a bit fed up with that and want, in Dipesh Chakrabarty’s terms, to “provincialize” Shakespeare, you might be inclined to re-describe “Shakespeare” not as a now defunct historical person or a series of plays but as:
“a mobile, conflicting, conflicted, and partial time-bound set of practices. What happens if we proceed on the assumption that the academic designation ‘Shakespeare studies,’ as well as school curricula, professional Shakespeare theaters, the film industry, media libraries (on and offline) refer not to a series of agreed upon texts or performances but instead to a series of differently distributed fetish communities, each of which tunes itself to the shifting auratics of its chosen ritual objects as they are variously mediated—from manuscript to quarto to folio, on and off and back to the stage, the movie theater, and the home entertainment system—the ontology of the thing we study ‘Shakespeare’ [or Shakestime] waxing and waning, constantly picking up and dropping actants as it goes. The distribution of readers into different fields of study (performance, theater history, criticism, theater production, and so on) would constitute not a happy holism, but a series of discontinuous and only sometimes intersecting conversations or crowds that converge on variously mediatized forms of Shakespearean texts. The Shakespeare industry, so it turns out, would refer not merely to an elaborated infrastructure, but to the industry of so very many readers and purveyors, whose vital juices the Bard requires to keep on flowing. In this model, the labor of all such fetishists (myself included) stands in reciprocal relation to the past labors of reading, living, and dying that our work posits as ‘past.’”[18]
If you want to imagine something different, if you want to calibrate the past differently, to imagine other more capacious periodizing strategies, you might need to stop reading Shakespeare or to direct traffic to another set of texts and objects which would then anchor your sense of time.

Nothing I have said thus far about what “Shakespeare” or “Shakestime” are or might be should be confused with what it entails to open a reading of, or an encounter with, a play. Plays are projective. They wish to become something else: a performance, a reading, a new text. They are necessarily incomplete and so must be joined. By joining them we activate and perform their structures, turning space into place through our time-bound occupation of them.[19] That is, in a sense, the lesson of the plays as I read them, whose predicaments usually seem to revolve around characters not quite being when or where they thought they were and asking for help or failing to find any.

Coming last, as York tells the audience early on in Henry VI Part II, means that you get to reap the benefits of comprehending the situation; time is less important than timing.[20] But judging whether you are timely is really difficult—best not attempted but frequently unavoidable. Lady Macbeth ends up stuck in Act 2 scene 2 even though she’s in Act 5—in one of Shakespeare’s contributions to making king-killing seem unthinkable even as he still thinks about it.[21] Macbeth and Banquo register the disappearance of the witches in Act 1 scene 3 as a moment of sensory estrangement. Everything that happened, everything they heard and seemed to have been promised, has gone or, worse, never actually was at all. The futures they were offered (Macbeth’s life and reign, Banquo’s genealogical afterlife) never will have been. We watch as the two of them register this loss and reckon with the residue or remainder of their inflated sense of being. The lives and legends the witches suggested to them, and which they just now imagined, have become less than virtual. All that’s left, until Ross arrives and hails Macbeth “Thane of Cawdor” (1. 3. 103), as if he were some witchy speech bubble gone awry and only now making it back, is the aching abandonment become giggly abreaction that the two men share: “Your children shall be kings / You shall be king” (1. 3. 84). Perhaps it was all just the wind. “Have we eaten on the insane root?” (1. 3. 82).

In A Midsummer Night’s Dream, Bottom wakes up in Act 3 scene 1 even though he’s in Act 4, waiting for a cue that never comes or won’t till Act 5; he remembers more of what happened in Act 3 than he’s willing to say out loud—more than the lovers, that’s for sure, who are totally unable to explain how everything turned out alright.[22] Eager to periodize, to consummate their marriage, Theseus writes them all off along with poetry, dreams, and madness (5. 1. 2-23). But Hippolyta’s the better signal tracker (she had read her Ovid, apparently); she judges that “all the story of the night told over / And all their minds transfigured so together / More witnesseth than fancy’s images, / And grows to something of great constancy” (5. 1. 25-26). She tropes or trumps the play’s lexicon of translation to register the knot, the tying off, that she names a transfiguration. An end is coming—an end she registers in performance either, happily because consciously, with a wink; or, unhappily, strangely, without acknowledgement, at the moment she redacts Theseus’s frustration with the moon in the opening lines of the play in reference to the stunt moon, Moonshine: “I am aweary of this moon, would he would change” (5. 1. 237).

When am I? When are we? Is it in fact now? Does time progress or does the time of others catch us up and out? These questions seem to capture the flavor of “Shakestime,” unless you’re riding the kairos, immanent to the action, at one with the time, or in Iago’s words, “even now, now, very now.”[23] But that won’t last very long.

I am not sure what time it is. But 2016, or year 5777 of the Hebrew calendar, is coming. I do not know who shall win the World Series, but a predictive weather report might safely offer that things will remain changeable, with a chance of shakes-appearing.

Thank you.


[1] Michel Serres and Bruno Latour, Conversations on Science, Culture, and Time, trans. Roxanne Lapidus (Ann Arbor: University of Michigan Press, 1995), 44-45.

[2] Michel Serres, Hermes: Literature, Science, and Philosophy, trans. Josué Harari and David F. Bell (Baltimore: Johns Hopkins University Press, 1982), 115-116. “History,” adds Serres, “flows around physics” (116).

[3] This use of le temps is fundamental to Serres’s philosophy and a continual reference. But see Serres and Latour, Conversations, 58. On Serres’s strange form of empiricism, see Bruno Latour, “The Enlightenment without the Critique: A Word on Michel Serres’ Philosophy,” in Contemporary French Philosophy, ed. A. Phillips Griffiths (Cambridge: Cambridge University Press, 1987), 89.

[4] On anthropology and theater, see Serres, Hermes, 3-14.

[5] Ibid., 57-59.

[6] For this coding of religion see Serres’s work generally and in particular The Natural Contract, trans. Elizabeth MacArthur and William Paulson (Ann Arbor: University of Michigan Press, 1995), in which the word ligature is successively interrogated and “re-tied.” On the madness and violence of decision as cutting or the creation of an edge, see, in different registers, Serres, The Natural Contract, 55; and Jacques Derrida, The Gift of Death, trans. David Wills (Chicago: University of Chicago Press, 1992), 53-82.

[7] Serres and Latour, Conversations, 64. This engagement with prepositions as the way in which beings incline and attach to one another has been lifelong. For a book-length treatment of préposés (prepositions, pre-placed entities, employees, postmen) as figures of mediation in art and literature, see Michel Serres, Angels: A Modern Myth (Paris: Flammarion, 1995).

[8] Serres and Latour, Conversations, 66.

[9] Ibid., 59-62.

[10] Serres, Hermes, 115-116.

[11] Jonathan Gil Harris, Untimely Matter in the Time of Shakespeare (Philadelphia: University of Pennsylvania Press, 2009).

[12] For a dazzling contribution, inspired in part by the work of Michel Serres, to an analysis of this order of stability, see Michael Wintroub’s analysis of the “metrological work of trying to establish, maintain, and extend the faithfulness of translation--in domains as diverse as literature, politics, religion, and commerce.” Michael Wintroub, “Translations: Words, Things, Going Native, and Staying True,” American Historical Review 120, no. 4 (2015): 1185-1227.

[13] Michel Serres, The Parasite, trans. Lawrence R. Schehr (Minneapolis and London: University of Minnesota Press, [1982] 2007), 224-227.

[14] For Serres’s attempts to think beyond the neutrality of the parasitic chain with its excluded middles (“the third man”) towards successive figures of symbiosis along with what frequently sounds like despair at what he takes to be “appropriation through pollution,” writing as a form of excremental marking or re-marking, see, among others, Angels, The Natural Contract, and Malfeasance: Appropriation Through Pollution, trans. Anne-Marie Feenberg-Dibon (Stanford: Stanford University Press, 2011).

[15] This dissertation would provide the basis for Error, Misuse, Failure: Object Lessons from the English Renaissance (Minneapolis: University of Minnesota Press, 2003).

On the reciprocal nature of “fetish” or antiquarian labor, see “Shakespeare’s Kitchen Archives,” in Speculative Medievalisms: A Discography, ed. The Petropunk Collective (Brooklyn, NY: punctum books, 2013), 179-200.

[16] For this modeling of deconstruction, see Simon Critchley, The Ethics of Deconstruction: Derrida and Levinas (Edinburgh: Edinburgh University Press, 2014).

[17] “Accidental Shakespeare,” Shakespeare Studies 34 (2006): 90-91.

[18] Richard Burt / Julian Yates, What’s The Worst Thing You Can Do To Shakespeare? (New York and London: Palgrave Macmillan, 2013), 1-2. On provincializing as a strategy, see Dipesh Chakrabarty, Provincializing Europe: Postcolonial Thought and Historical Difference (Princeton: Princeton University Press, 2000).

[19] On play texts as projective, see “Shakespeare’s Kitchen Archives.”

[20] William Shakespeare, Henry VI, Part Two, ed. Roger Warren (Oxford: Oxford University Press, 2002), 3. 1. 381.

[21] William Shakespeare, Macbeth, ed. A. R. Braunmuller (Cambridge: Cambridge University Press, 1997), 5. 1.

[22] William Shakespeare, A Midsummer Night’s Dream, ed. R. A. Foakes (Cambridge: Cambridge University Press, 1984), 4. 1. 197. Subsequent references appear parenthetically in the text.

[23] William Shakespeare, Othello, ed. Norman Sanders (Cambridge: Cambridge University Press, 1995), 1. 1. 89.

Monday, November 02, 2015

Running with Mary; or Against the Agon

a guest post by Sharon O'Dair

[Sharon presented this piece at the recent BABEL conference in Toronto in a session on "The Sweaty Scholar." We're honored to share it here at ITM -- JJC]

Mary Decker Slaney, currently age 58, is the only athlete to have held every American record from 800 to 10,000 meters. To this day, she holds American women's records in the 1500 meters, mile, and 3000 meters. In 1973, Mary set her first world record. Some six years earlier, I ran with Mary on a club track team, the Long Beach Comets. I was 11 or 12, she was 9 or 10; she always beat me. She beat girls older than I—my friend Tina, for instance, who was 13 or 14 (and who, like me, subsequently completed a PhD in English). The coaches knew Mary would be a champion. She was 9 years old! They also knew I would not; one of my coaches—the handsome young one—told me my butt was big. I weighed 90 pounds. I don’t think I’d reached puberty. My partner says I have a thing about my butt. Wouldn’t you with that history?

But more important than the size of my butt is that I’m not slow; my best time in the mile was 5:25, which I ran when I was 12. Even now, I’m not slow. A couple of years ago, I ran a 5K race, my first in 50 years, almost. I don’t know what got into me, but I was in great shape. But I’m usually in pretty great shape, having run almost continuously at varying distances since running with Mary. I think I was influenced to run the race because earlier that year, I had decided to attend my high school reunion, the 40-year anniversary of our graduation. Crucially, though, the aptly named “Hurricane Run” was convenient, held on Dauphin Island, AL, a barrier island in the Gulf of Mexico where, until recently, I had a beach shack. The Hurricane Run wasn’t, I’m sure, peopled by scores of brilliant runners, but I did win my age group handily, by two and a half minutes (25:57/8:22). What is the lesson here? Though I didn’t know it 50 years ago, running with Mary taught me this: exceptional talent is rare. And obvious, obvious to those with the knowledge to judge such matters. In graduate school, looking at the professors in my department, I thought, “Stephen Greenblatt is Mary Decker Slaney.”

In the description of this panel, competition arises only once, in this question: “do physical activity, training, and competition provide merely a diversion (however salutary) from scholarly work, or are there ways in which they can also inform it?” I would suggest, however, that competition is the driving force of most colleagues’ athletic pursuits. We want to win, even if only our age group. We want to better our personal bests. I suggest this partly because I have watched dozens of junior colleagues become runners, often after having loathed sports for years. I watched them become obsessed with training and winning. Some can’t stop; a marathon isn’t good enough. Isn’t long enough. But mainly, I suggest this hypothesis because what I have had to “explain . . . to . . . colleagues,” and especially to fellow athletes, is not why I run but why I don’t compete and don’t run races (except for that one 5K). It’s as if colleagues cannot fathom just running five or six miles a day; often I’m told, “but you could easily run a marathon!” As if running a marathon should be an obviously desired goal! When pressed about reasons for not racing, which is most of the time, I say, with a wink, “Oh, I’m competitive enough professionally” or “I run so I can eat and drink and still look good” or “I don’t do mornings.” All of which are true, but as with so much, the truth is more complicated. I don’t run races because I don’t want to compete. Not even with myself. But for many colleagues the point of running is to train and to race and to win and so, I hypothesize, the way training and competition inform scholarly work is by reinforcing competitive norms in the profession. Scholarship is competitive, too, often brutally so. We want to win. To be right about King Lear. Or Beowulf. And we want others to be wrong.

Training and competition reinforce behaviors essential to competition in any arena. Isn’t that what we are told when we are young? The lessons learned in competition will carry you through life! And that is true: discipline, leadership, all of that. But competing in The Hurricane Run allowed me to recall why I came to resist racing and competition and how running without competing informs scholarship, too. To compete in the race, I trained for a week, maybe nine days, by which I mean I tried to see how fast I could run 3 miles. In doing so, I realized specifically and viscerally, bodily, what I’d long forgotten. To race and to win, I had to focus. My mind. On every step. On every step’s speed. Go, go, go, go, I said in my head to my legs. Don’t stop, don’t let up. When I lost focus, I lost speed. I lost speed when I let my mind wander—toward last night’s dinner, the birds overhead, pizza delivery at students’ houses, my current piece of writing, transitions! But letting my mind wander, especially about writing, is what running allows me to do. It’s thinking without thinking. Like those times, at night, when I wake up from sleep, knowing what I need to say next in an essay. This, for me, is one way running informs scholarship.

In a word, the problem with competition is that it makes me competitive. It makes me want to win. Deep down and not so deep, I wanted to win that 5K. And as Wendy Brown argues in her recent Undoing the Demos: Neoliberalism’s Stealth Revolution, that is a problem. Not the racing per se, but that today, neoliberalism has made winning the point in every nook and cranny of our society—“from mothering to mating, from learning to criminality, from planning one’s family to planning one’s death” (67). And Brown insists, “the premise and outcome of competition . . . [is] inequality” (64). Inequality. In this sense, one can see the truth in my joking reply about why I don’t race: “Oh, I’m competitive enough professionally.” I don’t need or want more competition in my life. But the truth is, I don’t want to compete professionally, either. And as Brown argues, and as we all know, academia has not been immune to neoliberal, competitive rearticulation, with unfortunate and even possibly disastrous consequences. Brown is especially scornful toward faculty who “gain recognition and reward . . . [to the extent that their] methods and topics are increasingly remote from the world and the undergraduate classroom. . . . [which makes] it difficult to establish the value of this work to students or a public” (195, 196). Paradoxically, this rearticulation has “weaken[ed] the capacity of liberal arts scholars to defend the liberal arts at the moment of their endangerment” (196).

Brown says her point is “not to castigate a rising generation of young scholars for participating in practices that index the degree to which all academic practices have been transformed by neoliberal economization” (196). When I read that sentence, I wrote in the margin, “Why not?” Why not castigate younger scholars? And ourselves, too, older ones? How else will we change the neoliberal economization, the competitiveness, of our work unless we refuse it? Play the game differently? And isn’t this what this conference and The Babel Working Group want to do? Play the game differently?

Some years ago, at a conference, a rather elite and exclusionary one, a colleague, older than I and very successful, who had been through the war in the 1980s between feminists and new historicists, said to me, while gesturing toward our colleagues sipping wine: “all that fighting then, the ambitious competing and jockeying for position, the pain of it all, and here we all are, all in the same boat, all pretty much the same.” I nodded, said something, and then realized that soon I would be able to say the same of my cohort. And now I can. All pretty much the same. But there’s more. For, as Janet Adelman put it in 1973, soon enough, everything we have written will become “historical curiosities,” wrong or benighted or quaint to a succeeding generation (1). In the 1980s, while she and others were fighting—and, as a woman, I am glad she did—I was sort of competing as a graduate student, but sort of not. In the 1980s, for example, I didn’t do the work of recovery in order to write a dissertation on women’s writing from the early modern period. For me, the job market was too iffy to spend five or six years in an archive with writing I didn’t want to read, even if Stephen Greenblatt were to direct and New Historicism dictated that doing so would get me a job. To this day, I write what I want and do not worry too much about how it is received. And so, this is how running, not racing, truly informs my scholarly work: to remind me that winning is not, should not be, my professional goal. And I do not need to compete to do the work I want to do. Mary Decker Slaney and Stephen Greenblatt (and the few others like them) beat the competition, won the race, so we don’t have to.

Works Cited
Adelman, Janet. The Common Liar: An Essay on Antony and Cleopatra. New Haven: Yale University Press, 1973.
Brown, Wendy. Undoing the Demos: Neoliberalism’s Stealth Revolution. Brooklyn, NY: Zone Books, 2015.

Wednesday, October 28, 2015

Monster Classroom (Seven Theses)

by J J Cohen

Below you will find the draft of the afterword I was asked to compose for a new book edited by Adam Golub and Heather Richardson Hayton, Monsters in the Classroom: Essays on Teaching the Monstrous (forthcoming, McFarland). The volume contains some brilliant essays (including a terrific one by medievalist Asa Simon Mittman), an excellent introduction by Golub and Hayton, and a crisp, useful foreword by W. Scott Poole. Contributors represent varied disciplines (literature, philosophy, religion) and several essays end with a syllabus for the course they reflect upon.

My piece meditates on the project of the book as well as the ways in which my own monstrous work has been the product of the classroom. Let me know what you think.

[Edit: related, a little piece GW Today just did on me and my monstrous classroom]

Monster Classroom (Seven Theses)

1. The Monster’s Body is a Pedagogical Body

Adam Golub and Heather Richardson Hayton invited me to compose an afterword to this rich, generous and timely book because of something I wrote more than twenty years ago, at a time when monsters were seldom discussed in college and high school classrooms – except, perhaps, as a foil to someone else’s heroism or an allegory for unthinkable vice. “Monster Culture (Seven Theses)” was first published in 1996. The essay introduced Monster Theory, a book that collects work by thirteen authors from a variety of disciplines, exploring what monsters reveal about the times and cultures they haunt: as demonstrations, as admonitions, as bodies feared and desired. The first collaborative project with the imprimatur of a well-respected university press to argue that monsters matter to critical theory and cultural studies, Monster Theory insists that its subject is more than a guilty pleasure or pop culture trifle. Its contributors examine intimate aliens without reducing them to psychological parables, or disembodied cultural metaphors. Among the creatures that populate the book are reanimated dinosaurs, sexy vampires, the undead of the Icelandic sagas, marauding Grendel, alluring hermaphrodites, conjoined twins, and demonized Muslims.

I wrote “Monster Culture (Seven Theses)” to provide an overview of the trans-historical, shared endeavor of Monster Theory. Yet the essay is also the product of a lively classroom. The Experimental College at Tufts University offers a welcoming home to weird courses that fit nowhere else. The program’s director was intrigued when I proposed a class on “Reading Monsters,” but wondered what students would gain from examining texts not for their resplendent prose or participation in a canon of masterpieces but for their ability to trigger anxiety, an archive of fear. At the interview I explained that western literature has been monstrous from the start (as a medievalist it comes easily to me to explain in boring detail why things that seem contemporary are actually quite ancient).[1] Gilgamesh, the Torah, Homer’s Odyssey, Vergil’s Aeneid, Beowulf, Shakespeare’s The Tempest, Shelley’s Frankenstein: these works offer a long chronicle of ghosts, dragons, cannibals, the misbegotten and the undead. I was hired – and allowed to design and teach my first monstrous course. Students enrolled simply because they were drawn to the topic. The class fulfilled no requirements, and was led by an unknown instructor (I had never taught at Tufts before). Thinking through the work of the monster across time with these young women and men was essential to composing “Monster Culture.” Born of pedagogical collaboration, the essay ruminates over the books and films we enjoyed together, from Arthurian myths and Interview with the Vampire to the Odyssey and The Werewolf of Paris. I owe the genesis of the essay to a program that had faith in nontraditional teaching at a time when academia was not all that hospitable to freaks, deviants, aliens, queers, and other unnatural things. “Monster Culture” is the record of a collective of learners eager to discover together where the errant tracks of the monster lead.

Now a familiar prod to high school and college writing assignments, the native domain of “Monster Culture (Seven Theses)” remains, twenty years on, the classroom.[2]

2. The Monster Finds Strange Welcome

Just before I taught that class at Tufts, I completed a doctoral thesis on the ubiquitous giants of medieval literature. I was astonished that no scholar had yet written about these fascinating stories, with their cannibalism, decapitations, licentiousness, violence, surprising comedy and (sometimes) amity and affection. These narratives insist that monsters are not limit cases, but separated from the familiar only by weak and traversable boundaries. What is the ghastly giant after all but the human body writ large? Completing my dissertation had been an exercise in solitude, though: it turns out that little had yet been published on giants because most academics found monsters unworthy of serious attention. I assembled Monster Theory in an attempt to convene a community of scholars willing to accept the monster’s invitation to knowledge. While the manuscript was under consideration at the University of Minnesota Press, I used “Monster Culture (Seven Theses)” as my job talk at the George Washington University. I framed monster theory as a way to teach cultural studies without the temporal segregations that structure most literature departments. The monster crosses history promiscuously, and makes a ruin of neat periodizations. Had I not been offered the GW job (and I was not at all certain I would be: it was my third and last year on the market, and a faculty member stormed out during the Q&A), then I likely would not have published the essay, nor made a career out of welcoming monsters into the classroom. My monstrous obsessions continue to guide the critical conversations that unfold when I teach – no matter if the text in question has been composed by Geoffrey Chaucer, Marie de France, Octavia Butler, or China Miéville.

Reading through the wonderful essays collected in Monsters in the Classroom has reminded me of how powerful a presence the monster asserts. These thoughtful pieces suggest numerous new ways of harnessing that power to provoke student engagement. Writing on what the grotesque enables in the classroom, Nancy Hightower describes the monster as “a moment rather than an object, as an action rather than thing,” a description that well conveys the figure’s activity. Monsters are also an invitation to consider the previously unthought (even if, as Bernice M. Murphy points out in her essay, the invitation to the unfamiliar is often concealed within a “Trojan Horse” of familiarity). The uncanny, the abnormal, the impossible and the queer are especially intriguing to young people at that pivotal point in their intellectual development when they are critically evaluating their past and all things seem possible. As Jessica Elbert Decker emphasizes, students are trying to figure out what normal consists of, what belongings and what differences they may desire. Pamela Bedore describes how her capstone seminar addresses feminism, gender equality, and queer sexuality through vampires. Kyle William Bishop vividly demonstrates what unfolds when we literally transport our students outside of comfortable and accustomed space, into a shared and experiential pedagogy. Phil Smith has his students zombie-walk. Heather Richardson Hayton goes further, asking her students to participate in a zombie apocalypse, a “transformational learning exercise” that spurs some to realize they are the very creature they fear. Through Japanese monsters Charlotte Eubanks encourages her students to realize the limits of their American imagination, while through demons and ghosts Joshua Paddison invites his classroom to grapple with their own religious beliefs.

Our students are often deciding how much family inheritance to carry forward, how much uncertainty and risk to embrace, how much affect (an essential component of learning to which these essays repeatedly return) to embrace and share. Brian Sweeney movingly details how the monster might be creatively deployed within the classroom to fight against the dreary forces of educational standardization, “fighting monsters with monsters.” To welcome the monster means treading strange byways rather than roads of predetermined destination.

3. The Monster Arrives in Crisis

Students depart our classrooms to difficult, attenuated futures. College graduates are frequently saddled with a decade of debt from loans that assisted in covering exorbitant tuition charges. They compete against friends for jobs that do not pay a sufficient wage. They discover that to the machinery propelling the US economy, young people are a resource for the extraction of labor at minimized cost. Though it might reward entrepreneurial drive and some forms of creativity, this system mandates compliance and cares little for the psychological and intellectual well-being of those who sustain it -- and even less for the animals it reduces to products. Natural resources and what had been wilderness become territory and consumable commodities (Adam Golub gets at this process well in this book, writing about the classroom use of literature and film to think about monsters and the space of the natural). People who have benefited from this economy seem to feel little impulse to ensure the thriving of those who follow, rendering the road behind them narrow, steep, and lonely. The Earth meanwhile is rapidly losing species diversity, green space, and drinkable water while gaining an overheated climate and catastrophe-limned future. Economic disparity and environmental injustice flourish together.

Prospects are especially bleak for those who have studied to earn an advanced humanities degree. The number of permanent teaching positions is small and continues to dwindle. The life of the mind has always been an unreliable way to make a living, but over the last decade an already constricted market has only worsened. Precarious employment is the new academic norm. When I wrote “Monster Culture (Seven Theses)” I was working as an adjunct instructor seeking a more stable job – preferably one that included health insurance and maybe sufficient salary to start repaying my student loans. Looking to supplement my income and attempt something new in the process, I applied to teach a night course on monsters at the Tufts Experimental College. I would like to think that experience assisted in my hiring by GW the following year, and acknowledge I was very fortunate to secure that position. Two of the early career contributors to Monster Theory never found permanent academic jobs, and eventually left the field. Two others lost positions that they had held for a while, the result of complicated tenure struggles. One suffered a breakdown. Those contributors already in established positions when Monster Theory was published have continued to have good careers, but like all of us within the university system have repeatedly had to fight against cost-cutting measures that disproportionately impact the humanities as well as administrators and trustee boards who imagine that postsecondary education ought to be run like a profit-oriented business (where profit is a numerical and easily assessable quantity rather than an intellectual and long-term gain).

Each year I write a great many letters of recommendation for would-be teachers seeking positions at colleges, universities, high schools. Not enough are offered the stable, well supported positions they deserve. Many at the front of the classroom do not have the resources, remuneration or security they merit. Academic life is precarious, limned by the monsters we dream as well as the monsters we dread. Pamela Bedore observes in her essay that the classroom monster can make the teacher “sometimes feel vulnerable.” We are vulnerable. So are our students. And not just emotionally or rhetorically.

4. The Monster Stands at the Door

I am composing this afterword a few days before Halloween 2015. This year has so far seen fifty-two shootings on school grounds and twenty-three on college campuses. Since the Sandy Hook Elementary School massacre of 2012, at least 149 shootings have occurred at schools. Campus security is among the “one button dial contacts” on my cell phone, just in case an incident should unfold at GW. We have been told that we should be able to make such a call while sheltering behind a desk or table. My daughter’s middle school practices the protocols for an active shooter (Code Blue) once per quarter. While I was writing this essay I received email, text and voicemail alerts from her school that her classes had spent the morning in lockdown due to two unspecified threats. Emergency procedures for most schools and universities mostly entail remaining quietly in a locked or barricaded room with the lights off, and hoping.

It is easy to think that the monster is an entity external to us rather than a creature of our own creation, the product of a contemporary love of destruction, explosion, violence, guns. All the firepower in the world will not stop the monsters we create, but fewer guns might at least mean more people act less monstrously. And more students remain alive.

5. The Monster Breaches the Borders of the Possible

Relieved to have Monster Theory placed under contract with a press, I did not think much about the future of the book or my essay. Two decades later, the volume sells well enough never to have gone out of print. “Monster Culture (Seven Theses)” has inspired conferences, museum exhibits, television shows (John Logan mentions the worlds the piece opened to him in his pitch to sell Penny Dreadful to Showtime). When Disney and Pixar were sued for copyright infringement after the release of Monsters, Inc., their legal counsel retained me as an expert witness on the history of monstrosity – mainly because a paralegal working for the firm had read the essay in a college writing class. Monster Theory and “Monster Culture” did not arrive out of nowhere. Many other scholars were working on monsters and abnormality at the same time (most visibly, Rosemarie Garland-Thomson, Jack Halberstam, and Jeffrey Weinstock). Monster Theory is only one participant within a collective scholarly endeavor that continues. W. Scott Poole makes this point eloquently in his excellent foreword to this book, tracing the long lineage of contemporary teratology.

Because “Monster Culture” is so historically promiscuous -- placing Dracula in the good company of revenants from Norse sagas, the creature from Alien and the dinosaurs of Jurassic Park alongside Polyphemos and the biblical Nephilim, the American West with the Delta Quadrant -- the essay has been repeatedly cited, contemplated, and taught, especially in first-year writing programs. The downside to its popularity is that the piece really is twenty years old and more accessible, more vibrant and more nuanced work is widely available. When one text is too often returned to as a source, the vitality of what came alongside and afterwards is easy to obscure. Much of this more recent work has refined, rethought, and at times rejected what is contained in “Monster Culture” – often through conversations scholars have staged in their own classrooms. Rick Godden gets at this collaborative nexus well in the rubric he composed for a composition course he teaches at Tulane that focuses on the monster: “Because of their inherent ambiguity, monsters encourage open-mindedness, productive questioning, careful scrutiny, and flexible research, all of which are the hallmarks of good scholarship.”[3] Because the field is lively – because the classroom is a dynamic habitat -- the monster endures.

6. Desire for the Monster is Desire for a Different World

Monsters in the Classroom demonstrates how these figures engage students and hone critical faculties. The pedagogical models this book offers are inspirational. Most invite students to think with the monster and thereby queer what passes itself off as natural, or to discover in the supernatural a realm that exists not at some impossible distance but intimately, alongside everything that appears ordinary. To romanticize monsters is dangerous, since these creatures frequently incarnate misogyny, homophobia, xenophobia, anti-Semitism, racism. But the monster also offers alternatives to the everyday world, realms in which solid reality dissolves into possibility. As I composed this afterword I asked fellow teachers on social media to contemplate what the monster offers their classroom. Most stressed the esprit de corps that monstrous study engenders, so that students make lasting friendships through difficult but shared pedagogical endeavor. Though the monster is often inimical to collectives, a reminder of who has been left outside at the closing of the door, imagining the future through the monster paradoxically builds community.

7. The Monster Stands at the Threshold of Belonging

“Monster Culture (Seven Theses)” has been used to teach composition, cultural studies, American studies, religion, literature, philosophy and critical theory. Sometimes the essay has been a shared text in required first-year writing classes (Columbia, Rutgers, Indiana University). Often the piece has found its way into a course on monsters. I have received as a result a thank you card signed by an entire class; the Seven Theses inscribed on cardboard tablets as if they were the Ten Commandments; and photographs of Halloween costumes inspired by the essay. Although I worry that the writing style is not a good example to emulate (when I read the essay now it seems to me rather stilted and precious), I am always pleased when a student who has been thinking with the essay writes to me on Twitter or Facebook or email to tell me what worlds they have opened for themselves through their classroom experience. Monsters in the Classroom is a record of how such lively and creative communities continue to be engendered through a variety of catalysts -- and this book will no doubt be a trigger to many more such collectives.

Twenty years after its publication, the essays collected in Monster Theory have been joined by so many monstrous books, essays, and websites that call out for classroom conversation that it is difficult not to agree with Adam Golub and Heather Richardson Hayton that we live in “a peak moment of monster pedagogy.” The Fellowship of Monsters is always looking for new members. Sometimes (as Asa Mittman so well describes in his essay) that invitation is a trap, a call to swim dangerous waters. Yet if thinking through what a monster means and does (as character in a narrative, as partner in the process of learning) invites a student to frame critically their own cultural moment, as well as perhaps the long histories behind that moment’s formation, then the monster classroom is in the end a good place in which to dwell.

[1] Not that this super power is reserved to medievalists: W. Scott Poole makes this point about the Western canon well in his introduction to this book.
[2] I want to thank Hope D. Swearingen for sending me a thorough description of how she uses “Monster Culture (Seven Theses)” in her AP English class at Bastrop High School. I am also deeply grateful to the following scholars, who shared reflections on using the essay in the classroom with me, providing valuable material for what I have written in this afterword: Bonnie Jett Adams, Tracy Adams, Shaun Bryan, Brantley Bryant, Sakina Bryant, Kristi Janelle Castleberry, Melissa Ridley Elmes, Fen Farceau, Ted Geier, Rick Godden, Ana Grinberg, Simon Grüning, Brian Hardison, Jenny Howe, Beth Belgau Human, Dan Kline, Sally Livingston, Kathleen Long, Roberta Magnani, Lauryn Mayer, Asa Simon Mittman, Adam Roberts, Emily Schmidt, Corey Sparks, James K. Stanescu, Liza Strakhov, Hope D. Swearingen, Arngrímur Vídalín, Amy Vines, David Wallace, Jeffrey Weinstock and Helen Young.
[3] I am thankful to Rick Godden for sending me a detailed description of the three monsters courses he teaches, and for conversations about monsters in the classroom.

Tuesday, October 20, 2015

Why the Humanities Prepare You for Most Any Future, Part the 1000th

by J J Cohen

Bachelor of Arts in Literature, McGill University (1994)

Bachelor of Education University of British Columbia (1998)

Teacher of Drama, French, English, Social Studies, and Math (1999-2002)

Prime Minister of Canada (2015)

Monday, October 19, 2015

Another world has always been possible: A Letter to Elysia Crampton @ DIS Magazine

by J J Cohen

ITM readers may be interested in this collaborative piece now up at DIS Magazine. I wrote it with Elysia Crampton while meditating on her latest album, American Drift (and lyrics from its songs drift throughout the letter I composed). You can read more about her work (including the importance of location, and being trans, and homes and belongings, and using sound as a fabric to create story) in an interview with Spin here and Fact here and Fader here. Elysia and I have been FB friends for a while, I think because we have Drew Daniel's friendship in common. She read my book and sent me a letter about doing something for DIS. I returned her note intercut with my own letter. She sent it back with her letter removed and a story told with images plus a soundtrack replacing her words. And thus the DIS piece. All of this was done by email: Elysia was moving from the Shenandoah to Santa Fe in the Yungas (a town her grandparents founded) and then to her farm in Rosario, Pacajes. I was in New Zealand, far from home.

I think this is my favorite line from what I wrote:
Another world has always been possible. Say fuck it and start.

But you'll see that is only one of many topics considered. Let me know what you think.

Wednesday, October 14, 2015

Premodern Disorder: A Graduate Student Conference @ GW (CFP)

Premodern Disorder

GW Medieval and Early Modern Studies Institute 
Graduate Student Symposium


Keynote Speakers:

Sharon Kinoshita, Professor of World Literature & Cultural Studies at UC Santa Cruz

Drew Daniel, Associate Professor of English at Johns Hopkins University

Premodern Disorder invites graduate students of medieval and early modern literatures to examine failures of taxonomies, outbreaks of disorder, and manifestations of the incomprehensible. Topics under investigation may include:
  •   Affect, emotion, and humoral theory
  •   Translation, globalization, and cultural contact
  •   Apocalypse and catastrophe; or premodern ecologies
  •   Taxonomies, animality, agentic objects
  •   Monstrosity and the body
  •   Economics, politics, and religion
  •   Waste and dirt; or cleanliness and the home
  •   Allegory and utopianism
Visit the conference website for more information, as well as a complete Call for Papers. 

Abstract submissions and Panel Proposals by October 30, 2015 (note extended deadline) to

Monday, October 12, 2015


by J J Cohen

A quick rant, because ... enough. The University of Maryland has been plagued with financial troubles that have worsened the lives of its educators, staff and students. And yet its expenditures on football, in the past and into the future, remain insane.

Steven Salzberg writes at Forbes that the University of Maryland will "pay $4.7 million to buy out the current coach, Randy Edsall, hire a new football coach and pay him at least as much as the old coach," and meanwhile "continue to impose unpaid furloughs and pay freezes on academic staff across the board." UMD President Wallace Loh, what are you thinking? In what crazy world are sports coaches worth almost $5M in university funds while those who teach and run the place day to day see significant decreases to their quality of life? Aren't the students who attend the university and the teachers and researchers who are at the supposed heart of its mission the ones worth that money?
Calling upon universities to privatize or divest from sports, Salzberg also writes this, and I could not agree more:

Listen, sports fans: football is not the reason we have universities. Universities exist to provide education, no matter what the (sometimes rabid) football boosters may say. Some American universities do extremely well without having a team at all. Outside the U.S., universities have no major sports programs at all–the students enjoy sports, as all young people do, but the universities focus on what they do best ... The spectacle of U. Maryland spending $4.7 million simply to buy out its current football coach, when the university is desperately trying to save money for its core mission, demonstrates how corrupting the influence of football has become.
Professional sports disguised as college activities do not belong on college campuses. Knock down the stadiums. We need more classrooms, presses, libraries, museums, coffee houses, green space, scriptoria, gardens, ruins to contemplate, and other public declarations that the life of the mind is more important than a bunch of guys chasing an odd-shaped ball while a crowd gets drunk and eats nachos. 


Friday, October 09, 2015

On Being a Professor Who’s More Teacher than Scholar

a guest post by Arthur Bahr

To me, two of the greatest things about BABEL are its members’ willingness to be radically open about what they most deeply believe and care about, and their ability to inspire others to be comparably open and honest. That sense of appreciation was renewed as I followed the How We Write posts several months ago—now brought to wonderful fruition!—and it inspired me to ask if I could share the following piece on this blog. 

A few words about what it is and how it came about. Lorna Gibson, a friend of mine in Materials Science and Engineering, asked me this summer if I would give a brief presentation this fall on how and why I became a professor, for a series she was organizing with Chancellor Cynthia Barnhart. The goal of the series is twofold: to demystify academia as a profession, especially for those who might be disinclined to think of it as “for them,” and to humanize MIT professors more generally. The first goal is important since fewer and fewer of our undergrads, whatever their background, are going on to grad school, and even those who do increasingly prefer industry to academia. The second goal is important since surveys suggest that a disturbingly large percentage of MIT undergrads graduate without knowing at least three professors well enough to feel comfortable asking them for a letter of recommendation. (My suspicion is that this problem afflicts many R1s, not just MIT, but that may be due to prejudices that I touch on more fully in the talk below!)

Since perfectionism, imposter syndrome, and fear of failure are real scourges among MIT students, and since I am lucky enough to have tenure and thus the ability to say what I please in official venues, I decided to be a bit radical in my talk, and come out as a teacher who happens to do scholarly work, rather than a scholar per se. And since figure skating has been on my mind a good deal, as some readers of this blog know, I decided to make that my metaphor. (Do watch the YouTube clips within the talk; they’re worth it!)

On Being a Professor Who’s More Teacher than Scholar

Today I’m going to talk about teaching, and figure skating, and scholarship, and imposter syndrome, because together they explain why I wanted to become a professor, why I almost left academia at various points (including after getting my job here at MIT), and why I’m really happy I didn’t.

I was a pretty serious figure skater as a kid, and when I started skating, in the late 80s, the sport had two very different components, only one of which most of you would recognize today: the jumping and spinning and music and costumes and all that. The other component consisted of using your blade to first draw and then retrace extremely precise patterns, or figures, onto a blank sheet of ice. This is how figure skating got its name: you literally skated figures. As part of an effort to demystify this rather strange-sounding practice, ABC aired this clip of Scott Hamilton skating a paragraph bracket at the 1984 Winter Olympics, and thanks to the magic of YouTube, here it is:

4:45-6:30 of

I was six years old when I saw that, and I was totally captivated. It was so weird and cool, this idea of trying to create those perfectly formed shapes on the ice. But as soon as I started skating seriously, I discovered that figures required a truly insane amount of work. Free skating came easy to me: I landed the first Axel I ever tried, and I got my double Lutz within a couple of years of starting to skate, but with figures, my progress felt absolutely glacial. (See what I did there?) Instead of “I worked really hard at this jump and now I can rotate three times in the air instead of two,” it felt like, “I worked even harder at making my figures perfectly round and now instead of really ugly, bulgy circles I make very slightly less ugly, very slightly less bulgy circles.” This is not a very satisfying form of accomplishment—especially if you’re, say, twelve. As a result, I launched even more enthusiastically into my free skating, where I could express myself and play to the crowd (even if the crowd in question was just my mom drinking her thermos of coffee at 5 in the morning, God bless her).

So what has this got to do with my other topics? Well, I started teaching around the same time, when I was twelve and my little sister was six. That seemed the right moment for her to learn fractions, so I asked for a miniature blackboard for Christmas, and I set up a school for our legions of stuffed animals, and she helped them do their quizzes, and I graded them and I had office hours, and this is what I did for fun, mind you. And ever since, teaching has reminded me of free skating, since they’re both a kind of performance that tries to draw the audience in and make everyone in the room (or ice rink) feel like part of a shared experience. When you’re on, you’re both totally focused on what you’re doing in the moment and somehow going with a flow that’s bigger than you are and that you don’t totally understand. But of course you’re not always on: sometimes your grand pedagogical gambit is a flop, sometimes you fall on all five of your jumps, and sometimes it’s not a disaster, it all just feels sort of flat for no obvious reason. And those failures both sting and make you want to get back out there and nail it the next time.

Now I’m not breaking new ground here: any musician or athlete or actor will instantly recognize both kinds of feelings I just described, the “everything is clicking, this is awesome” and the “gah, that was awful … brood … I need to get out there and nail it next time” experiences. I go into this similarity between free skating and teaching because the other side of my job—of any professor’s job—is research, and research has always felt to me like compulsory figures. While they were sometimes rewarding, figures were not fun; they were work, grinding, repetitive labor. Any progress I made at them was nearly invisible to me and felt totally invisible to everyone else. And here it’s worth emphasizing how solitary, even lonely, research in the humanities can be. Unlike in engineering or the sciences, we generally don’t have labs or research groups: it’s just you, your books, and your thoughts, which is almost exactly how the famous figure skating coach Slavka Kohut described doing figures: “You’re alone with your thoughts, and you’re competing against yourself. It’s how you place your thoughts with your feet, and how the two work together.” This kind of isolated, introverted work is just not what I have ever felt like I’m best at. But it’s not even really a question of what you’re good at; it’s more about what’s exciting to you, what gets you up in the morning, how you self-identify. Sort of like are you a cat person or a dog person: are you a figures skater or a free skater? I’m a free skater; I’m a teacher.

And this is where we get to imposter syndrome. Whatever your academic discipline, teaching is a tiny, sometimes even nonexistent part of the training you get in a PhD program—and that right there tells you all you need to know about academia’s priorities. In my department at UC Berkeley, teaching was presented as the chore you had to do to get your ridiculously paltry stipend if you weren’t good enough at research (i.e., “smart enough”) to get a fellowship, and because of that, and gazillions of other cues given by your professors and fellow grad students, it was obvious that research is where it’s at. Which means that if you’re more inspired by teaching—if the reason you first wanted to become a professor is so that you could share your passion for the material with students—you quickly realize that you’re a cat person in a dog person’s world.

Now, it should surprise none of you to learn that when MIT hires professors, and when it considers them for raises and promotion and the all-important tenure, it cares above all about research. In this it is like every other R1 (which is professor shorthand for “first-tier research university”), and R1 jobs are the jobs that, as soon as you start grad school, you’re trained to believe that you should want, partly because by definition people get PhD’s at research universities, whose professors tend to absorb their institutions’ priorities and incentive structures into their own value systems.

So when I went on the job market and saw an assistant professorship in medieval literature at MIT, I didn’t for a moment think I would get the job. I didn’t even want the job. I wanted to end up at a small liberal arts college like the one I went to, so I could teach intimate seminars of smart, earnest English majors and have them over for dinner at the end of the semester to meet my boyfriend and my cats. I had it all planned out, and the only reason I even applied is that the academic job market in the humanities is so awful that you just apply for all the jobs and worry about happiness later, if at all. (Of course the notion that happiness is a second- or even third-order concern also says something pretty damning about academic culture. We can talk about that later if you want.) So when I actually did get the job, I was super excited but also sort of confused. It was as if I’d won the figures event at a competition—which I can assure you I never did—and I felt like saying to my soon-to-be colleagues, “I know you’re the judges and all, but how carefully did you actually look at my figures? Can I free skate for you? That’s what I’m actually good at.” And they were like, “No, we don’t actually need to see your free skating [my flyback had no teaching demonstration or interaction with students] since figures is what we really care about. In fact, as soon as you sign this offer letter, you’re going to be competing under a system in which figures are worth 95% of your score and free skating is worth 5%, or maybe it’s 90/10, or 98/2, I mean of course there isn’t a mathematical formula, we evaluate everyone on a holistic and case-by-case basis, etc. … BUT DON’T FORGET, figures are what really matter. Oh, also, if your figures aren’t good enough, after seven years we’ll fire you and you’ll probably never skate again.”

The reality of MIT’s tenure system, which I just translated into figure skating terms, didn’t sink in until I arrived here. And that’s when I really panicked, since I was sure that even though I’d somehow fooled them in my interview and my job talk, there was no way I could pretend to be a dog person for seven years. Sooner or later, and probably sooner, one of the judges was going to look at my tracings and get this tone of incredulous scorn in her voice as she said, “How did you even manage to pass your first figure test?” (That is a quote from a judge who failed me on a later, harder test.) And so I did what I did as a skater—as a kid: I threw myself into the stuff I already knew I was good at and kind of ignored my research for the first year and a half or two years I was here. (I do not advise doing this, by the way.) I looked for other jobs. I even talked to my senior faculty mentor about whether I could somehow convert my professorship to a lecturer position, which would have meant a huge cut in pay, prestige, and job security. (He talked me out of it, for which I will always be grateful to Noel Jackson.)

Underlying all this was an almost crippling fear of failure. I’d reached the seductive conclusion that if I didn’t really try, I couldn’t really fail. If I focused almost exclusively on my teaching, which I really enjoyed, and didn’t fully commit to the research that I knew MIT considered incalculably more important, then when I got fired I could say to myself and my family and my friends and my advisor and all the people I was sure to disappoint, “You know, it just wasn’t a good fit; I’m a cat person, they’re dog people; I’ll be so much happier at the University of Northeastern South Dakota” or whatever backwater I got consigned to. (For the record, my dad is from South Dakota, so I can make South Dakota jokes.) Imposter syndrome and fear of failure are both real scourges among students at MIT, so I want to go on the record and say that lots of your professors, like me, struggle with them, too.

Please note that I say “struggle,” in the present tense. These are not fights that you ever conclusively win. Grappling with imposter syndrome and fear of failure and perfectionism (yet another scourge) is more like learning to live with addiction: the shadows are always there, and you just have to develop coping mechanisms to make it through, partly by being open and honest about those struggles. So this talk doesn’t come with an inspirational conclusion about how I ultimately conquered my fears and became the consummate dog person I never realized I could be. I would still rather design a cool new class than write a scholarly article. I will always be happier and more alive in a classroom than I ever will be alone with my books. That’s who I am. Learning to say, “I am not a scholar, per se; I’m a teacher who happens to do scholarly work,” was incredibly liberating, not just because it was true, but also because it helped me separate the research part of my job from my sense of self. Scholarship was just that, a job, a job that I could learn to enjoy and be good at, but not some kind of calling or vocation, and that was fine. For other friends and colleagues whom I deeply respect, it’s the other way around, and that’s also fine. Teachers aren’t inherently nobler, any more than scholars are inherently smarter. All this helped mitigate the miasma of fear that surrounded my all-important “scholarly production,” and that was crucial, because in scholarship, just like in teaching (and figures, and free skating), fear cripples performance. Scholarship that’s afraid to fail, to be found out, to be seen to be stupid, is almost always boring, and I’d infinitely rather read an essay that I passionately disagree with or even hate than one that puts me to sleep.

As soon as I stopped feeling guilty about not passionately loving research, it was easier to believe that it did actually have value—and here I’m referring not to mine in particular, but rather to humanistic scholarship more generally. Guilty secret number whatever: it has always been really hard for me to feel like my scholarship matters, like, at all. I wrote a book about Chaucer and medieval manuscripts. Some people thought it was great and at least one very cranky reviewer thought it was awful, but fortunately enough people thought it was good enough that I got tenure. But even if everyone thought it was the most brilliant book about medieval manuscripts EVER, it still wouldn’t feel as significant to me as teaching a class in which a few of my students arrived at new ways of looking at the world. Like many clichés, the one about teachers changing lives is a cliché because it’s true, and it's hard for me to imagine anyone's life being changed by reading my book or, quite frankly, ANYONE’S book about medieval literature. Here again, comparison with compulsory figures is instructive. Lots of skaters hated figures not just because they found them boring and lonely, but also because they felt useless. How on earth did etching perfect circles help us connect with an audience, communicate to others the joy of movement, or music, the way free skating did? Lots of brilliant figures skaters were terrible free skaters, just like lots of brilliant scholars are terrible teachers.

Partly because no one could articulate a compelling answer to these kinds of questions, compulsory figures were eliminated from international competition in 1991—with the result that, today, most of you have never heard of them. This initially made adolescent skating Arthur very happy, but the joy faded surprisingly quickly. Free skating somehow felt less fun, and ironically less free, when it was no longer the activity that I had to earn by diligently practicing my figures for an hour. As soon as I wasn’t doing them any more, I appreciated how complementary these two components of the sport really were. Not only did figures instill a truly ferocious degree of mental discipline and focus, they also forced you to be aware of every single part of your body, since they were so precise, the margin for error so tiny, that a single muscle out of place literally anywhere could mess up your turn at the top of the circle, or your change of edge, or any of the gazillion other things that were there to be messed up. In this sense, figures were a bit like basic (as opposed to applied) research in the sciences: intense labor with no immediately obvious practical value, but with the potential to pay off in unanticipated ways down the road. (Case in point: I started rock climbing and bouldering last year, and when I was struggling, my best friend who’s a hard-core climber said, “Just remember, every part of your body matters, not just your arms,” and I was like, “oh, it’s figures,” and then I thought about each problem on the climbing wall like it was a figure, and it made a big difference!)

But let’s even concede a skeptic’s contention that skating compulsory figures, or deciphering the handwriting of medieval manuscripts, isn’t useful. They are nonetheless impressive, specialized skills. Our word technology comes from the Greek word techne, which means skill, art, or craft—your ability to make or do something as opposed to simply comprehend or contemplate it in the abstract. So etymologically, technology is our shared set of crafts, skills, and capacities. When one of those skills is forgotten, rendered obsolete by a more recent form of technology (or a changing economy), something of our shared heritage is lost. And this is why the demise of figures makes me so sad. Done well, both the activity and the results are very beautiful, as this short clip of Debi Thomas doing a paragraph loop at the 1988 Olympics shows:

1:03-1:47 of

But pretty soon, the coaches who knew how to teach figures will start dying off, and then this craft will no longer be part of our shared human technology. Now I’m not seriously proposing that we all go back to thatching the roofs of our kerosene-lit cottages, or that MIT become the Massachusetts Institute for the Preservation of Historically Significant and Inherently Beautiful but Obsolete Skill-Sets (although that would make us Miph-Si-Boss, which given our love of inscrutable acronyms seems quite appropriate). But I do think that it behooves all academics—and everyone contemplating an academic career, as I hope some of you are—to adopt a more capacious understanding of what’s worth learning and doing.

One of my colleagues in medieval studies calls the modern university the “brainforest” [shoutout to LeVostreGC!] and I think it’s a valuable pun because like the Amazon, the university is a complex ecosystem. Some parts are less obviously attractive or useful than others, but the depth, the richness, and even the utility of the whole are diminished when any particular constituent piece atrophies or is clear-cut out of existence. This is not just an exhortation to remember medieval literature when you’re an engineering professor—or rich alum—on some committee whose purse-strings control the fate of my future colleagues. It’s also a reminder to me, when I feel like my research doesn’t matter, that it actually does. Not in the same way as my teaching, but as part of an infinitely complicated universe of skills and modes of thinking and doing things with your brain. I wish that the skating community had appreciated figures enough to save them, but their loss inspires me to do my damnedest not to let the same thing happen to the practice of interpreting old poems that don’t get read that much any more. Most of the time I don’t love doing research—but I feel incredibly grateful that I have to. Most of the time I do love teaching—and I feel incredibly lucky that I get to. And that tension keeps both activities livelier and more vital than either would be by itself.


The conversation after the talk was quite wide-ranging. People had a lot of questions about the nitty-gritty of figure skating, so we spent a good deal of time on that, but a number of people also said they had experiences similar to mine, of feeling like a bad academic or even a bad person for valuing teaching over research. One said it’s why he left academia for biotech after getting his PhD in biology from Cornell. Another, a grad student in Mechanical Engineering at MIT, said that when she asked her advisor how she might get a more-teaching-than-research-oriented job, he said bluntly, “Those don’t exist.” And even at less research-intensive institutions than MIT, “scholarly production” (which is generally equated with “published peer-reviewed writing,” although it shouldn’t be) is often far more important than teaching for tenure and promotion. (Of course, how to fairly measure teaching effectiveness is a vexed question, but it’s not clear to me that measures of scholarly excellence are any less problematic; both are subjective, manipulable, inflected by unconscious bias, etc.)

As for what I’d like people to take away from all this: I’d encourage everyone working with graduate students to make sure those students know that there are lots of different ways of being a professor; that the version of “professor” their advisors chose to (or had to) embody in order to succeed at an R1 is not the only possible or admirable one. I suspect that most of us would agree with that proposition if pressed, but such a belief—in the legitimate diversity of values and goals within academia—often fails to make it through to grad students. Making sure we all convey that belief will help keep our profession attractive to the widest possible range of smart, committed people—and, in turn, give our profession a broad and diverse set of advocates for the challenging times we face now, and seem sure to face in the future.