Tuesday, November 30, 2010

A profound and disquieting book

by J J Cohen

Miriamne Ara Krummel's Crafting Jewishness in Medieval England is soon to be published in the New Middle Ages series. Some advance praise:
“Working at the intersection of medieval, postcolonial, and Judaic studies, Krummel has created a wide-ranging and interdisciplinary work of scholarship that re-evaluates how anti-Judaism works. Stressing the absence of the Jews even when physically present (transformed into a figure, a symbol, a deicidal monster), the book traces Jewish representation by Christians and (powerfully) by Jews living amongst Christians. Even when expelled from the island in 1290, the Jews continued to haunt, so integral were they to medieval Christian identities. Crafting Jewishness in Medieval England is a carefully researched, disquieting, vigorously argued, and profound book."--Jeffrey J. Cohen, Professor of English and Director of the Medieval and Early Modern Studies Institute, George Washington University
"In Crafting Jewishness in Medieval England, Krummel explores the varied and arresting medieval pre-histories of modern racism and antisemitism. Krummel's lively, eloquent and wide-ranging exploration of the Jews' 'virtual presence' in medieval English culture asks timely and challenging questions about religious violence and group identity. Krummel's book not only helps us to think about the vanished English Jews, expelled in 1290, but also suggests provocative and innovative ways of recovering and interpreting the haunting, multiple presences the Jews continued to play in the English, Christian imagination."--Anthony Bale, Reader in Medieval Studies, Birkbeck College, University of London
Full details here. Congratulations, Miriamne!

Monday, November 29, 2010

Guest Post: Michael O'Rourke, "After"

by MICHAEL O'ROURKE

For Jeffrey Jerome Cohen

Finir, Commencer / To End, To Begin
There is no end without beginning. How could the end be known as end if it weren’t recounted by someone? –Jean-François Lyotard, Soundproof Room[1]

My talk is organized by a series of watchwords: words to watch out for and words which watch over me.

Worlds

Michael Hardt and Antonio Negri, in their most recent book Commonwealth[2], caution against a newly arisen apocalyptic tone in contemporary politico-philosophical thought, a tone which finds its fullest expression in Slavoj Žižek’s latest gloomy opus, Living in the End Times[3]. We are, if we are to believe what we read, and I am taking some randomly chosen examples as indicative of this trend, after the subject (Rei Terada)[4], the human (discourses of the posthuman[5]), sex (Andrew Parker and Janet Halley’s special number of South Atlantic Quarterly[6]), God (Altizer[7]) and the death of God (John Caputo and Gianni Vattimo[8]), finitude (Quentin Meillassoux[9]), theory (Terry Eagleton[10]), poststructuralism (Colin Davis[11]), the humanities, and so on (even After Derrida, the title of so many essays and special journal issues since his death in 2004 [including a special issue of Mosaic][12]).

Žižek informs us that the underlying premise of his latest book is “a simple one”: “the global capitalist system is approaching an apocalyptic zero-point. Its ‘four riders of the apocalypse’ are comprised by the ecological crisis, the consequences of the biogenetic revolution, imbalances within the system itself (problems with intellectual property; forthcoming struggles over raw materials, food and water), and the explosive growth of social divisions and exclusions”[13]. To counter this preoccupation with, even fetishization of, ends, and the language of catastrophization (embodied in so many newisms, postisms, and other small seismisms, as Derrida would say[14]), this paper traces the preposition “after” (but also “before”) across a range of Derrida’s texts. If for Derrida, in Of Grammatology, the end of the book heralds an opening for the book to-come, then this paper seeks out a recalibrated futurity for the humanities which recognizes that its future will always have been its end, which, more affirmatively put, is to say that its future will have been always to begin its ending again[15]. I argue that we can find a certain dignity in what we are doing if we maintain absolute fidelity to the incalculable and unreckonable event of the university to-come, the university without condition.

Žižek’s Living in the End Times itself struggles against the prevailing gloominess of its own pronouncements to try to imagine a world beyond catastrophe: “the present book”, he writes, “is thus a book of struggle, following Paul’s surprisingly relevant definition: ‘For our struggle is not against flesh and blood, but against leaders, against authorities, against the world rulers [kosmokratoras] of this darkness, against the spiritual wickedness in the heavens’ (Ephesians 6:12). Or, translated into today’s language: ‘Our struggle is not against actual corrupt individuals, but against those in power in general, against their authority, against the global order and the ideological mystifications which sustain it’. To engage in this struggle means to endorse Badiou’s formula mieux vaut un désastre qu’un désêtre: better to take the risk and engage in fidelity to a Truth-event, even if it ends in catastrophe, than to vegetate in the eventless utilitarian-hedonist survival of what Nietzsche called ‘the last men’”[16].

Quentin Meillassoux, in After Finitude, almost at its end, describes a “glacial world”: “a world in which there is no longer any up or down, centre or periphery, nor anything else that might make it a world designed for humans. For the first time, the world manifests itself as capable of subsisting without any of those aspects that constitute its concreteness for us”[17]. Žižek’s zero-point apocalypticism and Meillassoux’s quasi-radical-atheist[18] glaciality are matched in the apocalyptic tone of Dominic Fox’s Cold World: The Aesthetics of Dejection and the Politics of Militant Dysphoria, where he describes for us a world so cold that it is “voided of both human warmth and physical comfort. The cold world is the world made strange, a world that ceased to be the ‘life-world’ in which we are usually immersed and instead stands before us in a kind of lop-sided objectivity”[19]. But Fox doesn’t simply immerse himself in this world that has “ceased to be the life-world”. He avers that what ecological disaster scenarios—we might think of the recent BP oil spill here—reveal is an “ontological mishap, a disorder in the real”[20]. Because of this ontological rupture, Fox can argue “that ‘another world’ is not only possible but inevitable, since this world cannot go on as it is (and, indeed, has in a sense already ended, inasmuch as its condition has already been diagnosed as terminal). To put it another way: not only is another world possible, but the present world is impossible”[21]. Derrida, in one of his last texts, shared a phantasmo-oneiric dream of his, an experience of the impossible, which reveals the childlike wonder of deconstruction: “To dream, as [Ignacio] Ramonet says, that ‘another world is possible’, but to give ourselves the strength to do all that would make it actually possible. Billions of men and women in the world share this dream. Through slow and painful labor they will give birth to it some day”[22].

And if, for Fox, the disfigured world ceases to be the lifeworld, then, in both Derrida and Badiou, we find portals into other logics of world: “the grace of living for an idea” in Badiou and “learning to live finally” in Derrida. Badiou, again at the very end of a book, Logics of Worlds, explains that, for him, “the infinite of worlds is what saves us from every finite dis-grace. Finitude, the constant harping on of our mortal being, in brief, the fear of death as the only passion—these are the bitter ingredients of democratic materialism. We overcome all this when we seize hold of the discontinuous variety of worlds and the interlacing of objects under the constantly variable regimes of their appearances. We are open to the infinity of worlds. To live is possible. Therefore, to (re)commence to live is the only thing that matters”[23]. Badiou calls this “living for an idea” and, more recently, in The Communist Hypothesis, he demonstrates how this generic procedure dehisces the apocalyptico-catastrophic-disaster rhetoric of the global financial crisis: “a radical rupture with capitalo-parliamentarianism, a politics invented at the grassroots level of the popular real, and the sovereignty of the idea: it is all there, and it will distract us from the disaster movie and remind us of our uprising”[24].

Learning to Live

In his “Exordium” to Specters of Marx, Derrida first talks about “learning to live”, this strange watchword, which keeps guard over so much that he wrote about living on, survival, inheritance, the right to philosophy, and so on (right up to the haunting last interview which has been given the title, in English, of “Learning to Live Finally”[25]). “Who would learn? From whom? To teach to live, but to whom? Will we ever know how to live and first of all what ‘to learn to live’ means? And why ‘finally’? By itself, out of context—but a context, always, remains open, thus fallible and insufficient—this watchword forms an almost unintelligible syntagm. Just how far can its idiom be translated moreover?” Derrida asks[26]. This paper ventures that learning to live ought to be thought alongside other “unintelligible syntagms” in Derrida such as “democracy to come”, “university to come”, “university without condition”, and I will add “dignity to come”. While “democracy to come”, “university without condition” and “university to come” are familiar terms in the Derridean lexicon, dignity is not, or at least not yet. Dignity might be, then, what Rodolphe Gasché has called in The Tain of the Mirror a quasi-transcendental, taking its place alongside différance, écriture, undecidable, aporia, and so on[27]. It does not merit an entry in Simon Morgan Wortham’s recently published The Derrida Dictionary[28], but you could say that it appears apparitionally there under the headings for gift, hospitality, decision, justice and so on. We can find it, as J. Hillis Miller does words like ‘reste’ (‘remainder’) and ‘deuil’ (‘mourning’), in the virtual concordance of Derrida’s writings, “as though we were searching for it [dignity] in an electronic database of the sort that does not yet exist”[29]. Dignity, I want to say, and it might be a stretch, is a term which we can read metonymically in Derrida’s writing, a term which is faintly traceable across his remarkably consistent corpus, taking on many names, “altered” by “repetition” under his signature[30]. Dignity as a new Derridean watchword, then, one which I will return to again, at the end.

After

Derrida’s “essay in the night”[31] on the “magisterial locution” (xvii) “I would like to learn to live finally” advances into the “unknown of that which must remain to come” (xviii). It is neither before nor after the end. Rather, learning to live “remains to be done, it can only happen between life and death” (xviii). “Neither in life nor in death alone” (xviii). This spectrality occupies the space of a kind of half-life, which is “neither substance, nor essence, nor existence, is never present as such” (xviii). Given this non-contemporaneity of the living present with itself, inhabiting as it does the time out of joint, for Derrida, the question “where?” “where tomorrow?” “whither” always arrives, “if it arrives, [and] it questions with regard to what will come in the future to-come. Turned toward the future, going toward it, it also comes from it, it proceeds from [provient de] the future” (xix). This disjoins and disproportions the logic of before and after because “if this question, from the moment it comes to us, can clearly come only from the future (whither? Where will we go tomorrow? Where, for example, is Marxism [or medieval studies, or the humanities] going? Where are we going with it?), what stands in front of it must also precede it like its origin: before it. Even if the future is its provenance, it must be, like any provenance, absolutely and irreversibly past” (xix). Geoffrey Bennington starts off his essay “RIP” in Futures: of Jacques Derrida by reminding us that:
Philosophy is a discourse that knows all about the future, or at least about its future. It knows, and has always known, that it has no future. Philosophy knows that the future is death. Philosophy is always going to die. Always has been going to die. Always will have been going to die. From the beginning, its future will have been its end: and from this end, its future will have been always to begin its ending again. Philosophy happens in this archaeo-teleo-necrological solidarity. The end of philosophy is the end of philosophy[32].
Again, this is a kind of “half-death” or “half-life”, a necro-teleology which “prevents philosophy from finding itself in its telos, which would be its finis, its death”[33]. Teleology, read as that which provides the opening for the monstrous arrivance, that which arrives but without definable telos, is “just what holds philosophy short of its end and thus gives it the possibility of a future in spite of itself ... the half-death of the philosopher would then never be completed, but would be the half-life of philosophy, which half-lives as a repeated call to reading thrown toward a future in which that reading will never be finished”[34]. Derrida’s futures [like the futures of medieval studies, of the humanities] are both “absolutely predictable and absolutely uncertain”[35]. Those futures will, let me hope here and now, always be marked, in reading, by a gratitude for opening just that futurity to reading, always elsewhere, in the past, in the before.

Before

Before, another critical watchword, is what precedes in time, with respect to a given reference point, some “now” point, be it current or not. For Lyotard and Derrida, before makes sense only in correlation with the after, the “turnstile”, as Geoffrey Bennington has it, “of before and after pivoting on the peg of the now”[36]. He goes on: “Behind us, already past, so that we are in front of it, before it; we are before before, which has gone before us, got ahead of us, inciting us to run after it, catch it up, make up our delay, our belatedness”[37]. Before, gone before us, is still before us, yet to come. The humanities, medieval studies, the university, are still before us, here and now, still to come. Before “calls me, always ahead of me perhaps but always behind me, more or less secretly, quietly, even shiftily, and the call that comes from behind me, back of me, pushes me forward, surprised”[38]. And “Nothing can happen except from behind”, as Lyotard warns us in The Confession of Augustine[39]. The originary delay that means that we always run after before also means not only that we are always behind it, but that before is also behind us; the Other never presents itself face on, but will rather take us from behind. This dorsal perspective, as David Wills calls it[40], is precisely the “logic of the queer” that Glenn Burger and Steven Kruger describe in Queering the Middle Ages. The logic of the “preposterous” is precisely what effects a disturbance of temporal sequentiality, of before and after, or pre- and post-. “Queer theorizing, in its ‘preposterous’ revision of temporal sequence has important implications for how we think about history” because “traditional historicism is anything but preposterous; instead, it insists on straight chronologies that privilege a value-based movement of supersession and progress—classical antiquity, Dark Ages, Middle Ages, modernity; pre-, modern, and post-. The preposterous thinking of queer theory might usefully interrupt such teleological sequences”[41].

As Derrida argued in The Post Card, since Plato and Aristotle it has been understood that “Socrates comes before [avant] Plato”. It is “an order of generations, an irreversible sequence of inheritance”[42]. But when Derrida sees a postcard of a drawing in the Bodleian Library depicting Plato standing behind Socrates, it suggests that “Socrates is before, not in front of, but before Plato, therefore behind him”[43]. What comes before suggests at once what is behind (before) and what is in front (before). “What precedes us”, Derrida writes in Specters of Marx, is “as much in front of us as before us”[44]. Before and in front, avant et devant. As Sean Gaston has demonstrated, “one of the ways in which Derrida responded to the dilemma that to challenge or resist Western metaphysics one must somehow start again and recognize that it is always too late to start again was through the movement of what he calls a palintrope”[45]. In Greek, palin means to move back, to go backwards, and also to do something again, to do something once more. Gaston explains that “The word is perhaps best known today as a palindrome, a word or phrase that reads the same backwards and forwards. A palindrome starts and ends the same way. But a palintrope has a slightly different rhetorical flourish, it starts differently, with a start, it startles itself as it starts again. It startles itself, and as Derrida says, loses the logos”[46]. Rather than moving backwards and forwards through the same word, or over the same ground, it suggests a turning backward that happens more than once, a turning backwards that—always already—repeats, splits, doubles and exceeds itself (and catastrophe, we might note, means a turning downwards). The palintrope displaces the origin and the arche.

In “Cogito and the History of Madness”, where Derrida’s work turns initially to the palintrope, he evokes the hysteron proteron, a sixteenth-century rhetorical figure, to describe an argument or phrase that has erroneously put what should come last (hysteros, come late, be behind) in front, before or first (proteron). For Derrida, “if there is a historicity of reason in general, the history of reason cannot be a history of its origin (which already demands the historicity of reason in general), but must be that of one of its determined figures”[47]. The palintrope confuses the proper order of the before and after. As Nicholas Royle has shown, Shakespeare was preoccupied with anachronism, and the uncanniness of anachronism is at the heart of Derrida’s reading of Shakespeare[48]. As Hamlet suggests, it is all a question of looking before and after (and the university to-come is, as Derrida notes, in the eyes of its pupils): “Looking before and after, gave us not / that capability and god-like reason / to fust in us unused” (Hamlet, Act 4, Scene 4). The Norton Shakespeare editors explain that looking before and after can be read as “able to see past and future”. But looking is not quite the same as seeing (and Derrida enjoins us to close our eyes to hear better and learn better)[49]. Also, as we have seen, for Derrida what is before can always suggest both what is behind (in the past) and what is in front (in the future). The phrase “it is before us” can always be anachronistic, untimely. I call this indetermination of before and after the Hamlet Effect[50]. The future of the past is somehow still to come. Eileen Joy, after Cary Howie, calls this “traherence”, “the way in which every ‘Now’ is thus, simultaneously, a ‘not yet’ and a ‘then’”[51].

Philosophy, too, inhabits a countertime, a time out of joint, and starts by preceding itself. Derrida tells us that “in its own proper position, philosophy presupposes. It precedes and replaces itself in its own proper thesis. It comes before itself and substitutes for itself”[52]. The precedent, like the palintrope, in a queer logic, cannot avoid being at once behind and ahead, before and in front of its proper position, of the position of the proper. It is, Derrida says, a “pro-position” that always relies on and is replaced by a “pros-thesis”[53]. He ends “Khora” by observing: “In order to think khora it is necessary to go back to a beginning that is older than the beginning”[54]. And, some years later, in “Provocation: Forewords”, he goes on to say that “what remains to be thought: the very thing that resists thought. It resists in advance, it gets out ahead. The rest gets there ahead of thought; it remains in advance of what is called thought. For we do not know what thought is. We do not know what the word means before or outside of this resistance. It can only be determined from, in the wake of, what resists and remains thus to be thought. Thought remains to be thought”[55].

Counter

In his book Counter-Institutions: Jacques Derrida and the Question of the University, Simon Morgan Wortham shows how Derrida was always concerned with the institutional question of the contre and traces Derrida’s vexed relationship to the academic institution and his active involvement both in arguing for the right to the teaching of philosophy against the interference of the state (something we are currently witnessing in the UK) and, simultaneously, in arguing against the institutionalization of philosophy. We also know that Derrida passionately advocated for what Morgan Wortham calls a counter-institution: the International College of Philosophy in France[56]. The counter-institution occupies a strange position and temporality: it is neither inside nor outside the institution proper. The counter-institution tilts philosophy, the humanities, the university, toward another heading, an incalculable future. We might recall here the opening words of “The Ends of Man”:
Every philosophical colloquium necessarily has a political significance. And not only due to that which has always linked the essence of the philosophical to the essence of the political. Essential and general, this political import nevertheless burdens the a priori link between philosophy and politics, aggravates it in a way[57].
Wortham points out that for Derrida the “counter-institution” must always be “counter-institutional”, must always meddle with the institution/al. At the same time, Wortham argues that the contre, a movement with and against the institution, always resists programming, calculation, reckoning, while simultaneously demanding programming, calculation, reckoning. For Wortham, “to think the counter-institution according to the irreducible trait of the ‘other’ is to continually shift the force of the ‘counter-institutional’ so that it works, again, at an angle with or to the institution. Never the same as an institution, the counter-institution is made—in an ‘interminable process’—precisely at this angle”[58]. How does one teach, how does one profess, today? Wortham’s book asks this question (and it should remind us of the “Exordium” to Specters of Marx). This is a question that, as with learning to live, unsettles the onto-theo-logical certainties of before/after and presence/absence; learning to live and learning to teach occupy a counter-temporality which always already disarticulates the before and after. The “counter begins by coming back, promising to answer the call along the lines of a certain artifactuality or actuvirtuality, or in other words by way of complex effects of spectrality, virtuality, the as if, and the tele-effect”[59]. The question of how one teaches today begins with this artifactuality or actuvirtuality, the counter that begins by “coming back”. The past which we inherit is always open to its own future possibilities (that is to say, we inherit the opening of the past’s future possibilities, its actuvirtualities). The trace of the past, that which “comes back” and remains to come in the unanticipatable future (Derrida calls this revenance), is at once before and in front of us, behind and ahead. We try to catch it and make up our delay, our belatedness. And Wortham says of this untimely belatedness: “Rather like ‘the political’, the counter-institution—the counter itself—seems to get delayed or held up in deconstruction, drawing its particular and peculiar force from a constitutive hesitation. But one which is similar to philosophy’s interminability or irreducibility—its ongoingness—in relation to decision or action (brought about by deconstruction). Never fully present in deconstruction’s own register, then, the ‘counter’ might nevertheless be taken to name or sign (improperly) for deconstruction, however surprizing this may seem”[60]. And, as he points out, for Derrida, the countering of what comes back is always a question of the here and now.

In Specters of Marx, Derrida puts this clearly: “differance ... does not mean only (as some people have too often believed and so naively) deferral, lateness, delay, postponement. In the incoercible différance the here-now unfurls. Without lateness, without delay, but without presence, it is the precipitation of an absolute singularity, singular because differing, precisely, and always other, binding itself necessarily to the form of the instant, in imminence and in urgency”[61]. This untimeliness, belatedness, is precisely what opens us up to that which is held in reserve, remains still to come, to arrive, as promise: “the counter hides itself, but it must (belatedly?) come, with the force of a promise, or of the promissory structure of the to come ... in taking a position of any kind, therefore, the counter-institution must always negotiate with the demand to fulfill itself into action—this demand is, of course, unavoidable, just as the taking of positions will always be a necessary contingency—but it must do so by way of negotiating with that which calls or founds it, that which affirms it, which must be linked, here, to the disjunctive temporality of the “counter”. Such a negotiation is at once always already urgent and yet utterly interminable”[62].

Institution

That urgent and imminent call which founds lies somewhere in between life and death, the actual and the virtual, before and after, as an ethical and political imperative. In the “University without Condition”, Derrida argues that the university should be a place of absolute, unconditional resistance where everything can be said, and that this opens it up to a kind of responsibility that is not the same as that of other institutions: as an institution, Derrida claims, the university must subject the institution in general, the very institutionality of institutions, to a kind of questioning, calculating, or reckoning. “The University (and, more especially, says Derrida, the ‘Humanities’), have a responsibility to foster events of thought that cannot fail to unsettle the University in its Idea of itself. For this to happen, the special institution that the University is must open itself up to the possibility of unpredictable events (events ‘worthy of the name’ as Derrida often says, being by definition absolutely unpredictable) in a way that always might seem to threaten the very institution that it is. On this account, the University is in principle the institution that ‘lives’ the precarious chance and ruin of the institution as its very institutionality”[63]. In his later writing, Derrida used the language of autoimmunity to describe this precariousness of institutions as the very chance for their living on. As Morgan Wortham argues, the counter hides within the institution not only to threaten it with its ruin but also to open it up to its very possibility, its “chance of being, as it were, alive, in the sense that life entails an openness (a ‘hospitality’, perhaps, to use another late-Derridean concept) to alterity and event, which is also an openness to the possibility of instant death and destruction (for a life that did not involve this openness would not be worthy of the name ‘life’)”[64]. Without the autoimmunization of the institution, the counter that comes back, there can be no chance of an event worthy of the name at all. In Rogues, Derrida writes that:
If an event worthy of the name is to happen, it must, beyond all mastery, affect a passivity. It must touch a vulnerability that is exposed, without absolute immunity, without indemnity, in its finitude and in a non-horizontal fashion, where it is not yet or already no longer possible to face up to, to put up a front, to the unpredictability of the other. In this respect, auto-immunity is not an absolute evil. It allows for the exposure to the other, to what is coming and to who is coming—and must therefore remain incalculable. Without auto-immunity, with absolute immunity, nothing would ever happen again. One would no longer wait, expect, expect oneself and each other, or any event at all[65].
Autoimmunity

Autoimmunity, in the last ten years or so of Derrida’s writing, joined the long list of quasi-transcendental terms that Derrida introduced, beginning with trace, archi-writing, différance, dissemination, undecidable, aporia and so on. As Bennington says, this term “attempts, perhaps more clearly than some of the others, to capture a certain undecidability of life and death (including the ‘life’ and ‘death’ of institutions) but to do so [as Derrida puts it in H.C. for Life, That is to Say...] on the side of life”[66]. Deconstruction, if there is such a thing, is, as Derrida often says, essentially on the side of life and an affirmation of life, of learning how to live (and how to teach) finally. The work of deconstruction itself takes place “in the interests of a life that would be ‘worthy of the name’ [like the event], which is a life that involves death in itself as part of its affirmation”[67].

The life of the humanities, of medieval studies, of the university, affirms itself as life just by affirming its exposure to the absolutely unpredictable event, the chance of life, just as it always might end life at any instant. Only then would the life of the humanities have any future, a future that comes from no horizon of expectation. As Bennington reminds us, Derrida increasingly “related this thought to his call for an unconditionality without sovereignty”. Sovereignty is just the attempt at immunity that would be a kind of death through foreclosure of any possibility of event: to be after the end or in the post-catastrophe is to embrace this living death of politico-institutional paralysis. If we are after the end, then nothing can happen. This is what Derrida calls on occasion the worst, where nothing happens, where there can be no trace beyond its fatal eradication. On the other hand, the unconditionality of the university without condition, the à venir, involves absolute “exposure to the unexpected event as a condition of anything like life”[68]. This is the only chance of the institutions of medieval studies and the humanities, or of the university’s living on.

Dignity

Geoffrey Bennington has recently added “dignity” to the long list of quasi-transcendentals in Derrida’s writing, seeing it as a watchword for the last fifteen years of Derrida’s work, in the context of valuing the dignity of what we do[69]. Just as with the demi-deuil or half-mourning Derrida often spoke of, Bennington talks about a demi-dignity, or half-dignity, which would be unconditional, less than sovereign, an unconditional sovereign to-come. This unconditional dignity lines up with the out-of-jointness of time in the “Exordium” to Specters of Marx and is axiomatic for the very possibility, the very chance or life of deconstruction. By exposing itself—like trace or différance—to something outside itself, dignity is an undeconstructable. It mirrors the structural endlessness of deconstruction itself in that it can never be achieved, nor is it ever finished. Dignity is, Bennington tells us, an infinite task and an ongoing responsibility.

Derrida says in the “Exordium”: “No justice ... seems possible or thinkable without the principle of some responsibility, beyond all living present, within that which disjoins the living present, before the ghosts of those who are not yet born or who are already dead, be they victims of wars, political or other kinds of violence, nationalist, racist, colonialist, sexist, or other kinds of exterminations, victims of the oppressions of capitalist imperialism or any of the forms of totalitarianism”[70]. In the face of this depressing list, Derrida’s reparative gesture is to affirm the chance of life as he faces exposure to the unpredictable event: “Even if the future is its provenance, it must be, like any provenance, absolutely and irreversibly past. ‘Experience’ of the past as to come, the one and the other absolutely absolute, beyond all modification of any present whatever. If it is possible and if one must take it seriously, the possibility of the question, which is perhaps no longer a question and which we are calling here justice, must carry beyond present life, life as my life or our life”[71]. The university as a locus of dissensus (as Bill Readings and J. Hillis Miller have both argued[72]), the humanities without condition, a medieval studies without condition will certainly never be achieved (“it takes place, it seeks its place wherever this unconditionality can take shape”[73]), but we must affirm their possibility here and now, in this moment of fragile institutionality, today.

In the end, then, after all I have said, some professions of faith: there can be no deconstruction without dignity, no dignity without deconstruction. And there can be no humanities without dignity, and more importantly, no dignity without the humanities. So, “take your time but be quick about it, because you do not know what awaits you”[74].

-------------------
*This is a revised version of my keynote paper presented at the 1st Biennial Meeting of the BABEL Working Group “After the End: Medieval Studies, the Humanities, and the Post-Catastrophe” on Saturday 6 November 2010 at the University of Texas at Austin. I would like to thank Eileen Joy and Michael Johnson for inviting me and for putting together such an enlivening conference. You can read Eileen’s conference report here.


[1] Jean-François Lyotard, Soundproof Room: Malraux’s Anti-Aesthetics, trans. Robert Harvey (Stanford: Stanford University Press, 2001), 2. The French text which faces the English translation reads: “Il n’y a pas de fin sans commencement. Comment saurait-on que la fin était une fin si on ne le racontait pas?” (3).
[2] Michael Hardt and Antonio Negri, Commonwealth (Cambridge, MA: Harvard University Press, 2009).
[3] Slavoj Žižek, Living in the End Times (London: Verso, 2010).
[4] Rei Terada, Feeling in Theory: Emotion after the ‘Death of the Subject’ (Cambridge, MA: Harvard University Press, 2001).
[5] Cary Wolfe, What is Posthumanism? (Minneapolis: University of Minnesota Press, 2010).
[6] Andrew Parker and Janet Halley (eds) After Sex? On Writing Since Queer Theory, South Atlantic Quarterly 106.3 (2007).
[7] Thomas J.J. Altizer, The Gospel of Christian Atheism (Philadelphia: Westminster, 1966).
[8] John D. Caputo and Gianni Vattimo, After the Death of God, edited by Jeffrey W. Robbins (New York: Columbia University Press, 2007).
[9] Quentin Meillassoux, After Finitude: An Essay on the Necessity of Contingency, trans. Ray Brassier (London: Continuum, 2008).
[10] Terry Eagleton, After Theory (New York: Basic Books, 2004).
[11] Colin Davis, After Poststructuralism: Reading, Stories and Theory (London: Routledge, 2004).
[12] Nicholas Royle’s wonderful book After Derrida (Manchester: Manchester University Press, 1995) appeared, of course, before his death. The special issue of Mosaic (39.3) was published in 2006.
[13] Žižek, Living in the End Times, x.
[14] I am referring here to Derrida’s essay “Some Statements and Truisms about Neo-Logisms, Newisms, Postisms, Parasitisms, and Other Small Seismisms”, trans. Anne Tomiche, in David Carroll (ed) The States of ‘Theory’: History, Art and Critical Discourse (New York: Columbia University Press), 63-95.
[15] Jacques Derrida, Of Grammatology, trans. Gayatri Chakravorty Spivak (Baltimore: Johns Hopkins University Press, 1974). See especially the section on “The End of the Book and the Beginning of Writing”, 6-26. I also have “Ellipsis” from Writing and Difference (trans. Alan Bass, Chicago: University of Chicago Press, 1978, 294-300) and “The Book to Come” from Paper Machine (trans. Rachel Bowlby, Stanford, California: Stanford University Press, 2005, 2-18) in mind here.
[16] Žižek, Living in the End Times, xv.
[17] Meillassoux, After Finitude, 115.
[18] I say quasi radical atheist because Meillassoux does believe in a virtual God. See his essay “Spectral Dilemma”, trans. Robin Mackay, Collapse IV (May 2008): 261-275. Richard Kearney’s latest book Anatheism returns to God after God to disorder the logic of before and after. He explains that the ana “signals a movement of return to what I call a primordial wager, to an inaugural instant of reckoning at the root of belief. It marks a reopening of that space where we are free to choose between faith or nonfaith. As such, anatheism is about the option of retrieved belief. It operates before as well as after the division between theism and atheism, and it makes both possible”. Richard Kearney, Anatheism (New York: Columbia University Press), 7.
[19] Dominic Fox, Cold World: The Aesthetics of Dejection and the Politics of Militant Dysphoria (Winchester: Zer0 Books, 2009), 4.
[20] Fox, Cold World, 8.
[21] Fox, Cold World, 8. My thinking has been greatly helped by Nathan Brown’s short essay “Cold, Glacial, Generic” on Meillassoux, Fox, and Badiou which can be accessed here: http://users.design.ucla.edu/~zblas/speculativeaesthetics/brown_cold-glacial-generic.pdf. And also by Elie Ayache’s important book, The Blank Swan: The End of Probability (Chichester: John Wiley & Sons, 2010), especially 125-162.
[22] Jacques Derrida, “Enlightenment Past and to Come”, Le Monde Diplomatique (November 2004).
[23] Alain Badiou, Logics of Worlds: Being and Event II, trans. Alberto Toscano (London: Continuum, 2009), 514.
[24] Alain Badiou, The Communist Hypothesis, trans. David Macey and Steven Corcoran (London: Verso, 2010), 100.
[25] Jacques Derrida, Learning to Live Finally: The Last Interview (Basingstoke: Palgrave, 2007). In French the title reads “Je suis en guerre avec moi-même” / “I am at war with myself”.
[26] Jacques Derrida, Specters of Marx: The State of the Debt, The Work of Mourning, and the New International, trans. Peggy Kamuf (London: Routledge, 1994), xvii.
[27] Rodolphe Gasché, The Tain of the Mirror: Derrida and the Philosophy of Reflection (Cambridge, MA: Harvard University Press).
[28] Simon Morgan Wortham, The Derrida Dictionary (London: Continuum, 2010).
[29] J. Hillis Miller, For Derrida (New York: Fordham University Press, 2009), xvi.
[30] Jacques Derrida and Maurizio Ferraris, A Taste for the Secret, trans. Giacomo Donis, ed. Giacomo Donis and David Webb (Cambridge: Polity, 2001), 47.
[31] Derrida, Specters of Marx, xviii. All parenthetical references in this section are to the “Exordium”.
[32] Geoffrey Bennington, “RIP”, in Richard Rand (ed), Futures: of Jacques Derrida (Stanford, California: Stanford University Press, 2001), 1-17, at 1.
[33] Bennington, “RIP”, 17.
[34] Bennington, “RIP”, 17.
[35] Bennington, “RIP”, 17.
[36] Geoffrey Bennington, “Childish Things”, in Minima Memoria: In the Wake of Jean-François Lyotard, (eds) Claire Nouvet, Zrinka Stahuljak and Kent Still (Stanford, California: Stanford University Press, 2007), 197-217, at 197. See also Margret Grebowicz’s introduction, “After Lyotard”, to her edited collection Gender after Lyotard (New York: State University of New York Press, 2007), 1-9.
[37] Bennington, “Childish Things”, 198.
[38] Bennington, “Childish Things”, 210.
[39] Jean-François Lyotard, The Confession of Augustine, trans. Richard Beardsworth (Stanford, California: Stanford University Press, 2000), 43.
[40] David Wills, Dorsality: Thinking Back Through Technology and Politics (Minneapolis: University of Minnesota Press, 2008).
[41] Glenn Burger and Steven Kruger (eds), Queering the Middle Ages (Minneapolis: University of Minnesota Press, 2001), xii.
[42] Jacques Derrida, The Post Card: From Socrates to Freud and Beyond, trans. Alan Bass (Chicago: University of Chicago Press, 1987), 20. If I had the time I would track the numerous references to both “after” and the “catastrophic” in Derrida’s “Envois”.
[43] Derrida, The Post Card, 20.
[44] Derrida, Specters of Marx, 17.
[45] Sean Gaston, Starting with Derrida (London: Continuum, 2007), vii.
[46] Gaston, Starting with Derrida, vii-viii.
[47] Jacques Derrida, “Cogito and the History of Madness” in Writing and Difference, 43.
[48] Nicholas Royle, How to Read Shakespeare (London: Granta, 2005).
[49] Jacques Derrida, “The Principle of Reason: The University in the Eyes of its Pupils”, in Eyes of the University: Right to Philosophy 2, trans. Jan Plug et al (Stanford, California: Stanford University Press, 2004), 129-155.
[50] For a further reading of anachronicity and the before/after in Shakespeare’s The Winter’s Tale see my preface (co-authored with Noreen Giffney) to Queer Renaissance Historiography: Backward Gaze, (eds) Vin Nardizzi, Will Stockton and Stephen Guy-Bray (Farnham: Ashgate, 2009), ix-xi.
[51] Eileen Joy, “The Traherence of the Past”, postmedieval: a journal of medieval cultural studies 1.3 (Fall/Winter 2010): 291-298 at 293.
[52] Jacques Derrida, Glas, trans. John P. Leavey and Richard Rand (Lincoln: University of Nebraska Press, 1990), 95a.
[53] Derrida, Glas, 96a.
[54] Jacques Derrida, “Khora” in On the Name, trans. Ian McLeod (Stanford, California: Stanford University Press, 1995), 87-127, at 126.
[55] Jacques Derrida, “Provocation: Forewords” in Without Alibi, trans. Peggy Kamuf (Stanford, California: Stanford University Press, 2002) xv-xxxv, at xxxii-xxxiii.
[56] Simon Morgan Wortham, Counter-Institutions: Jacques Derrida and the Question of the University (New York: Fordham University Press, 2006).
[57] Jacques Derrida, “The Ends of Man” in Margins of Philosophy, trans. Alan Bass (Chicago: University of Chicago Press, 1990), 111.
[58] Morgan Wortham, Counter-Institutions, 18-19.
[59] Morgan Wortham, Counter-Institutions, 29.
[60] Morgan Wortham, Counter-Institutions, 32-33.
[61] Derrida, Specters of Marx, 31.
[62] Morgan Wortham, Counter-Institutions, 34.
[63] Geoffrey Bennington, “Foundations”, Textual Practice 21.2 (2007): 231-249, at 240-241.
[64] Bennington, “Foundations”, 241.
[65] Jacques Derrida, Rogues: Two Essays on Reason, trans. Pascale-Anne Brault and Michael Naas (Stanford, California: Stanford University Press, 2005), 152.
[66] Bennington, “Foundations”, 242. My parenthetical reference is to Derrida’s H.C. for life, that is to say..., trans. Laurent Milesi and Stefan Herbrechter (Stanford, California: Stanford University Press, 2006).
[67] Bennington, “Foundations”, 242.
[68] Bennington, “Foundations”, 243.
[69] Geoffrey Bennington, “Derrida’s Dignity”, keynote paper presented at the Second Derrida Today Conference, Goodenough College, London, July 21 2010.
[70] Derrida, Specters of Marx, xix.
[71] Derrida, Specters of Marx, xix-xx.
[72] Bill Readings, The University in Ruins (Cambridge, MA: Harvard University Press, 1996) and J. Hillis Miller, “The University of Dissensus” in Oxford Literary Review 17 (1995): 121-143.
[73] Jacques Derrida, “The University Without Condition” in Without Alibi, 202-237, at 236.
[74] Derrida, “The University Without Condition”, 237.

Saturday, November 27, 2010

BABEL in Boston (2012): A Letter from Kathleen Kelly

Figure 1. David Fried, Way of Words, no. 2: "Strip your psyche to the bare bones of spontaneous process, and you give yourself one chance in a thousand, to make the Pass."-- William S. Burroughs [from Fried's "Way of Words" motiongram prints series, made with long exposures of kinetic sculptures reacting to the spoken word]

Note from Eileen: for the BABEL Working Group's next biennial meeting, to be held in Boston in Fall 2012 [specific dates still to be determined], we would like as much collective and collaborative input as possible on the conference's themes and Call for Papers/Presentations/Performances. To that end, Kathleen Kelly [Northeastern Univ.] has written the open letter below.

by KATHLEEN KELLY

Dear ITM Readers:

After some brainstorming in the Austin airport with my early modernist colleague Marina Leslie, I drafted a description of a possible theme for the next BABEL meeting. Only after doing so did I see the description of the upcoming conference (“Animal, Vegetable, Mineral”) at George Washington sponsored by the Medieval and Early Modern Studies Institute—Jeffrey Cohen is the director of it, as I'm sure you know—as well as the description of the conference that the graduate students in my very own department are planning for the spring, “Raw Materials,” with Ann Stoler, The New School, as the keynote:
The quest for raw material continues to drive the exploration of both real and imaginary worlds. As scholars, it leads us to the archives, marketplaces, printers’ shops, cutting-room floors, and classrooms in which “materials” undergo processes of alteration, transformation, and manipulation—materials that could be understood as the productive elements of texts, subjects and selves, bodies, empires, and nations.

While “materiality” has held a rooted place in scholarship, we are particularly interested in examining the concept of the “raw” and “raw material.” The term itself embodies the tensions inherent in projects of creative, cultural, financial, and national enterprise. Raw material is, as Marx writes, “the fish which we catch and take from their element, water, timber which we fell in the virgin forest, and ores which we extract from their veins.” It is the matter of labor and empire, and evokes both images of creation and potential, as well as processes of destruction, exploitation, and misappropriation. We invite papers that may explore the dynamics of labor processes; that may consider the significance of raw material to creative, cultural, financial, and national projects; that may examine raw material as physical, tangible, and corporeal, as well as imaginative and ephemeral; and finally that may map the discursive processes through which the raw material of human experience is shaped, produced, exchanged and deployed.

200-word abstracts may be sent to neuegsa@gmail.com by December 20. Please include your name and university affiliation.
Apparently I have managed to be in touch with the zeitgeist and out of the loop at the same time . . . I have post-hoc precursors.

On the other hand, I remembered the work of Sherry Turkle at MIT, and what she’s been doing would fit in marvelously well with what you will read below: she is the editor of Evocative Objects: Things We Think With (“Turkle collects writings by scientists, humanists, artists, and designers that trace the power of everyday things. These essays reveal objects as emotional and intellectual companions that anchor memory, sustain relationships, and provoke new ideas”), Falling for Science: Objects in Mind (“‘This is a book about science, technology, and love,’ writes Sherry Turkle. . . . a love for science can start with a love for an object—a microscope, a modem, a mud pie, a pair of dice, a fishing rod. . . . distinguished scientists, engineers, and designers as well as twenty-five years of MIT students describe how objects encountered in childhood became part of the fabric of their scientific selves”), and The Inner History of Devices (“Memoir, clinical writings, and ethnography inform new perspectives on the experience of technology; personal stories illuminate how technology enters the inner life”)—blurbs courtesy of MIT Press. A local resource! Also, Eileen sent along this essay, "A Faustian Bargain," written by Gregory A Petsko, Professor of Biochemistry and Chemistry at Brandeis—another local resource!

So, I'm looking forward to hearing your feedback on the *draft* Call for Papers (appended below) and/or on other ideas you may have for 2012. Right now, I’d like something to shop around, especially to secure funding and one or two plenary speakers—whatever we come up with can be fleshed out later.

Best, Kathleen Kelly
Dept. of English
Northeastern University
k.kelly@neu.edu
**DRAFT**

CALL FOR PAPERS

2nd Biennial Meeting of the BABEL Working Group

Autumn 2012: Boston, MA

Objects of Study (or? Object Lessons)

Is there something perverse, if not archly insistent, about complicating things with theory? Do we really need anything like thing theory the way we need narrative theory or cultural theory, queer theory or discourse theory? Why not let things alone? Let them rest somewhere else—in the balmy elsewhere beyond theory. From there, they might offer us dry ground above those swirling accounts of the subject, some place of origin unmediated by the sign, some stable alternative to the instabilities and uncertainties, the ambiguities and anxieties, forever fetishized by theory. Something warm, then, that relieves us from the chill of dogged ideation, something concrete that relieves us from unnecessary abstraction. –Bill Brown, “Thing Theory”

The 2nd Biennial Meeting of the BABEL Working Group is interested in exploring possible collaborations between humanists and scientists. Emphasis is on the possible: not on rancorous back-and-forthing, not on turf-marking, not on disdainful ecologies of mistrust and hostility, but on what like-minded, even eros-minded, certainly playful/serious-minded folks in the humanities and the sciences might do together. We ask: What would collaboration between the humanities and the sciences, between this humanist and that scientist, look like? If you, yes, you, were to collaborate with someone in the sciences, what would you together (y’all, youse) devise as a project? If you, yes, you, were to collaborate with someone in the humanities, what would you together devise as a project? We anchor these questions and performances in the idea of the object—properties, relations, changes, substances, methods, data, things like object-ive and -ivity and -ion, which leads us to Latour’s distinction between object/thing, to thinginess, no-thinginess, and to the transformation of object to thing and back again.

We’ve designated four spaces within which to explore various objects of study, or, put another way, in which to locate object lessons: “Consumable Objects” (food, especially in the context of diet studies; clothing/textiles; the plastic and pictorial arts; weapons; whatever), “Ecological Objects,” “Social/Political Objects,” “Spiritual/Ethical Objects.” Participants in each session select their own objects—archive, body, cell, crux, book, encyclopedia, gene, map, matter/hyle, manuscript, memory, method, mind, museum, narrative, ontology, relic, saint, trope, word, whatever. (Note that some objects are more object-y than others: why and how?—and, given disciplinary differences, where?) Think about sessions as working groups, as demonstrations, speculations, hypotheses, and experiments that aspire to re-estrange the chosen object of study through a close reading or via any other technology either currently available or yet-to-be-imagined: distance studies, the new materialism, materialist history, a demography of things, a taxonomy of objects, science-technology studies, rhetorical readings of the embedded practices of scientists and humanists, thin and thick description—and let’s not forget the nominalists.

About six months before the conference, we’d like to select a text (primary and/or secondary) or a clutch of essays that everyone might read in order to develop a common vocabulary, a set of touchstones, for reading and conversing and realizing our Venn-diagram worlds.

Wednesday, November 24, 2010

Three Posts in One Day

by J J Cohen

But this post is the only important one.

I'm thankful for every reader of In the Middle. Whether you comment or read in silence, whether you've contacted me or remain so far an unknown friend, no matter who you are and no matter what you find of value here .... thank you for being a part of this blog.

To Do

by J J Cohen

T-Day merriment aside, I just caused myself to hyperventilate by typing out as a FB update the publishing tasks I've recently completed as well as those that remain to me. In case I choke and die tomorrow on, say, a poisoned cranberry, here is what remains in process: more balls to juggle than a thousand clowns on a thousand days with a thousand assistants could manage. Luckily, I possess 1001 clowns.
  1. I just emailed an essay called "The Promise of Monsters" to its editors. It joins "The Sex Life of Stone" and "Geographesis, or the Afterlife of Britain in Chaucer" as essays I've sent to editors within the past few weeks. 
  2. Now if I can get "The Future of the Jews of York" done by mid December, I'll have only two book reviews (Idols in the East for AHA; Monsters, Gender and Sexuality for Arthuriana) and a postmedieval essay to compose. And a tenure file to review. And a stack of essays for that postmedieval issue to edit. And another postmedieval issue to shape.
  3. Then there are the papers I am giving in Toronto (January), NYC and Iowa City (February), Chicago and Cambridge (April), and Kzoo. And Melbourne this summer.
  4. Plus the conference I'm running.
  5. Oh and along with revising those other essays once they come back, I am supposed to write a new essay on medieval race by June.
You know what? A poisoned cranberry is sounding quite tasty at this moment.

Transit and travail

at the airport bound for France
by J J Cohen

Here's hoping that those of you who celebrate Thanksgiving are having good travels and restful times. If you're bored there is plenty to read here, such as Bonnie Wheeler on peer review. Look for another guest post early next week by MOR as well.

I'm spending this day before the Festival of the Basted Fowl combing through an essay and frightening myself with the amount of labor I've committed myself to in the spring. More of that anon, but first -- because T-day is all about family* -- two Cohen family vignettes.

1. Alex departed for ten days in Bordeaux last weekend. Since he's in French 3, he has the chance to live with a native family as an exchange student. The goodbye at the airport was not satisfying: he deployed as much adolescent aloofness as a 13-year-old is capable of mustering. Yet when he called from the family's house upon arrival, his voice belonged to the vulnerable boy he remains beneath all that pubescent armor.

2. As an exercise in transforming a task into well-articulated directions, Katherine at school composed a recipe for cooking turkey. I reproduce it below:

INGREDIENTS
8 lbs turkey
4 pieces pizza
100 bags stuffing
1800 bags oreos
2 lbs gravy
5 boxes of gum
2 pieces of cornbread
500 pieces of red licorice

DIRECTIONS
Slice the turkey in half with a knife. With a cutting board and knife slice the pizza in half. With a fork take some stuffing and put it in the turkey. Pour the oreos in a plastic bowl. Mash the turkey with a turkey baster. Put gravy in it. Cut the two boxes in half and take the gum out. Put it as the turkey's mouth. Grate the corn bread and put it behind the turkey. Pour red licorice on serving spoon and place on turkey. Cook for two hours at fifty degrees. Bon appetit!

Katherine, I should note, has never eaten turkey in her life. Perhaps that is obvious.

*"Nothing ruins the holidays like family" is in fact one of my patented gnosticisms.

Monday, November 22, 2010

The Place of Peer Review

by BONNIE WHEELER

[we are pleased to bring to you for discussion this pre-publication section of an essay to appear in the Journal of Scholarly Publishing, "What is a Scholarly Journal? The Place of Peer Review" by Bonnie Wheeler]

Credentialing / Peer Review
Journal publication provides several forms of credentialing. The first is simply the credit that redounds to an author by being published in a particular journal. The second is the “value added” by careful checking and editing of the article prior to publication. Each editor can list articles that might accurately cite the editor as co-author. The third is the crux: peer review. The credentialing provided by peer review is, I think, the bedrock “value added” in scholarly journals, even though we are as diverse in its definitions as we are about all other aspects of journal editing.

Academic editors think about peer review constantly.(1) Because of petitions and protests, I was alerted to the fact that several science journals still practice single-anonymous review (the reviewer remains unknown but the names of the authors are revealed). I suspect, but have no reliable empirical proof, that most of us in the humanities are accustomed to requiring standard double-anonymous peer reviews even though we are all aware, as bitter colleagues note, with Foucault in the foreground, that peer review is often viewed as “a form of censorship,” an exclusionary, disciplining mode that functions to inhibit or delay new work. I am told that the German system requires a fully open review process in which all parties are revealed to each other. There are strong reasons and feelings on all sides of these arguments.(2)

As an editor, I always feel guilty about requesting double-anonymous peer reviews, knowing that the reviewer will never be fully credited either in print or in profit for a penetrating, helpful peer review. Perhaps I am once again too naïve: do scholarly journals increasingly pay for peer review, or does peer review count in salary negotiations? (3) Probably not. I’ve received several complaints over the past few years (one recently, in a powerfully argued editorial in the Journal of Hydrology) about the increasing unwillingness of specialists to provide peer review precisely because it is “unrewarded activity.” The generations of junior faculty we “protected” from committee work and taught to concentrate on their own rewarded activity have now moved into senior positions. Though many of these colleagues have benefitted mightily from fine peer review, they aren’t necessarily acculturated to accepting reciprocal responsibility. We made ’em this way, we socialized them this way, but now we need them to adopt a different professional training and sociology and “ante up” if peer review is to survive in its current form. Some even suggest a “point system” by which reviewers would earn points they could redeem when they next request peer review for their own work.

Journals that our colleagues recognize as most prestigious will still get their attention (flattery has its uses), and many of you, my fellow editors, are skeptical that there is a problem at all. But on the evidence that I see, an occasional irritation is quickly becoming a dominant force. The pressures increase as the number of journals and thus journal submissions increase. In my experience, most schools offer some stipend when they approach you for tenure and promotion reviews; most presses reward you for manuscript reviews. But for peer review of essay submissions in the humanities, we depend upon the generosity of others.

We depend upon a commitment to our profession that transcends private gain. We used to call it collegiality. Collegial peer review, in all its manifestations, depends upon a fading notion of noblesse oblige. Freely produced peer review may be one of the last “class” vestiges we retain in our profession, an appendix remaining from the positive side of the otherwise punitive (to outsiders) Old Boys’ Network. Yet, in revised formats, it now might provide a transparent activity that reflects one’s obligation to aid the development of vivid intellectual work in one’s field.

How can we remake our systems to interest our young in participating in the credentialing process? In this issue of JSP, James J. O’Donnell suggests that journals of academic societies change their mission:
The traditional publishing peer review ... is seriously flawed and depends too heavily on the willingness of scholars to participate. What vehicles can be found for bringing together scholars around a common purpose and exposing what they do to evaluation of its own? ... Could we imagine a [learned] society that suspended publication of a journal per se but focused on a panel of reviewers who evaluated and recognized the best articles published by members wherever they should appear? A society's web page that contained links to twenty of the best articles published this year by its members might well do more for the discipline than one that spent proportionally more effort on soliciting and editing twenty articles of its own.(4)
This is an attractive but exclusionary tactic: the notion of “twenty best” blockades the intellectual opulence and openness that our generation of academics has worked hard to achieve.

NIH grant directives insist upon and provide funding for peer review in the sciences. The formal assessment of proposed research begins, according to the NIH, “at a sufficiently early stage to influence the course of that research, the nature of its outputs, and ultimately even whether it takes place at all (or is made available to a wider audience).” In humanities publications, peer review typically takes place at a medial point, when an author or authors submit to a journal an essay that has been shaped to the point that it is considered “ready for publication.” Kathleen Fitzpatrick, whose rich intervention in this question I mentioned earlier, uses an e-publication format that embodies the desire for credentialing that she advocates—a kind of 1960s intellectual e-commune in which everyone participates in the work of production and of evaluation. In her complex analysis of peer review, especially on-line credentialing, she models a new best practice: open peer review, in which scholars post the penultimate version of their work to a website that invites responses. Those responses are open in every sense: the reviewers are known (and thus can be credited), the author can respond, and the final “product” in hard or e-copy acknowledges all participants in the process. She argues that:
the time has come for us to consider whether, really, we might all be better served by separating the question of credentialing from the publishing process, by allowing everything through the gate, and by designing a post- publication peer review process that focuses on how a scholarly text should be received rather than whether it should be out there in the first place. What if peer review ... became peer-to-peer review?
Fitzpatrick’s process may become a new norm, though it is notable that her work, posted now for many months, has not received heavy e-commentary. So far, like our CELJ blog, few are answering her call for comments, contradictions, and suggestions. Her book is slated for hardcopy publication, after which it will presumably receive the prominence it deserves in book reviews.

Whether peer review will continue to fulfill its highest goal of aiding authors to achieve the originality, clarity, and authority they seek depends upon what individual and collective roles we play as editors and publishers. How much of this will remain under editorial control is an open question. But among all elements of scholarly editing, peer review is the most likely to persist in some form. At its best, this form of credentialing extends our mission as teachers to authors who are our most intensely interested audiences. Most fearfully, however, credentialing suits larger political mandates for evaluation and control.

Bonnie Wheeler, Director of the Medieval Studies Program at Southern Methodist University, is executive editor of Arthuriana and past president of CELJ, for whom she collected and edited the essays in this issue of JSP.
--------
(1) For thoughtful studies of peer review, see David Shatz, Peer Review: A Critical Inquiry (Lanham, Md: Rowman & Littlefield, 2004) and Kathleen Fitzpatrick, Planned Obsolescence: Publishing, Technology, and the Future of the Academy, on-line pre-print.

(2) Thatcher comments: “I would hazard the generalization that peer review of monographs carried out by university presses generally fits the single-anonymous review model, though reviewers are always given the option of revealing their identities to the authors.”

(3) The book world, in general, operates differently, and pays at least a nominal amount to reviewers, though I’ve been watching recently as some mega-corporations have begun to compensate journal reviewers with come-ons worthy of carnival hawkers: see Literature Compass.

Saturday, November 20, 2010

NCS Portland: Panel Proposal Deadline Looms, and a Plea

by J J Cohen

Don't forget that the deadline for proposing a panel for NCS 2012 is Monday November 22. By then your mind will already be elsewhere: grading, Thanksgiving, grading, end of the semester, grading. So don't ruin Monday for your future self. Ruin a portion of the weekend instead, and send your panel proposal now.

You know that I am especially keen to have some provocative proposals on two related threads, Oceans and Ecologies. Full details here. I have some excellent proposals for each, but not enough for either. Ecologies is especially slim in its yield. I'd like 2012 to be remembered as the year when the lives of nonhuman organic and inorganic things and forces dramatically entered the wider consciousness of medieval studies, but I can't pull that off without some panel proposals to get the conversation started. Remember, you don't need to line up anyone to speak in your panel (a call for papers this spring will do that work for you), and anyone can suggest a panel. You don't need to have already published work in the area; an interest in instigation is all it takes. Feel free to email me if you have any questions.

Friday, November 19, 2010

On being a digital medievalist

by Martin Foys and Asa Simon Mittman


[We are grateful to MF and ASM for composing this guest post, a process that gave us a dizzying view of dynamic electronic collaboration in action: they collaborated via Google docs and FB simultaneously, and allowed us to watch the process unfold. -- JJC]

The Machine in the Ghost: reactions to being (not, not at all, not by a long shot) a digital medievalist rock star
Martin Foys

Yesterday I was fortunate enough to be included in a New York Times article about scholarly efforts that are now considered the digital humanities. I ended up in the Times by being in the right place at the right time - literally the right place, as living twenty miles outside of New York afforded the paper an easy photo-op. The whole experience so far has been twenty-four hours of incongruous weirdness. Friends I haven't heard from in years have emailed me congratulations, the president of my institution wants to have lunch with me, on his birthday no less; a company in Tampa named "In the News" called and asked if I wanted the article mounted on a plaque - one of three such offers I received in the space of a morning.

It's also been a crystallizing, bracing opportunity to think about what, exactly, digital humanities means for me, and why I am doing the work I am. I spent part of my day fruitlessly defining and defending on Facebook my own sense of this work to a medievalist acquaintance who wrote to me a litany of protests like "I think this mapping stuff (turning text into data and into an image) is, well, pretty much without consequence, personally."

Newspaper articles are designed to tell you difficult things easily, and by their nature misquote you, even when they print word-for-word a sentence you actually said. The Times article is a fine attempt to talk about some of the work being done in the so-called Humanities 2.0, but such pieces inevitably flatten much of what is exciting here, at least to me. What I didn't realize, though, was how reporting about some humanities scholars’ desire to focus on digital modes of data and visualization of information would unintentionally catalyze cries of reductionism and the de-humani(ties)zation of our disciplines. The nearly one hundred comments to the article (I know - why did I read them?) are dominated by harsh runs of posts rejecting the significance and relevance of data in the humanities, and at times the humanities in general (though there was an edifying, late rally of support for such work before comments closed at the end of the day). But at least only one particularly frustrated reader went so far as to call the academics profiled in the piece "pseudo-scholars" (I’m looking at you, #43, Thomas Clarke, from Phoenix Ariz.).

I am deeply grateful for such sentiments; they remind me of the necessity of living beyond the work you do, and what you think it does, and your like-minded peers - beyond the "horizontal modeling" of your intellect (a phrase itself stolen un-horizontally from an English professor whose arguments I find profoundly misguided: Mark Bauerlein, author of The Dumbest Generation). I thought I'd take a little space here to work out in a little more (if still brief and woefully unspecific) detail what this kind of work means to me, and why I find it important. Most of my own early work in applying digital technologies to medieval studies was of a remediating sort - in the 1990s, while designing the now, already, antique Bayeux Tapestry Digital Edition, I hadn't yet worked out that technology didn't allow you to do things better, faster, stronger (a.k.a. the Bionic Man hermeneutic), but instead differently. Around that time, I was at a series of Kalamazoo sessions on digital resources for medieval studies (at the time, mostly new web sites or giant, funded catalogue and/or database initiatives) and Dan O'Donnell (soon to become the founder of Digital Medievalist) asked a question that has haunted me ever since. During the Q&A, Dan said something like, "Well, okay, is all this just improving what we already do, or is it actually changing what we do?" Sitting there, I was appalled - not at the question, but at the answer I myself had - which was "no, not really."

Where, exactly, was the brave new world?


Writing Virtually Anglo-Saxon was, in part, my attempt to work out theoretically some of the possible valencies of new media for medieval study. But it wasn't until about five years later that the a-ha moment came for me regarding digital praxis in medieval studies. At a 2003 conference about "New Technologies and Old Texts," I saw a presentation by the HUMI Project, based at Keio University. The Japanese have a knack for bringing it in ways you do not expect. Like Pocky (seriously, have you seen the design aesthetic here?). HUMI was at the time analyzing the typographic output of Caxton's printing house, and computer-generating histograms of individual printed letter characters from different books. They were doing this in part to trace evidence of decay in the type being used across Caxton's print runs. In effect, though, they were also rebuilding from printed words on the page a granular history of the objects which produced them. This is where the penny dropped, as I realized that computers had the ability to read those words differently than we instinctively could or would -- a reverse engineering of what I later learned was transcoding, or the ability of new media to read machine code but simultaneously reformulate it into forms of signification readable by human users. For the computer, the word on the page encoded far more than we could physically see, calculate, or, unaided, understand.
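For readers curious what that kind of glyph analysis looks like in practice, here is a minimal sketch -- emphatically not the HUMI Project's actual code; the file names and the chi-square measure are just illustrative assumptions -- of comparing two scanned impressions of the same printed letter by their pixel-intensity histograms:

```python
# A minimal, hypothetical sketch (not the HUMI Project's code): compare two
# scanned impressions of the same printed letter by their pixel-intensity
# histograms, as a crude proxy for wear in the type. File names are invented.
from PIL import Image
import numpy as np

def glyph_histogram(path, bins=32):
    """Load a glyph scan as grayscale and return its normalized intensity histogram."""
    pixels = np.asarray(Image.open(path).convert("L"), dtype=np.float64)
    hist, _ = np.histogram(pixels, bins=bins, range=(0, 255))
    return hist / hist.sum()

def chi_square_distance(h1, h2, eps=1e-10):
    """Chi-square distance between two normalized histograms (0.0 means identical)."""
    return 0.5 * float(np.sum((h1 - h2) ** 2 / (h1 + h2 + eps)))

early = glyph_histogram("caxton_letter_e_1476.png")  # hypothetical early impression
late = glyph_histogram("caxton_letter_e_1483.png")   # hypothetical later impression
print(f"Histogram divergence between impressions: {chi_square_distance(early, late):.4f}")
```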

"Calculate" is the key word here. The brain needs help to do its scholarly work. We've had technology assisting us for longer than we care to remember - indeed, we've forgotten, interiorized, as Ong puts it, the technology; we’ve embedded the glasses, the pen, the book, and gradually now the computer deep within us - becoming unwitting cyborgs in the process. A number of reader comments in the Times piece took pains to differentiate between the tools and the transformational rhetoric the article espoused, saying, in summary, They're just tools - they don't do anything - the scholar still needs to interpret the data, or, as Anthony Grafton states in the article, “It’s easy to forget the digital media are means and not ends." Sure, fine, yep. But since when haven't tools transformed the human? The old-school cybernetic turn that tools are the extensions of "man" was outmoded at the moment that Norbert Wiener (n.b. best computer geek name ever) began to formulate it in the late 1940's, even by his own admission. The point of tools and humans is not that they augment the human, or technologically determine the human, but rather the constant feedback loop existing between the tool and the human - a kind of cultural technetics of a circulating, hybridized, slow dance of mutual prosthesis. In digital scholarship, we are beginning to generate data that we before did not have the capacity to imagine, both in quantity (the economies of scale are already becoming staggering), but in quality as well. So what happens, to give one emergent example from our own work in Digital Mappaemundi, when we can analyze the coordinate-coded proximity of related images, inscriptions and topographical features across a wide corpus of medieval maps and geographic texts? We can’t wait to find out, so we’re building the tools to do so, but it’s going to be a slow process of discovery.

This is the necessary symbiosis between flesh and machine. One thing I've learned this semester while team-teaching an undergraduate digital humanities class with my computer-scientist collaborator Shannon Bradshaw is that computers are great at things that humans are not, and vice versa. Our mental ability to calculate complex equations is rudimentary at best - a dollar-store calculator outperforms us. But relatively speaking, computing technology is still ghastly when it comes to such things as linguistic facility, or recognizing "image affect" - the capacity of visuals to generate emotion. Fascinating. Numbers, words and pictures - these are what drive our hyperreal world after all, and our interpretation of it; we need machines to produce them, and machines to understand them. One manifest utility of digital technology is that it forces us to admit how porous the boundaries are between such signifying ingredients. We're already quite accomplished at turning everything about our world into words - we've had centuries of scholarly practice - but what happens when we turn words back into objects, or into images, or turn images into numbers?

There's much more that could be discussed, of course - especially the communal and collaborative qualities of new media discourse that are also ardently post-human, but I’ve written enough, and I think Asa’s going to hit this aspect a bit. (Oh, okay - one quick anecdote: I've taught Foucault's "What is an Author?" in my Intro to Lit Crit class for the better part of two decades now. This year was the first year where I actually felt the author function re-historicizing in modes analogous to medieval corporate production, as students brought up Wikipedia and other digital texts as examples of authoritative, yet authorless discourse. It was pretty nifty).

The work being done by the scholars mentioned in the Times article, and the intense industry of the other “geek/poets” that we there stand in for as convenient body doubles, is not revolutionary, but revelatory. The article is not, as some have (mis)taken it, meant to indicate that now is the time when the new, exciting digital work is being done (it has been for decades), or that now is the time that changes the game. Change is always. The article is simply a signpost along the way of a slow, inexorable tipping point of the grinding shift to that which comes next. Anglo-Saxon England didn't end in 1066, despite all the tumult and arguably sudden upsets to the ruling structure. Gradually, later, it just was no longer, well, really what it was before, and had become something different. Mostly. Welcome to the humanities . . . again.


Always was a rockstar (in my mind)...
Asa Simon Mittman

I get a message from Martin, subject line “Dude, you're in the Times.” Confusion. Nothing in the message but a link. I click it. The New York Times. Heart begins to pound. When I wrote to friends and colleagues that this was a life-goal achieved, I pretended that I said it with my tongue in my cheek. What has followed has been, as Martin notes, bizarre. I have been contacted by a curator with an exhibition he sees as relevant, by a publisher (no, no new book deal will ensue), by other digital humanities folks, friends from afar, and by a prof from right here at Chico State who first heard about me by reading the article in the Times. (The Times! Ok, I’m ok.) He is full-time in another department, but serving as a visiting prof in mine, and we’ve never met. Social networks.

The article does a good job, I think, of presenting some of the newer ideas and efforts (only a tiny sample) in the Digital Humanities, and while not in-depth, it is not nearly as reductive as some of the commentators claim. Like Martin, I read the comments. Like him, I was struck by many, especially one from “Swagato C” (Chicago, IL), who writes:
I am not so sure that 'crowdsourcing' academic ventures is the way forward. Could Kant have written his Critiques by a collaborative effort across continents and schools of thought? At some point the individual still stands apart. The tools, yes, they must indeed be refreshed as necessary.
Martin does a good job here (and an excellent job in Virtually Anglo-Saxon) of pointing out that all technology is a mediation, even (perhaps especially) those technologies that we fail to see because, in our post-human status, they seem mere extensions of ourselves. The pen giving way to the keyboard, vellum being overtaken by paper: these technologies affect what we write, how we write, how we house, organize, and sort that information, and how we find and read it. But Swagato is perhaps (wilfully) blind to the historical processes that produced Kant’s Critiques. This is the standard romanticizing of the writing process as the work of the lone scholar, toiling in isolation in a rickety garret. Preferably on a mechanical typewriter.
And, if possible, the author’s world should become black-and-white. All the better if he (always he in this scenario) has a glass of whiskey or a bottle of absinthe to provide the inspiration.

But this has no more foothold in reality than Jerry Lewis's typewriter. It is part of a grand narrative about how Great Works come into being (no less prevalent in my field of Art History than in Literature or Philosophy). Never mind that we are not suggesting we should crowdsource all academic ventures; in any case, Kant did not write in a vacuum. He was responding to and building on a mass of philosophical thought. Good work occurs in dialogue, out of myriad contacts and readings and replies. Indeed, the curator who contacted me has put together an exhibition on the Republic of Letters (the subject of a very cool digital humanities project out of Stanford, my alma mater, which is doing a lot of good work in the field), and I cannot help but think that those folks would have loved the Internet.

So, really, there is no question about the value of collaboration, is there? This then brings us to the data portion of the argument, which has two prongs:
1. Human passion is the only way to access the humanities (Julia, Hiawassee, GA):
How can the Arts be "quantified"? The Great God Data is, I feel, anathema to creativity and the human soul. Technology may have proven itself valuable in the fields of science and economics, but, please leave the worlds of literature and art alone! To quote Descartes: "The heart has its reason that reason does not know." As an artist and a humanist, "j'accuse".
Again, we are with the Romantics. This notion would have been fairly alien to artists of many periods, including the “artists” -- really, better termed illuminators -- who made the maps that we have been focusing on, many of whom were quite interested in data. Take the Saint John’s College MS 17 map, for example (the map from this manuscript is at our site, while the whole manuscript is available in a great site headed by Faith Wallis of McGill). The map is housed in a manuscript that begins with a geometry problem (though this is a later addition), and contains, among other things:
Materials related to computus … grammar of numbers; calendar prognostications; note on world-ages; how to calculate embolisms; lunaria; horoscope; Coena Cypriani; riddles; runic, cryptographic and exotic alphabets; demonstration of cryptography; diagrams showing Easter termini, times of sunrises and sunsets; world-map; table of feriae; tables and texts on degrees of consanguinity; taxonomies of knowledge; Byrhtferth's Diagram.
Note the world map as merely one bit of the very technical, mathematical, data-filled context of this manuscript. The “human soul” argument sounds expansive and grand, but is, in truth, historically embedded and quite limiting.
2. Instead of crunching the data, we should just use digital tools as virtual archives, which will help us go about our (already, though unacknowledgedly, mediated) work as we always have. John Kleeberg (New York, NY) voices this clearly:
This research could easily end up being pointless. Contrast that to the vital importance of scanning and digitizing our libraries and archives so that they can be preserved and made them available to all. That way, the next time an Anna Amalia library burns up, or a Cologne archive collapses (and we know for certain that there will be a next time), at least the information will be preserved in digital form.
In fact, Martin, Shannon and I ran into exactly this response from the Kress Foundation. When we applied to a new funding program for electronic work, they told us, “while the Kress Foundation appreciates the significance of the Digital Mappaemundi project, our Digital Resources initiative is new and its early focus will be on what we are calling ‘destination’ photo archives … The Kress Foundation commends the Digital Mappaemundi project’s goals and we are certain that it will indeed provide greater access to scholarly records of the mappaemundi.” It would be such a dreadful shame if computer scientists went through all the trouble of creating these stunning abilities and all we did was use them as filing cabinets, rather than letting them do the sorts of work that, as Martin says, humans just aren’t good at.

As 2010 wanes, I’ve been thinking a lot about a TV show I used to watch as a kid, Beyond 2000. It promised me flying cars and jet packs (and I am somewhat pissed that I don’t have either, yet. Just a 1999 Subaru with its “Check Engine” light on). But the writers of the show missed all the big stuff, and they didn’t see the value of humanities computing. We are not “just getting started,” or anything like that. We’ve been at it for ages. Bob Stevik converted Beowulf to punch cards. In 1967. I do agree with Kleeberg: some of “this research could easily end up being pointless.” Of course it could. Thank goodness that was never the case with print-based scholarship. Or quill-based.