So if you're not quite ready to take the Twitter plunge, and prefer your tech old-fashioned like rotary phones and VCRs, two blogs to add to your subscription list are the wonderful site Allan Mitchell has created for his seminar Becoming Human, and Elaine Treharne's History of Text Technologies. In case you missed it, Anne Harris's Medieval Meets World is also terrific (and has been around for quite some time).
I'm being tongue in cheek, of course: blogs are not old tech so much as a comfortable expanse within our current scholarly landscape. If they don't seem especially new any more, that doesn't mean they are any less useful. Or inspiring.
And speaking of inspiration, my wonderful colleague Holly Dugan sent out a series of tweets last night that get at exactly one of the promises inherent in digital humanities, including blogs. She wrote:

"#Altac tweets from #mla12 emphasize author responsibility to promote and disseminate, not just produce and research"

I like both these observations because they reveal another change in the way we conceptualize and disseminate scholarship: in a wired world, patiently waiting for conventional print to do its work is an option (as is watching coral reefs grow at one centimeter per year), but not necessarily the best one. We need to enhance its agency. No one likes over-the-top self-promotion, and we can all spot obnoxious or arrogant horn-tooting when we hear its blare. But there is nothing wrong with being a firm advocate for the scholarship you have accomplished and for the expertise that you possess. There is nothing wrong with bringing your research to as wide an audience as possible. If you have labored over figuring out a problem or a context, if you have worked to possess knowledge about an issue or text, then being humble and awaiting the reader who will find your insight buried in a $90 book or behind a paywall-guarded journal might not be the best method for instigating the conversations that are in fact the way our work lives, breathes, and changes. A scholar's work is at its midpoint once something appears in print or electronically: that isn't the time to walk away and watch what happens from afar. Don't we teach our undergraduates that no question is ever fully answered, that no project is ever really done? Shouldn't we take responsibility for the (potentially change-filled) future of our work rather than think that at a certain point it is petrified, inert?
Their point was about new modes of publishing and new platforms, but the takeaway also resonated with me in terms of gender and the profession.
It's a daunting challenge, isn't it, to be responsible not only for ushering your work into conventional print but then to nurture its life after it appears. Many scholars won't want to do so (and that is OK, honestly: sometimes you are so tired of a project that you need to walk away after its release, at least for a while); many others lack the technological savvy to be their own best advocate. Training in DH needs, at a minimum, to be part of the graduate curriculum. It starts on day one -- or, better, as Ryan Cordell has written, it starts as part of undergraduate humanities training.
Right?
In a bizarre moment of synchronicity, I ran into a colleague of mine who is our dept.'s digital humanist [she works on early transatlantic book history] at a coffee shop in Edwardsville this morning, where I was sitting and working on a blog post related to Jeffrey's earlier post on tweeting the MLA. We were talking about that, plus the debates over the digital humanities sparked by the recent MLA meeting, and she urged me to read the guest post at ProfHacker's blog, "It Starts On Day One," that Jeffrey links to here. Then I clicked over to In The Middle while talking to her, and there it was. HA!
So now I've read it and it's wonderful, but I do have this to say, and I think it's important: why do we often proceed in these matters as if it's this VERSUS that, or this follows AFTER that [that was THEN, this is NOW], etc.? Yes, as this piece argues, we should definitely better prepare our students to become
"a generation of knowledge workers prepared not only to teach, research, and communicate in 21st-century modes, but to govern 21st-century institutions,"
and this will mean all sorts of practical education in new technologies and new scholarly-delivery platforms and extra-curricular experiments, etc. At my university, I am proud to say that two of my colleagues [one a computational linguist and the other a historian of the book, mentioned above] recently founded an informatics and digital research lab in my College [with scant funding, I might add, but with a lot of energy and willingness to write tons of grant applications] and they are now crafting, across quite a few departments within and beyond the College of Arts and Sciences, a new interdisciplinary minor in humanities informatics and digital scholarship. They have a long road ahead of them in this venture, but I'm excited about it and about all of the ways I can join forces with them relative to the projects of BABEL and punctum books and beyond.
But having said that, I don't understand why we have to frame this conversation, as Bethany Nowviskie does at ProfHacker's Chronicle blog, as having to choose between spending time on courses in "academic jargon and en-vogue theories" [hello? prejudicial and unfair description of a HUGE part of what humanities scholars do = thinking, anyone?] and developing courses that will help students become masters of certain technologies and so secure a better future in an increasingly technologized university *and* world. Why is it either/or? If I work in literary studies, I want to think and write about literature, and I'm happy to utilize any technologies at my disposal [older and newer]. I'm a positivist; I'm for anything that increases our powers of thinking and doing, but let each one of us determine what that thinking and doing might be for each one of us, and let's work hard *together* to provide the spaces and means for this proliferation of ways of thinking.

As anyone who knows me knows: I'm for blogging, I'm for open-access publishing, for open peer review, and alt-ac careers, but what I am NOT for is anyone telling me what I should NOT be for, or that I have to choose between this and that, or that everything in the past, or how we are doing or have done things, must be wrong. It's about having choices and maximizing choices and personal freedom and the ability of the greatest number of persons to achieve what they desire, via new *or* older forms of technologies and methodologies and modes of thought. That's why punctum books will keep making printed books while also making e-reader versions of those same books and also free, downloadable DIY versions of those same books. Because each of those things serves the distinctly unique and idiosyncratically diverse desires of different persons in different places who each want to get their knowledge/reading pleasures in different ways, and maybe multiple ways simultaneously.
So, this is also just my way of saying I actually think Nowviskie's asteroid/scorched earth metaphor for revamping core curricula of grad. programs in the humanities is just . . . ABOMINABLE. If you were to get rid of, say, the core theory class in an M.A. in Literature program, and replace it with a course in, say, designing machines that speed-read medieval manuscripts or producing e-books or whatever, what do you do with that technology minus theory? There is STILL theory, right? There is still literature, right [if you're a literary studies scholar]? It's not one OR the other. It's both. I'm not going to tell my students, you can stop reading Foucault, or Graham Harman, because now we're going to build things. Build what, again? There's building a new "delivery platform" with amazing abilities to "disseminate" things [cool!], but disseminate what, again? Most academics I know who use Twitter use it [when not making jokes or saying, "I just ate a donut," etc.] to talk about what they're reading, thinking, etc. The university, whatever its delivery platforms and technologies of dissemination, is still a place where, as Bill Readings once wrote, we place thought alongside thought and see what happens. The criticism that literary theory is "en vogue" is the same criticism that some literary theorists level against digital humanists. Can everyone please stop doing this? Okay.
Yes, it was over-argued: baby and bathwater, instead of group bathing. Or something like that. The past is not weeds and dinosaurs -- and one of the things I like about DH (and a reason that it is so attractive to medievalists and early modernists) is that it is NOT merely presentist, or even progressivist, but looks at technologies over VERY long durations, and ruminates on the simultaneous presence of multiple historical technologies at any moment.
So, the point is a good one: yes, grads and undergrads in the humanities need to understand DH. But that doesn't mean they don't also need to know codicology, philology, feminism ...
Submitted on behalf of Joseph Kugelmass, who has had trouble with the captcha codes. Has anyone else had a problem?
------
Your post is completely right on. Long live blogging and micro-blogging. On the other hand, we should try to bear certain problems in mind. Looking again at that quote, I'm struck by the blatant preaching to the choir: "#Altac tweets from #mla12 emphasize author responsibility to promote and disseminate, not just produce and research" Honestly, I just don't think good discussions about ethics emerge from this kind of affirmational "I must always remember to do this" approach.