An academic friend wrote to me recently to tell me of his new appointment at a university center that is focused on the "digital humanities." I congratulated him -- previously he had been able to secure only one-year fellowships -- but then I asked him what the digital humanities are. It's a phrase I have heard thrown around by academics for the past couple of years. Sometimes it refers to the ways in which articles and books are published and then revised and discussed. It's a way of describing how the Internet has made us all collaborators, how publications are now understood and revised in real time rather than just being printed on dead trees and left, for . . . well, dead. But the digital humanities also involve using computers to deepen our understanding of the humanities.

I am guessing that the vast majority of well-educated non-academics have no idea what the digital humanities are. And maybe that's not such a terrible thing. There are many topics in the hard sciences that well-educated non-academics don't understand either. But I confess to being a little surprised by a piece from the Chronicle's career-advice columnist this morning called "No DH, No Interview," which suggests that humanities scholars who lack expertise in this field will have a tougher time on the job market.

I don't propose to get into a discussion here of how worthwhile the digital humanities are as a field. The author of the piece, William Pannapacker, spent some time this summer using technology to "create a map of the Lake District as explored by Wordsworth and Coleridge." Interesting enough, I guess. But it is useful to look at how quickly the digital humanities went from a field no one had heard of, to the new hot thing, to something now indispensable for getting a job in academia.

In a column earlier this year, Stanley Fish explored the theory behind the digital humanities. For a new academic discipline to gain legitimacy, at least in the social sciences or the humanities, it must set itself against the myriad other academic modes of interpretation out there. Fish cites a new book by Kathleen Fitzpatrick, Planned Obsolescence: Publishing, Technology, and the Future of the Academy, to explain how digital humanities scholars see their work. Fish writes:

The effect of these technologies is to transform a hitherto linear experience -- a lone reader facing a stable text provided by an author who dictates the shape of reading by doling out information in a sequence he controls -- into a multidirectional experience in which voices (and images) enter, interact and proliferate in ways that decenter the authority of the author who becomes just another participant. Again Fitzpatrick: “we need to think less about completed products and more about text in process; less about individual authorship and more about collaboration; less about originality and more about remix; less about ownership and more about sharing.”

Fish, who has his own attachments to concepts like "individual authorship," is rather critical of the new discipline:

“Text in process” is a bit of an oxymoron: for if the process is not occurring with an eye toward the emergence of a finished artifact but with an eye toward its own elaboration and complication -- more links, more voices, more commentary -- the notion of “text” loses its coherence; there is no longer any text to point to because it “exists” only in a state of perpetual alteration: “Digital text is, above all, malleable . . . there is little sense in attempting to replicate the permanence of print [itself an illusion, according to the digital vision] in a medium whose chief value is change.” (Fitzpatrick)

But all of this theory does have a practical application: getting jobs. As one academic career counselor told Pannapacker, "any candidate who can add an expertise in DH to their conventional profile is going to be noticed. Departments realize they need to include some DH expertise, but most senior scholars have no time or inclination to achieve that, and are counting on 'hip new hires' to carry the flag."

So it seems like the same old story: find a new theory, make it seem cutting-edge, indoctrinate young scholars, then make sure that fashionable academic departments can't do without one of them. Then turn everyone into a scholar in this mold.

Well, maybe there is some hope.

Laura Mandell, director of the Initiative for Digital Humanities, Media, and Culture at Texas A&M University, recently compared the rise of the digital humanities with the rise of "critical theory." According to Pannapacker:

Mandell said DH is partly a turn against the dominance of critical theory, which she called "a PR failure and an intellectual failure: an excessive and unexamined lock-step discipline." DH provides a rigorous alternative to the seemingly exhausted scholarly approaches of the previous generation. Moreover, DH is a culture of building projects that serve a wide audience rather than -- to paraphrase Mandell -- engage in knee-jerk denunciations of capitalism while depending on its dwindling largess for our employment.

So this is an interesting (and perhaps positive) development. There are academics out there who have realized that critical theory was "a PR failure and an intellectual failure." Certainly it seemed that way to vast numbers of academic outsiders, but it's nice to see an insider recognize it. And then there's the whole idea of serving a wider audience with scholarship rather than engaging in knee-jerk denunciations of capitalism. Talk about cutting edge!

