Building Research Capacity Across the Humanities

On Monday I gave a talk at the German Institute for International Educational Research (DIPF) on:

Building Research Capacity Across the Humanities and Social Sciences: Social Innovation, Community Engagement and Citizen Science

The talk began with the sorry state of public support for the humanities. We frequently read that students shouldn’t major in the humanities because there are no jobs, and we worry about dropping enrolments. The social contract between our publics (whose taxes pay for public universities) and the humanities seems broken or forgotten. We need to imagine how to re-engage the local and international communities interested in what we do. To that end I proposed the following:

  • We need to know ourselves better so we can better present our work to the community. It is difficult in a university like the University of Alberta to know what research and teaching is happening in the social sciences and humanities. We are spread out over 10 different faculties and don’t maintain any sort of shared research presence.
  • We need to learn to listen to the research needs of the local community and to collaborate with the community researchers who are working on these problems. How many people in the university know what the mayor’s priorities are? Who bothers to connect the research needs of the local community to the incredible capacity of our university? How do we collaborate with and support the applied researchers who typically do the work identified by major stakeholders like the city? Institutes like the Kule Institute can help document the research needs of major community stakeholders and then connect university and community researchers to address them.
  • We need to learn to connect through the internet to communities of interest. Everything we study is of interest to amateurs if we bother to involve them. Crowdsourcing or “citizen science” techniques can bring amateurs into research in a way that engages them and enriches our projects.

In all three of these areas I described projects that are trying to better connect humanities research with our publics. In particular I showed various crowdsourcing projects in the humanities ending with the work we are now doing through the Text Mining the Novel project to imagine ways to crowdsource the tagging of social networks in literature.
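To make the social-network tagging idea concrete: readers would mark which characters interact in a passage, and the crowd’s tags would be aggregated into a weighted network. A minimal sketch in Python of that aggregation step (the tag format and character names here are my illustrative assumptions, not the actual schema of the Text Mining the Novel project):

```python
from collections import Counter
from itertools import combinations

def aggregate_tags(tags):
    """Aggregate crowdsourced interaction tags into a weighted edge list.

    Each tag is a set of character names one reader marked as interacting
    in a passage; every pair in the set becomes an edge, and repeated
    tagging by different readers increases the edge weight.
    """
    edges = Counter()
    for characters in tags:
        for a, b in combinations(sorted(characters), 2):
            edges[(a, b)] += 1
    return edges

# Hypothetical tags from three readers of passages in a novel
tags = [
    {"Elizabeth", "Darcy"},
    {"Elizabeth", "Darcy", "Jane"},
    {"Jane", "Bingley"},
]
network = aggregate_tags(tags)
print(network[("Darcy", "Elizabeth")])  # edge weight 2
```

The resulting weighted edges could then be handed to a graph library for visualization or analysis; agreement between readers doubles as a simple confidence measure on each edge.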

One point that resonated with the audience at DIPF was around the types of relationships we need to develop with our publics. I argued that we have to learn to co-create research projects rather than “trickle down” results. We need to develop questions, methods and answers together with community researchers rather than think that we do the “real” research and then trickle results down to the community. This means learning new and humble ways of doing research.

HathiTrust Research Center Awards Three ACS Projects

An Advanced Collaborative Support (ACS) project that I was part of was funded; see HathiTrust Research Center Awards Three ACS Projects. Our project, called The Trace of Theory, sets out first to see if we can identify subsets of the HathiTrust volumes that are “theoretical” and then to try to track “theory” through these subsets.
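One simple first pass at identifying “theoretical” volumes, before anything more sophisticated, is to score each volume by the relative frequency of theory-associated terms. A minimal sketch, where the term list and threshold are my illustrative assumptions rather than the project’s actual method:

```python
THEORY_TERMS = {"hermeneutics", "deconstruction", "discourse",
                "ideology", "epistemology", "dialectic"}

def theory_score(text):
    """Fraction of tokens that are theory-associated terms."""
    tokens = [w.strip(".,;:!?\"'()").lower() for w in text.split()]
    if not tokens:
        return 0.0
    hits = sum(1 for t in tokens if t in THEORY_TERMS)
    return hits / len(tokens)

def theoretical_subset(volumes, threshold=0.01):
    """Return ids of volumes whose score exceeds the threshold."""
    return [vid for vid, text in volumes.items()
            if theory_score(text) > threshold]

# Two toy "volumes" standing in for HathiTrust full texts
volumes = {
    "vol1": "The ideology of discourse shapes epistemology in dialectic form.",
    "vol2": "The cat sat quietly on the warm garden wall all afternoon.",
}
print(theoretical_subset(volumes))  # ['vol1']
```

A real pass over HathiTrust would of course need a defensible lexicon and calibration against hand-classified volumes, but the shape of the filtering step is the same.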

The future of the book: An essay from The Economist


The Economist has a nice essay on The future of the book. (Thanks to Lynne for sending this along.) The essay has three interfaces:

  • A listening interface
  • A remediated book interface where you can flip pages
  • A scrolling interface

As much as we have moved beyond skeuomorphic interfaces that carry over design cues from older objects, the book interface is actually attractive. It suits the topic, which is captured in the title of the essay, “From Papyrus to Pixels: The Digital Transformation Has Only Just Begun.”

The content of the essay looks at how books have been remediated over time (from scroll to print) and then discusses the current shifts to ebooks. It points out that the ebook market is not like the digital music market. People still like print books and they don’t like to pick them apart like they do albums. The essay is particularly interesting on the self-publishing phenomenon and how authors are bypassing publishers and stores by publishing through Amazon.


The last chapter talks about audio books, one of the formats of the essay itself, and other formats (like treadmill forms that flash words at speed). This is where they get to the “transformation that has only just begun.”

We Have Never Been Digital

Historian of technology Thomas Haigh has written a nice reflection on the intersection of computing and the humanities, We Have Never Been Digital (PDF) (Communications of the ACM, 57:9, Sept 2014, 24-28). He gives a nice tour of the history of the idea that computers are revolutionary starting with Berkeley’s 1949 Giant Brains: Or Machines That Think. He talks about the shift to the “digital” locating it in the launch of Wired, Stewart Brand and Negroponte’s Being Digital. He rightly points out that the digital is not evenly distributed and that it has a material and analogue basis. Just as Latour argued that we have never been (entirely) modern, Haigh points out that we have never been and never will be entirely digital.

This leads to a critique of the “dated neologism” digital humanities. In a cute move he asks what makes humanists digital: is it using email or building a web page? He rightly points out that the definition has been changing as the technology does, though I’m not sure that is a problem. The digital humanities should change – that is what makes disciplines vital. He also feels we get the mix of computing and the humanities wrong; that we should be using humanities methods to understand technology, not the other way around.

There is a sense in which historians of information technology work at the intersection of computing and the humanities. Certainly we have attempted, with rather less success, to interest humanists in computing as an area of study. Yet our aim is, in a sense, the opposite of the digital humanists: we seek to apply the tools and methods of the humanities to the subject of computing…

On this I think he is right – that we should be doing both the study of computing through the lens of the humanities and experimenting with the uses of computing in the humanities. I would go further and suggest that one way to understand computing is to try it on that which you know and that is the distinctive contribution of the digital humanities. We don’t just “yack” about it, we try to “hack” it. We think-through technology in a way that should complement the philosophy and history of technology. Haigh should welcome the digital humanities or imagine what we could be rather than dismiss the field because we haven’t committed to only humanistic methods, however limited.

Haigh concludes with a “suspicion” I have been hearing since the 1990s – that the digital humanities will disappear (like all trends), leaving only real historians and other humanists using the tools appropriate to the original fields. He may be right, but as a historian he should ask why certain disciplines thrive and others don’t. I suspect that science and technology studies could suffer the same fate – the historians, sociologists, and philosophers could go back to their home disciplines and stop identifying with the interdisciplinary field. For that matter, what essential claim does any discipline have? Could history fade away because all of us do it, or statistics disappear because statistical techniques are used in other disciplines? Who needs math when everyone does it?

The use of computing in the other humanities is exactly why the digital humanities is thriving – we provide a trading zone for new methods and a place where they can be worked out across the concerns of other disciplines. Does each discipline have to work out how texts should be encoded for interchange and analysis, or do we share enough to do it together under a rubric like computing in the humanities? As for changing methods: the methods definitive of the digital humanities, the ones discussed and traded, will change as they get absorbed into other disciplines. So no, there isn’t a particular technology that is definitive of DH, and that is exactly what other disciplines want – a collegial discipline from which to draw experimental methods. Why is it that the digital humanities are expected to be coherent, stable and definable in a way no other humanities discipline is?

Here I have to say that Matt Kirschenbaum has done us an unintentional disservice by discussing the tactical use of “digital humanities” in English departments. He has led others to believe that there is something essentially mercenary or instrumental to the field that dirties it compared to the pure and uneconomical pursuit of truth to be found in, for example, science and technology studies. The truth is that no discipline has ever been pure or entirely corrupt. STS has itself been the site of positioning at every university I’ve been at. It sounds from Haigh’s account as though STS has suffered the same trials of not being taken seriously by the big departments that humanities computing worried about for decades. Perhaps STS could partner with DH to develop a richer trading zone for ideas and techniques.

I should add that many of us are in DH not for tactical reasons, but because it is a better home for the thinking-through we believe is important than the disciplines we came from. I was visiting the University of Virginia in 2001-2 and participated in the NEH-funded meetings to develop the MA in Digital Humanities. My memory is that when we discussed names for the programme, the goal was to make the field accessible. We were choosing among imperfect names, none of which could ever communicate the possibilities we hoped for. In the end it was a choice as to what would best communicate to potential students what they could study.

The Material in Digital Books

Elika Ortega in a talk at Experimental Interfaces for Reading 2.0 mentioned two web sites that gather interesting material traces in digital books. One is The Art of Google Books that gathers interesting scans in Google Books (like the image above).

The other is the site Book Traces where people upload interesting examples of marginal marks. Here is their call for examples:

Readers wrote in their books, and left notes, pictures, letters, flowers, locks of hair, and other things between their pages. We need your help identifying them because many are in danger of being discarded as libraries go digital. Books printed between 1820 and 1923 are at particular risk.  Help us prove the value of maintaining rich print collections in our libraries.

Book Traces also has a Tumblr blog.

Why are these traces important? One reason is that they help us understand what readers were doing and thinking while reading.

Weaponizing the Digital Humanities

Jan Christoph Meister has posted a blog entry about Weaponizing the Digital Humanities. His entry comes from an exchange we had, first around the paper about using stylistics to psychologically profile people. (See my conference report on DH2014.) After the session we ended up talking with someone who was probably from the intelligence community. It is a bit startling to realize that we merit attention, if that is what it is. Certainly research on recognition of typing patterns might be of interest, but it is hard to imagine what else would be.

The other side of intelligence interest in our field is our interest in surveillance. What can we learn from the intelligence agencies and the techniques they develop? I’m certainly intrigued by what they might have been able to do. What responsibilities do we have to engage the ethical and interpretative issues raised by Snowden’s revelations? My blog entry Interpreting the CSEC Presentation: Watch Out Olympians in the House! is an attempt to interpret the Snowden documents – a paleography of the documents, perhaps.

Meister rightly opens the ethical issue of whether our organization should have a code of ethics that touches on how our research is used. We have a code of conduct; should it extend to issues of surveillance? The humanist in me asks how other fields in the humanities have dealt with the sudden military application of their research. There was/is an issue around the involvement of anthropologists and sociologists in Pentagon-funded projects.

CSDH-SCHN 2014 Conference

I have posted my CSDH-SCHN 2014 conference notes now that the conference is over. I will probably add some notes on CGSA 2014 there tomorrow. We had great participation this year. You can see the conference programme here. One thing that went well was the Digital Demonstration session, which was like a poster session but with demos of neat tools and digital projects. Some themes:

  • Visualization and text mining
  • Topic modelling and mallet (see theme immediately above)
  • This moment in digital humanities in Canada
  • Studying the history of our disciplines
  • Connecting with the other humanities disciplines and organizations like Compute Canada

A World Digital Library Is Coming True!

Robert Darnton has a great essay in The New York Review of Books titled, A World Digital Library Is Coming True! This essay asks about publication and the public interest. He mentions how expensive some journals are getting and how that means that knowledge paid for by the public (through support for research) becomes inaccessible to the very same public which might benefit from the research.

In the US this trend has been counteracted by initiatives to legislate that publicly funded research be made available through some open access venue like PubMed Central. Needless to say, lobbyists are fighting mandates like the Fair Access to Science and Technology Research Act (FASTR).

Darnton concludes that “In the long run, journals can be sustained only through a transformation of the economic basis of academic publishing.” He argues for “flipping” the costs and charging processing fees to those who want to publish.

By creating open-access journals, a flipped system directly benefits the public. Anyone can consult the research free of charge online, and libraries are liberated from the spiraling costs of subscriptions. Of course, the publication expenses do not evaporate miraculously, but they are greatly reduced, especially for nonprofit journals, which do not need to satisfy shareholders. The processing fees, which can run to a thousand dollars or more, depending on the complexities of the text and the process of peer review, can be covered in various ways. They are often included in research grants to scientists, and they are increasingly financed by the author’s university or a group of universities.

While I agree on the need to focus on the public good, I worry that “flipping” will limit who gets published. In STEM fields where most research is funded one can build the cost of processing fees into the funding, but in the humanities where much research is not funded, many colleagues will have to pay out of pocket to get published. Darnton mentions how at Harvard (his institution) they have a program that subsidizes processing fees … they would, and therein lies the problem. Those at wealthy institutions will now have an advantage in that they can afford to publish in an environment where publishers need processing fees while those not subsidized (whether private scholars, alternative academics, or instructors) will have to decide if they really can afford to. Creating an economy where it is not the best ideas that get published but those of an elite caste is not a recipe for the public good.

I imagine Darnton recognizes the need for solutions other than processing fees and, in fact, he goes on to talk about the Digital Public Library of America and OpenEdition Books as initiatives that are making monographs available online for free.

I suspect that what will work in the humanities is finding funding for the editorial and publishing functions of journals as a whole rather than for individual articles. We have a number of journals in the digital humanities, like Digital Humanities Quarterly, where the costs of editing and publishing are borne by individuals like Julia Flanders who have made it a labor of love, their universities that support them, and our scholarly association, which provides technical support and some funding. DHQ doesn’t charge processing fees, which means that all sorts of people who don’t have access to subsidies can be heard. It would be interesting to poll the authors published and see how many have access to processing fee subsidies. It is bad enough that our conferences are expensive to attend; let’s not skew the published record.

Which brings me back to the public good. Darnton ends his essay writing about how the DPLA is networking all sorts of collections together. It is not just providing information as a good, but bringing together smaller collections from public libraries and universities. This is one of the possibilities of the internet – that distributed resources can be networked into greater goods rather than having to be centralized. The DPLA doesn’t need to be THE PUBLIC LIBRARY that replaces all libraries the way Amazon is pushing out book stores. The OpenEdition project goes further and offers infrastructure for publishing knowledge to keep costs down for everyone. A combination of centrally supported infrastructure that is used by editors that get local support (and credit) will make more of a difference than processing fees, be more equitable, and do more for public participation, which is a good too.

The DH Experience Game on Vimeo

We have put up a video of The DH Experience Game on Vimeo in the INKE Vimeo channel.

John Montague and Luciano Frizzera have designed a cool game that allows people to play at collaboratively completing digital humanities projects. We are now working with GO::DH to make the centers and projects real ones from around the world.