Ted Hewitt speaks at University of Alberta

Ted Hewitt spoke today on “The Perils and Prospects of Digital Scholarship in the 21st Century Canada: Tri-Agency Research Data Initiative” at our Research Data Management week. Some of the points he made follow.

Canada is not leading on data stewardship. We need to catch up so that we can take advantage of what the world has to offer and we need to offer what Canada has to the world. Data management capacity is increasingly linked to Canada’s international competitiveness.

We used to do a literature review when starting a project. Now we also look for data sets that we can use so we aren’t re-searching to create useful data.


Around the World Conference


Last week we held our third Around the World Conference on the subject of “Big Data”. We had some fabulous panels from countries including Ireland, Canada, Israel, Nigeria, Japan, China, Australia, USA, Belgium, Italy, and Brazil.

The Around the World Conference streams speakers and panels from around the world out to everyone on the net. We also edit and archive the video clips. This model allows for a sustainable conversation across continents that doesn’t involve flying people around, and it lets a lot of people who wouldn’t usually be included speak. There are technical hiccups, but those happen at on-site conferences too.

Science 2.0 and Citizen Research

This week I attended the second Science 2.0 conference held in Hamburg, Germany. (You can see my research notes here.) The conference dealt with issues around open access, open data, citizen science, and network enabled science. I was one of two Canadian digital humanists presenting. Matthew Hiebert from the University of Victoria talked about the social edition and work from the Electronic Textual Cultures Lab and Iter. It should be noted that in Europe the word “science” is more inclusive and can include the humanities. This conference wasn’t just about how open data and crowdsourcing could help the natural sciences – it was about how research across the disciplines could be supported with virtual labs and infrastructure.

I gave a paper on “New Publics for the Humanities” that started by noting that the humanities no longer engage the public. The social contract with the public that supports us has been neglected. I worry that if the university is disaggregated and the humanities unbundled from the other faculties (the way newspapers have been hit by the internet and the unbundling of services), then people will stop paying for the humanities and much of the research we do. We will end up with cheaper, research-poor colleges that provide lots of higher education without the research (or the climbing walls). Only in the elite private universities will the humanities survive, and there they will survive as a marker of class status. You will be able to study ancient languages at elite schools because any degree from an elite school is a good one.

Of course, the humanities will survive outside the university, and may become healthier with the downsizing of the professional (or professorial) humanities, but we run the danger of unthinkingly losing a long tradition of thinking critically and ethically. An irony to be sure – losing thinking traditions through the lack of public reflection on the consequences of disruptive change.

Drawing on Greg Crane, I then argued that citizen research (forms of crowdsourcing) can re-engage the publics we need to support us and reflect with us. Citizen research can provide an alternative way of structuring research in anticipation of defunding of the humanities research function. I illustrated my point by showing a number of examples of humanities crowdsourcing projects from the OED (pre-computer volunteer research) to the Dictionary of Words in the Wild. If I can find the time I will write up the argument to see where it goes.

My talk was followed by a thorough one on citizen science in environmental studies by Professor Aletta Bonn of the Citizens create knowledge project – a German platform for citizen science. We need to learn from people like Dr. Bonn who are studying and experimenting with the deployment of citizen research. One point she made was the importance of citizen co-design. Most projects enlist citizens in repetitive micro-tasks designed by researchers. What if the research project were designed from the beginning with citizens? What would that mean? How would that work?

Building Research Capacity Across the Humanities

On Monday I gave a talk at the German Institute for International Educational Research (DIPF) on:

Building Research Capacity Across the Humanities and Social Sciences: Social Innovation, Community Engagement and Citizen Science

The talk began with the sorry state of public support for the humanities. We frequently read how students shouldn’t major in the humanities because there are no jobs, and we worry about dropping enrolments. The social contract between our publics (whose taxes pay for public universities) and the humanities seems broken or forgotten. We need to imagine how to re-engage the local and international communities interested in what we do. To that end I proposed the following:

  • We need to know ourselves better so we can better present our work to the community. It is difficult in a university like the University of Alberta to know what research and teaching is happening in the social sciences and humanities. We are spread out over 10 different faculties and don’t maintain any sort of shared research presence.
  • We need to learn to listen to the research needs of the local community and to collaborate with the community researchers who are working on these problems. How many people in the university know what the mayor’s priorities are? Who bothers to connect the research needs of the local community to the incredible capacity of our university? How do we collaborate with and support the applied researchers who typically do the work identified by major stakeholders like the city? Institutes like the Kule Institute can help document the research agendas of major community stakeholders and then connect university and community researchers to address them.
  • We need to learn to connect through the internet to communities of interest. Everything we study is of interest to amateurs if we bother to involve them. Crowdsourcing or “citizen science” techniques can bring amateurs into research in a way that engages them and enriches our projects.

In all three of these areas I described projects that are trying to better connect humanities research with our publics. In particular I showed various crowdsourcing projects in the humanities ending with the work we are now doing through the Text Mining the Novel project to imagine ways to crowdsource the tagging of social networks in literature.

One point that resonated with the audience at DIPF was around the types of relationships we need to develop with our publics. I argued that we have to learn to co-create research projects rather than “trickle down” results. We need to develop questions, methods, and answers together with community researchers rather than thinking that we do the “real” research and then trickle the results down to the community. This means learning new and humbler ways of doing research.

Terry Eagleton: The death of universities

The Guardian has an essay by Terry Eagleton on The death of universities. The article asks (and answers),

Are the humanities about to disappear from our universities? The question is absurd. It would be like asking whether alcohol is about to disappear from pubs, or egoism from Hollywood. Just as there cannot be a pub without alcohol, so there cannot be a university without the humanities. If history, philosophy and so on vanish from academic life, what they leave in their wake may be a technical training facility or corporate research institute. But it will not be a university in the classical sense of the term, and it would be deceptive to call it one.

I wish I were so sure of this logical argument, but I fear that people are quite willing to call something a university even without many of the humanities, just as universities in centuries past were no less universities for lacking many of the fields now seen as essential (like Computer Science, Cognitive Science, Bioinformatics, even Engineering).

I can imagine a university where many of the humanities end up in the Faculty of Education (which does prepare people for jobs as teachers). We would have the Department of English Education, for example. Would people bemoan the loss of the humanities if many of its questions ended up housed elsewhere?

For that matter, there are some who argue that preserving the humanities may be a cloak for preserving a particular idea of humanism. For example, here is Tony Davies at the end of his excellent short book Humanism:

All humanisms, until now, have been imperial. They speak of the human in the accents and the interests of a class, a sex, a race, a genome. Their embrace suffocates those whom it does not ignore. (p. 141; location 2372 in Kindle)

To claim that a university would not be a university if it didn’t maintain a particular collection of intellectual traditions would be begging the question (actually begging all sorts of questions). We simply can’t expect a historical definition to save what we care for. We must be part of the ongoing definition, whether as collaborators or critics, which raises the question of how far to collaborate and when to dig in our heels and yell like hell.

UNIty in diVERSITY talk on “Big Data in the Humanities”

Last week I gave a talk for the UNIty in diVERSITY speaker series on “Big Data in the Humanities.” They have now put that up on Vimeo. The talk looked at the history of reading technologies and then at some of the research we are doing at the U of Alberta around what to do with all that big data.

CRediT: Open Standard for Roles in Research

The CRediT Project now has a Proposed Taxonomy for assigning credit. They have identified a short list of roles:

  • Conceptualization
  • Methodology
  • Software
  • Validation
  • Formal Analysis
  • Investigation
  • Resources
  • Data Curation
  • Writing – Original Draft
  • Writing – Review and Edit
  • Visualization
  • Supervision
  • Project Administration
  • Funding Acquisition

They are looking for feedback.
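As a rough illustration of how such a taxonomy could be used in practice, here is a minimal Python sketch that records contributor roles for a paper and validates them against the proposed list. The data structure and function are my own for illustration; CRediT itself does not prescribe any particular format or API.

```python
# Hypothetical sketch: validating contributor roles against the
# proposed CRediT taxonomy. The role list is from the post above;
# the structures below are illustrative, not part of CRediT.

CREDIT_ROLES = {
    "Conceptualization", "Methodology", "Software", "Validation",
    "Formal Analysis", "Investigation", "Resources", "Data Curation",
    "Writing – Original Draft", "Writing – Review and Edit",
    "Visualization", "Supervision", "Project Administration",
    "Funding Acquisition",
}

def assign_roles(contributors):
    """Check that every assigned role is in the taxonomy.

    contributors: dict mapping a contributor name to a list of
    role strings. Returns the mapping with de-duplicated, sorted
    role lists, or raises ValueError on an unknown role.
    """
    checked = {}
    for name, roles in contributors.items():
        unknown = set(roles) - CREDIT_ROLES
        if unknown:
            raise ValueError(f"Unknown role(s) for {name}: {sorted(unknown)}")
        checked[name] = sorted(set(roles))
    return checked

paper = assign_roles({
    "A. Author": ["Conceptualization", "Writing – Original Draft"],
    "B. Author": ["Software", "Data Curation"],
})
```

Encoding roles explicitly like this is one way a journal or repository could make the taxonomy machine-readable when collecting that feedback.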

Trans-Atlantic Platform

The Trans-Atlantic Platform: Social Sciences and Humanities is a collaboration among social science and humanities funders in different countries. In their About Us page they describe the purpose of this collaborative platform thus:

This Trans-Atlantic Platform will enhance the ability of funders, research organizations and researchers to engage in transnational dialogue and collaboration. It will identify common challenges and promote a culture of digital scholarship in social science and humanities research. It will facilitate the formation of networks within the social sciences and humanities and help connect them with other disciplines. It will also heighten awareness of the crucial role the social sciences and humanities play in addressing 21st century challenges.

The T-AP is co-chaired by the (then) President of SSHRC and the Netherlands social sciences funding agency. It likewise seems to be co-administered by SSHRC and NWO Social Sciences. The T-AP got funding that helped launch it from the European Commission’s 7th Framework Programme.

What is interesting is who is in the T-AP. The German DFG and the American NEH/NSF are down as “associated partners”. Brazilian, Canadian, Finnish, French, Mexican, Dutch, Portuguese, and UK funding organizations are “key partners.” (See the Partners page.)

I also have questions about T-AP:

  • Does this mean we will see more programmes like Digging into Data that can fund teams across countries? Wouldn’t it be great if a project could include the right people rather than just the right people in Canada?
  • Or will we see thematic collaborations like the call on Sustainable Urban Development?
  • Will they try to harmonize research data policies?

A World Digital Library Is Coming True!

Robert Darnton has a great essay in The New York Review of Books titled, A World Digital Library Is Coming True! This essay asks about publication and the public interest. He mentions how expensive some journals are getting and how that means that knowledge paid for by the public (through support for research) becomes inaccessible to the very same public which might benefit from the research.

In the US this trend has been counteracted by initiatives to legislate that publicly funded research be made available through open access venues like PubMed Central. Needless to say, lobbyists are fighting such mandates as the Fair Access to Science and Technology Research Act (FASTR).

Darnton concludes that “In the long run, journals can be sustained only through a transformation of the economic basis of academic publishing.” He argues for “flipping” the costs and charging processing fees to those who want to publish.

By creating open-access journals, a flipped system directly benefits the public. Anyone can consult the research free of charge online, and libraries are liberated from the spiraling costs of subscriptions. Of course, the publication expenses do not evaporate miraculously, but they are greatly reduced, especially for nonprofit journals, which do not need to satisfy shareholders. The processing fees, which can run to a thousand dollars or more, depending on the complexities of the text and the process of peer review, can be covered in various ways. They are often included in research grants to scientists, and they are increasingly financed by the author’s university or a group of universities.

While I agree on the need to focus on the public good, I worry that “flipping” will limit who gets published. In STEM fields, where most research is funded, one can build the cost of processing fees into the funding, but in the humanities, where much research is not funded, many colleagues will have to pay out of pocket to get published. Darnton mentions how Harvard (his institution) has a program that subsidizes processing fees … they would, and therein lies the problem. Those at wealthy institutions will have an advantage in that they can afford to publish in an environment where publishers need processing fees, while those not subsidized (whether private scholars, alternative academics, or instructors) will have to decide if they can really afford to. Creating an economy where it is not the best ideas that get published but those of an elite caste is not a recipe for the public good.

I imagine Darnton recognizes the need for solutions other than processing fees and, in fact, he goes on to talk about the Digital Public Library of America and OpenEdition Books as initiatives that are making monographs available online for free.

I suspect that what will work in the humanities is finding funding for the editorial and publishing functions of journals as a whole rather than for individual articles. We have a number of journals in the digital humanities, like Digital Humanities Quarterly, where the costs of editing and publishing are borne by individuals like Julia Flanders who have made it a labor of love, by their universities that support them, and by our scholarly association that provides technical support and some funding. DHQ doesn’t charge processing fees, which means that all sorts of people who don’t have access to subsidies can be heard. It would be interesting to poll the authors published and see how many have access to processing fee subsidies. It is bad enough that our conferences are expensive to attend; let’s not skew the published record.

Which brings me back to the public good. Darnton ends his essay writing about how the DPLA is networking all sorts of collections together. It is not just providing information as a good, but bringing together smaller collections from public libraries and universities. This is one of the possibilities of the internet – that distributed resources can be networked into greater goods rather than having to be centralized. The DPLA doesn’t need to be THE PUBLIC LIBRARY that replaces all libraries the way Amazon is pushing out book stores. The OpenEdition project goes further and offers infrastructure for publishing knowledge to keep costs down for everyone. A combination of centrally supported infrastructure used by editors who get local support (and credit) will make more of a difference than processing fees, be more equitable, and do more for public participation, which is a good too.