On Friday I delivered the opening keynote at the ACFAS 2017 colloquium « La publication savante en contexte numérique » organized by CRIHN. The keynote, “Hermeneutica: Le dialogue du texte et le jeu de l’interprétation,” presented work Stéfan Sinclair and I have been doing on how to integrate text and tools. The context of the talk was a previous colloquium organized by CRIHN:
After a first ACFAS colloquium of the Centre de Recherche Interuniversitaire sur les Humanités Numériques in 2014 (on the need to analyze the impact of the digital on the humanities), the objective of our 2017 colloquium is to rethink scholarly publishing in the digital age from both a theoretical and a practical point of view.
In the talk I demonstrated a new tool based on Eliza that we call Veliza. Veliza implements Weizenbaum’s Eliza algorithm but adds the ability to pull a random sentence from the text you are analyzing and send that to the machine. The beta version I was using (not yet the standard one) had two other features.
First, it allows you to ask for things like “the occurrences of father,” and it responds with a Voyant panel in the dialogue.
Second, it allows you to edit the script that controls Veliza so you can ask it to respond to different keywords.
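To give a sense of how an editable Eliza-style script can drive this kind of dialogue, here is a minimal sketch in Python. The script format, function names, and responses below are hypothetical illustrations of the general technique, not Veliza’s actual code.

```python
import random
import re

# A hypothetical keyword -> response-templates script, editable like Veliza's.
SCRIPT = {
    "father": [
        "Tell me more about your father.",
        "How do you feel about your father?",
    ],
    "mother": ["What is your relationship with your mother like?"],
}

# Fallback responses when no keyword matches.
DEFAULT = ["Please go on.", "Interesting. Tell me more."]


def respond(sentence, script=SCRIPT):
    """Return a canned response triggered by the first matching keyword."""
    lowered = sentence.lower()
    for keyword, templates in script.items():
        if keyword in lowered:
            return random.choice(templates)
    return random.choice(DEFAULT)


def random_sentence(text):
    """Pull a random sentence from the text being analyzed,
    mimicking the feature of sending a sentence from the text."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    return random.choice(sentences)
```

Because the script is just data, a user can edit it to make the machine respond to different keywords, which is the second beta feature described above.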
This talk was actually the first time we had shown either Veliza or Spiral. Both are still in beta, but will be coming soon to the standard Voyant distribution.
Thanks to Humanist I came across this project that offers bwFLA: Emulation as a Service. This will become increasingly important in the digital humanities and game studies as more and more content-rich projects become unreadable on contemporary machines. Just think of the CD-ROM. How many of us still have a CD drive on our computer? I think I have a USB drive somewhere … not sure where it is though. Emulation projects like this and MAME are becoming more and more important to preservation and history.
Researchers in the humanities and social sciences are using digital infrastructure to help advance their research as well, and a Canadian-made tool called Voyant is allowing those who work with texts to do it with ease.
The story points out that Voyant may have more unique users than any other tool on Compute Canada, which is gratifying to read. This doesn’t mean more research is supported by Voyant, or more important research; comparisons are not really useful. What is more important is that the way humanists use infrastructure is different and being recognized. Humanists typically aren’t doing “big science.” They don’t need thousands of processors and batch interfaces. They want a more interactive and “always on” type of service. Compute Canada has listened and has been supporting our style/pace of infrastructure. Bravo!
Every year the University of Alberta Libraries organizes a Research Data Management Week to bring faculty, staff, students, and community data specialists together around data management. I was part of a panel session today on the subject. One of the issues we discussed was how to deal with a likely requirement from funding agencies like SSHRC that Research Data Management Plans be submitted with grants. Some thoughts on this:
Researchers will initially need help understanding what a DMP (Data Management Plan) is. The Portage Network DMP Assistant can help, but many will need an introduction to the issues.
Research universities and libraries will need to develop strategies for supporting projects to meet their new obligations. We will need the infrastructure to match.
There will be pushback from some scholarly associations. Others, like CSDH-SCHN, will welcome this, as we have policies that support the idea.
There is a cost to properly curating, documenting, and depositing research data. This cost typically comes at the end of projects, when the funds are already spent. We will need to do a better job of budgeting for data management and deposit.
We need to develop small grants and services for projects to help them go the last mile in curating and depositing their content. At the Kule Institute we developed CRAfT grants in partnership with the UofA Libraries. These grants are meant for prototyping digital archives. Now we need to think about a program to help with the final archiving.
Reading Cartographies of Time by Rosenberg and Grafton, I was struck by one early visual presentation of time by Peter of Poitiers. It combines the features of a family tree or genealogy with those of a timeline. It is spread over pages in a manuscript, with text in between vertically flowing lines and little portraits of the people. What can we learn from the imaginative designs of past designers of time charts?
This English manuscript was created in the early thirteenth century soon after the death of its author, Peter of Poitiers, theologian and Chancellor of the University of Paris from 1193 to 1205. It is an early copy of his text, the Compendium historiae in genealogia Christi. Intended as a teaching aid, the work provides a visual genealogy of Christ comprised of portraits in roundels, accompanied by a text discussing the historical background of Christ’s lineage.
On May 4th we will be running our annual online Around the World conference. This year the topic is Digital Media in the Post-Truth Era. Anyone can tune in to hear panels talking on this subject from around the world.
The New York Times has a nice article about how Robert Taylor, Innovator Who Shaped Modern Computing, Dies at 85. As director of the Information Processing Techniques Office, part of the Advanced Research Projects Agency, Taylor commissioned the development of what became the ARPANET and then the Internet. He later led the group at Xerox PARC that developed the Alto computer, an early imagining of what personal computing could be. He also supported J.C.R. Licklider and wrote a paper with him, The Computer as a Communication Device. That paper starts with,
In a few years, men will be able to communicate more effectively through a machine than face to face.
Domenico Fiormonte has recently blogged about an interesting document he has by Father Busa that relates to a difficult moment in the history of the digital humanities in Italy in 2002. The two page “Conditional Agreement”, which I translate below, was given to Domenico and explained the terms under which Busa would agree to sign a letter to the Minister (of Education and Research) Moratti in response to Moratti’s public statement about the uselessness of humanities informatics. A letter was being prepared to be signed by a large number of Italian (and foreign) academics explaining the value of what we now call the digital humanities. Busa had the connections to get the letter published and taken seriously for which reason Domenico visited him to get his help, which ended up being conditional on certain things being made clear, as laid out in the document. Domenico kept the two pages Busa wrote and recently blogged about them. As he points out in his blog, these two pages are a mini-manifesto of Father Busa’s later views of the place and importance of what he called textual informatics. Domenico also points out how political is the context of these notes and the letter eventually signed and published. Defining the digital humanities is often about positioning the field in the larger academic and public political spheres we operate in.
Virtual reality, after bombing in the 1990s, is back again. We have a Time cover, affordable headsets, and some games.
Jérémie pointed me to a couple of interesting links on VR. One is a short story by Stanley G. Weinbaum titled Pygmalion’s Spectacles from 1935 that tells the story of spectacles that can immerse you in another world. The BBC has created a virtual reality experience of being a Syrian refugee called We Wait. Vice has a short documentary Stepping Into the Screen that emphasizes the potential psychological and ethical impact of VR. To my mind the attention to impact is a way of hyping VR. Is it really that different or are we just hoping it will be?
In the 1990s many of us got sick trying VR headsets, which has me wondering whether anything is different this time.