Social Digital Scholarly Editing

On July 11th and 12th I was at a conference in Saskatoon on Social Digital Scholarly Editing. This conference was organized by Peter Robinson and colleagues at the University of Saskatchewan. I kept conference notes here.

I gave a paper on “Social Texts and Social Tools.” My paper argued for text analysis tools as a “reader” of editions. I took the extreme case of big data text mining, asking what scraping and mining tools want, and don’t want, in a text. I took this extreme view to challenge the scholarly editing assumption that the more interpretation you put into an edition the better. Big data wants to automate the process of gathering and mining texts – big data wants “clean” texts free of markup, annotations, metadata and other interventions that can’t be easily removed. The variety of markup in digital humanities projects makes their texts very hard to clean.
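
As an aside, here is a minimal sketch of the sort of “cleaning” a big data pipeline wants, namely stripping every tag out of a TEI/XML edition so that only plain text remains. This is my own illustration, not anything from the paper; the file name is hypothetical.

    # Minimal sketch: reduce a TEI/XML edition to the "clean" plain text
    # that mining tools want. "edition.xml" is a hypothetical file name.
    import re
    import xml.etree.ElementTree as ET

    def clean_text(path):
        tree = ET.parse(path)
        # itertext() discards every tag and attribute, flattening apparatus,
        # annotations and structural markup into raw character data.
        text = " ".join(tree.getroot().itertext())
        # Collapse the whitespace left behind by the removed markup.
        return re.sub(r"\s+", " ", text).strip()

    if __name__ == "__main__":
        print(clean_text("edition.xml")[:500])

Note that even this naive cleaning keeps whatever sits in the teiHeader, since nothing short of schema-specific code can tell metadata from text – exactly the kind of intervention that cannot be easily removed.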

The response was appreciative of the provocation, but (thankfully) not convinced that big data was the audience of scholarly editors.

Around the World Symposium on Digital Culture

Tomorrow we are running an Around the World Symposium on Digital Culture. This symposium brings together scholars from different countries to talk about digital culture for roughly 17 to 20 hours as it moves from place to place, streaming their talks and discussions. The Symposium is being organized by the Kule Institute for Advanced Study here at the University of Alberta. Visit the site to see the speakers and to tune in.

Please join in using the Twitter hashtag #UofAworld

Digital Classics Symposium in Buffalo

I am heading home the day after giving the closing remarks at a conference in Buffalo on Word, Space, Time: Digital Perspectives on the Classical World. This is the first conference of the new Digital Classics Association. It was a gem of a conference where I learned about a succession of neat projects. Here are some notes. My laptop ran out of juice at times so I was not able to take notes on everything.

  • Greg Crane gave the opening keynote announcing his new Humboldt appointment and what he is going to try to do there. He announced that he wanted to: 1) advance the role of Greco-Roman culture and Classical Greek and Latin in human intellectual life as broadly and as deeply as possible in a global world, and 2) blow the dust off the simple, cogent and ancient term philology and support an open philology that can, in turn, support a dialogue among civilizations. He talked about the history and importance of philology and then announced the Open Philology Project. This project has as its goals:
    • Open Greek and Latin texts (the TLG is not open)
    • Comprehensive open data about the classical world
    • Multitext digital editions
    • Annotations
    • Deep linguistic annotation
    • Full workflow through true digital edition

    This is a worthy and ambitious vision and I tried to remind people of it at the end. Classics is the right size and has the right interdisciplinarity to be able to model a comprehensive system.

  • Crane talked about Alpheios, a text editing and learning system that Perseus is connecting to. Monica Berti showed her work on fragmenta in Alpheios and I later learned that this is a philanthropically funded project. Berti’s demo of how she is handling fragmenta is at http://services.perseus.tufts.edu/berti_demo/
  • Marco Büchler gave a tantalizing paper on “Using Google PageRank to detect text reuse”. His was not the only text reuse project – it is a technique that is important to classicists who study how classical authors have been quoted, alluded to, and reused over time. Büchler’s software is TRACER, which will be available once he has some documentation. I think using PageRank to sort hits is a great idea and would love to play with his tools; a toy sketch of the general idea appears after this list. He encouraged interested parties to join a Google group on text reuse.
  • Walter Scheidel showed the Orbis system in a paper on “Redrawing the map of the Roman world.” Orbis is a brilliant tool for measuring time and cost for travel in the Roman world. It is a great example of spatial analysis.
  • Tom Elliot talked about the Pleiades project and how they have around 34,000 places registered and linked. He was initially skeptical about semantic web technologies and RDF, but is now using them in a way that shows what we can do in the humanities with this approach. I am struck by how Pleiades now provides a service to all sorts of other projects. What Classics now needs are similar projects for people, passages (texts), periods (events and time), and other primitives. Classics could set an example of coordinated semantic data.
  • Ryan Horne wrapped up a great session on geospatial work with a presentation on “Mapping antiquity a-la-carte: a GIS interface of the ancient world”. He showed Antiquity À la carte, which allows you to generate all sorts of maps of the Classical world. A great tool for teachers.
  • Kevin D. Fisher gave a fascinating presentation on “Digital approaches to ancient cities: The Kalavasos and Maroni built environments project, Cyprus.” In this project they are using all sorts of cool technology, like 3D laser scanners and ground-penetrating radar, to map their dig in Cyprus. I liked how he was using these techniques to model how the built environments were lived in: what could you see from where, and which rooms in a building were accessible?
  • My favorite project of the conference was Christopher Johanson’s visual argument on RomeLab: Performance on the ephemeral stage. He presented an argument about temporary stages in the Roman forum that was made through a virtual Rome you can travel around in the browser. The argument is a sequence of points that can be opened and that move you around the virtual world to see what each point is about. His paper was an example of a visual argument made through RomeLab and, by extension, about RomeLab. Despite a technical glitch, it was an impressive performance that made its point on so many levels.
  • I attended a neat little workshop on R led by Jeff Rydberg-Cox. His learning materials are at http://daedalus.umkc.edu/StatisticalMethods/index.html and he pointed us to a neat tutorial at http://tryr.codeschool.com/.
  • At the end there was a great panel on Literary Criticism and Digital Methods. Matt Jockers presented his work on macroanalysis of 19th-century literature. He had a neat word cloud visualization of topic modeling results. Patrick J. Burns was very good on “Distant reading alliteration in Latin poetry,” walking us through his method and illustrating it with humour. Neil Bernstein talked about the Tesserae project, which is looking at text reuse and has neat tools online for people to see how author A gets reused in author B.
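
Since several of these papers touched on text reuse, here is a toy sketch of why PageRank-style ranking appeals for it. This is my own illustration, not Büchler’s TRACER or Tesserae: passages become nodes, shared word 5-grams become weighted edges, and PageRank surfaces the passages most entangled in reuse. The passages and the 5-gram threshold are invented for illustration.

    # Toy sketch (not TRACER): rank text-reuse candidates with PageRank.
    # Nodes are passages, edges are weighted by shared word 5-grams.
    import itertools
    import networkx as nx

    def ngrams(text, n=5):
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

    passages = {
        "A": "arms and the man I sing who forced by fate came to Italy",
        "B": "the man I sing who forced by fate left the Trojan shore",
        "C": "of arms and the man I sing an exile driven on by fate",
        "D": "a completely unrelated passage about farming bees and vines",
    }

    G = nx.Graph()
    G.add_nodes_from(passages)
    for (a, ta), (b, tb) in itertools.combinations(passages.items(), 2):
        shared = ngrams(ta) & ngrams(tb)
        if shared:
            # Edge weight = number of 5-grams the two passages share.
            G.add_edge(a, b, weight=len(shared))

    # Passages with many strong reuse links rank highest.
    for node, score in sorted(nx.pagerank(G, weight="weight").items(),
                              key=lambda kv: -kv[1]):
        print(node, round(score, 3))

Real systems work at the scale of whole corpora and use far more sophisticated matching, but the ranking idea is the same.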

I gave the closing remarks, in which I tried to draw attention to the history of the vision of a perfect reading (or philology) machine. I then took advantage of being the last speaker to offer suggestions as to how digital classics might move research forward:

  • The Digital Classics Association should take seriously Greg Crane’s invitation to influence his Open Philology Project. Classics is, for various reasons, in a unique position to imagine comprehensive research and learning environments.
  • They should think about primitives and how they support them. What Pleiades has done for places, others should think of doing for people, periods (events and time), buildings and things, and so on. The idea would be to have a network of projects managing semantic data about the things that matter to Classicists.
  • I encouraged people to think about how to include the larger public into research using crowdsourcing and gaming.
  • I encouraged them to think about how digital research is shared and assessed. They should look at the work from the MLA on assessment, which the DCA could adapt for Classics.
  • Finally, I talked a bit about infrastructure and the dangers of developing infrastructure prematurely. I called for infrastructure experiments.

I think the DCA will be putting up a video of my closing remarks.

International Conference on Japan Game Studies

I am part of a team putting together an International Conference on Japan Game Studies. We are organizing this with the Prince Takamado Japan Centre at the University of Alberta and the Ritsumeikan Center for Game Studies in Kyoto. The deadline for proposals is in two weeks. We are looking for papers on:

  • Cross-cultural study of games and toys
  • Localization of games
  • Assessment of educational aspects of games
  • Preservation of games and game culture
  • Understanding player culture
  • Game industry (in Japan and transnationally)
  • Games and transmedia phenomena
  • Games of chance

MLA 2013 Conference Notes

I’ve just posted my MLA 2013 convention notes on philosophi.ca (my wiki). I participated in a workshop on getting started with DH organized by DHCommons, gave a paper on “thinking through theoretical things”, and participated in a panel on “Open Sesame” (interoperability for literary study).

The sessions seemed full, even the theory one which started at 7pm! (MLA folk are serious about theorizing.)

At the convention the MLA announced and promoted a new digital MLA Commons. I’ve been poking around and trying to figure out what it will become. They say it is “a developing network linking members of the Modern Language Association.” I’m not sure I need one more venue to link to people, but it could prove an important forum if promoted.

Digital Humanities Talks at the 2013 MLA Convention

The ACH has put together a useful Guide to Digital-Humanities Talks at the 2013 MLA Convention. I will be presenting at various events including:

Conference Report of DH 2012

I’m at Digital Humanities 2012 in Hamburg. I’m writing a conference report on philosophi.ca. The conference started with a keynote by Claudine Moulin that touched on research infrastructure. Moulin was the lead author of the European Science Foundation report on Research Infrastructure in the Humanities (link to my entry on this). She talked about the need for a cultural history of research infrastructure (which the report actually provides). The humanities should not just import ideas and stories about infrastructure. We should use this infrastructure turn to help us understand the types of infrastructure we already have; we should think about the place of infrastructure in the humanities as humanists.