Announcing the first issue of the Journal of the Japanese Association for Digital Humanities. I am on the editorial board of the journal, but the real credit goes to Charles Muller, Christian Wittern and Kiyonori Nagasaki, who are the working editors. This journal represents the maturing of the Japanese digital humanities scene. The Japanese Association for Digital Humanities (JADH) was founded in 2011 and became a constituent organization of ADHO in 2013. Now they have a journal. As Charles Muller, Editor-in-Chief, puts it in his “Dear Readers”,
While Digital Humanities has been practiced in Japan for more than two decades, up to now, little is known outside of Japan regarding the content of Japanese advancements in this field. We therefore aim to rectify this situation by initiating a first-tier peer reviewed international journal published in English. Although we hope to be able to shed light on projects in development in Japan, we will be accepting article submissions from DH practitioners around the world on a broad range of topics.
This Institute focused not only on technology in learning but also on important issues around the ethics of different learning models that involve technology. Ways of using technology to get active participation, rather than just broadcasting video, came up. Ways of thinking about students in collaborative projects also came up – we need to get beyond the apprentice model and think of them as “citizen scholars.”
Panopticonopolis (try saying it) by Misha Lepetic has mostly entries on cities, some of which appear in 3 Quarks Daily. Another article, The Forgotten Archipelago, asks what happened to the Soviet ZATO cities – the special-purpose, closed and hidden cities set up for secret research. What happened when the Soviet Union collapsed and the federal government could no longer fund these single-purpose cities?
I was led to Panopticonopolis from an article on Blob Justice, Part 1, which looks at the herd shaming taking place on the Internet, starting with Cecil the lion. I can’t help wondering if this sort of Internet stampede is related to Gamergate and Anonymous.
[h]e found a bundle of 10 Medicare numbers selling for 22 bitcoin, or $4,700 at the time. General medical records sell for several times the amount that a stolen credit card number or a social security number alone does. The detailed level of information in medical records is valuable because it can stand up to even heightened security challenges used to verify identity; in some cases, the information is used to file false claims with insurers or even order drugs or medical equipment. Many of the biggest data breaches of late, from Anthem to the federal Office of Personnel Management, have seized health care records as the prize.
These data show that the real problem is not that English is the dominant language of academic publications (and of DH), but that both Anglophone and a high percentage of non-Anglophone colleagues barely use/quote non-Anglophone sources in their research.
I can’t help thinking that the internet has allowed the big to get even bigger. The dominance of English in academic circles is exacerbated by the instant availability of English research. National languages don’t even have location as an advantage on the internet.
What can we do about it? Miran had a nice reply on Humanist (to the original posting by Greg Crane, which was also on Humanist). Domenico suggests that we all have to take some responsibility, especially those of us who have the “free ride” of being native English writers.
It is the responsibility of dominant languages and cultures to translate from marginal or less influential languages.
I participated in a public panel on Building Communities and Networks in the Humanities where I talked about some of the forms of public engagement that we are trying at the Kule Institute including the Around the World Conference.
I helped Stéfan Sinclair with a workshop on Voyant 2.0 (the link goes to the current version, which will soon be 2.0).
I gave a paper with Stéfan Sinclair on “Talking about Programming the Digital Humanities” that traced a history of the discussion about programming and tools in the digital humanities.
Finally, John Montague gave a paper on “Exploring Large Datasets with Topic Model Visualizations” that I was involved in. This paper discussed a visualization for exploring the results of topic modelling, which you can try in prototype here.
It is hard to summarize a whole conference, but I would note that some of the questions the new scholars posed in the unconference are worth thinking about:
How does one learn about the field of digital humanities?
How does one learn skills in the digital humanities?
How does one teach the digital humanities?
What are the ethical issues in digital work in the humanities?
We have recently deposited two research archives here at the University of Alberta. One is the John B. Smith Archive; you can download bundles or the complete archive at http://hdl.handle.net/10402/era.41201. Amy Dyrbye and I worked with John B. Smith to assemble it, document it, and deposit it in ERA (the Education and Research Archive).