Data Management Plan Recommendation

September 25th, 2015

Today I deposited a Data Management Plan Recommendation for Social Science and Humanities Funding Agencies in our institutional repository, ERA. This report/recommendation was written by Sonja Sapach with help from Catherine Middleton and me. We recommended that:

Agencies that fund social science and humanities (SSH) research should move towards requiring a Data Management Plan (DMP) as part of their application processes in cases where research data will be gathered, generated, or curated. In developing policies, funding agencies should consult the community on the values of stewardship and research that would be strengthened by requiring DMPs. Funding agencies should also gather examples and data about reuse of archived data in the social sciences and humanities and encourage due diligence among researchers to make themselves aware of reusable data.

On the surface the recommendation seems rather bland. SSHRC has required the deposit of data from the research it funds for decades. The problem, however, is that few of us pay attention: it is one more thing to do, and it means sharing hard-won data that you may want to keep milking for your own research. What we lack is a culture of treating the deposit of research data as a scholarly contribution, the way the translation or edition of an important cultural text is. We need a culture of stewardship, as a TC3+ (tri-council) document put it. See Capitalizing on Big Data: Toward a Policy Framework for Advancing Digital Scholarship in Canada (PDF).

Given the potential resistance of colleagues, it is important that we understand the arguments for requiring planning around data management, and that is one of the things we do in this report. Another issue is how to effectively require, at the proposal stage, something like a Data Management Plan that shows how researchers are thinking through the issue. To that end we document the approaches of other funding bodies. The point is that this is not actually that new, and some research communities are well ahead of us.

At the end of the day, what we really need is a recognition that depositing data so that it can be used by other researchers is a form of scholarship. Such scholarship can be assessed like any other: What data is deposited and what is its quality? How is it deposited? How is it documented? Can it have an impact?

You can find this document also at Catherine Middleton’s web site and Sonja Sapach’s web site.


What Ever Happened to Project Bamboo?

September 25th, 2015

What Ever Happened to Project Bamboo? by Quinn Dombrowski is one of the few honest discussions about the end of a project. I’ve been meaning to blog this essay, which follows on her important conference paper at DH 2013 in Nebraska (see my conference report here, which comments on her paper). The issue of how projects fail or end rarely gets dealt with, and Dombrowski deserves credit for having the courage to document the end of a project that promised so much.

I blog about this now as I just finished a day-long meeting of the Leadership Council for Digital Infrastructure where we discussed a submission to Industry Canada that calls for coordinated digital research infrastructure. While the situation is different, we need to learn from projects like Bamboo when we imagine massive investment in research infrastructure. We all know it is important, but doing it right is not as easy as it sounds.

Which brings me back to failure. There are three types of failure:

  • The simple type we are happy to talk about where you ran an experiment based on a hypothesis and didn’t get positive results. This type is based on a simplistic model of the scientific process which pretends to value negative results as much as positive ones. We all know the reality is not that simple and, for that matter, that the science model doesn’t really apply to the humanities.
  • The messy type where you don’t know why you failed or what exactly failed. This is the type where you promised something in a research or infrastructure proposal and didn’t deliver. This type is harder to report because it reflects badly on you. It is an admission that you were confused or oversold your project.
  • The third and bitter type is the project that succeeds on its own terms, but is surpassed by the discipline. It is when you find your research isn’t current any longer and no one is interested in your results. It is when you find yourself ideologically stranded, doing something that someone important has declared critically flawed. It is a failure of assumptions, or theory, or positioning, and no one wants to hear about this failure; they just want to avoid it.

When people like Willard McCarty and John Unsworth call for a discussion of failure in the digital humanities they describe the first type, but often mean the second. The idea is to describe a form of failure reporting similar to negative results – or to encourage people to describe their failure as simply negative results. What we need, however, is honest description of the second and third types of failure, because those are expensive. To pretend that some expensive project which slowly disappeared in misunderstanding was simply an experiment misses what was at stake. This is doubly true of infrastructure, because infrastructure is not supposed to be experimental. No one pays for roads and their maintenance as an experiment to see if people will take the road. You should be sure the road is needed before building it.

Instead, I think we need to value research into infrastructure as something independent of the project owners. We need to do in Canada what the NSF did – bring together research on the history and theory of infrastructure.

Computers in classroom have ‘mixed’ impact on learning: OECD report

September 16th, 2015

The Globe and Mail and other sources are reporting that Computers in classroom have ‘mixed’ impact on learning. This is based on an OECD report titled Students, Computers and Learning: Making the Connection. The overall conclusion is that teaching is about the individual student and can’t be automated. Computers aren’t necessarily good for learning – they should be used for specific projects and used to teach real-world digital skills.

Students who use computers moderately at school tend to have somewhat better learning outcomes than students who use computers rarely. But students who use computers very frequently at school do a lot worse in most learning outcomes, even after accounting for social background and student demographics. (p. 3 of Report)

The Globe quotes Prof. Slotta of OISE to the effect that:

Technology is most effective in the classroom when it is used to develop skills similar to those that adults are using in everyday life, such as finding resources, critiquing arguments, communicating with peers, solving problems and working with data…

Skimming the report and the slide deck shows a complex picture: countries like Japan often have fewer computers in classrooms yet do better on measures of learning. Massive investment in computers, like that of school boards that buy laptops for every child, doesn’t seem to lead to improvements in learning.

Put simply, ensuring that every child attains a baseline level of proficiency in reading and mathematics seems to do more to create equal opportunities in a digital world than can be achieved by expanding or subsidising access to high-tech devices and services. (p. 3 of Report)

The report also looked at loneliness and confirmed what parents have suspected,

Last but not least, most parents and teachers will not be surprised by the finding that students who spend more than six hours on line per weekday outside of school are particularly at risk of reporting that they feel lonely at school, and that they arrived late for school or skipped days of school in the two weeks prior to the PISA test.

The slide show prepared by Andreas Schleicher of the OECD suggests that there are larger questions about what sorts of skills we should be teaching in the coming age of automation. The second slide says “The kind of things that are easy to teach are now easy to automate, digitize or outsource.” A slide titled The Race between Technology and Education (a title drawn from work by Goldin and Katz) suggests that there is social pain when technology isn’t matched with education. The conclusion is that we need education for a world where many jobs can be automated. Just as the industrial revolution caused social pain in the form of dislocation and unemployment, so too could AI.

Journal of the Japanese Association for Digital Humanities

September 2nd, 2015

Announcing the first issue of the Journal of the Japanese Association for Digital Humanities. I am on the Editorial Board of the Journal, but the real credit goes to Charles Muller, Christian Wittern and Kiyonori Nagasaki, who are the working editors. This journal represents the maturing of the Japanese digital humanities scene. They have a Japanese Association (JADH), which was founded in 2011 and became a constituent organization of ADHO in 2013. Now they have a journal. As Charles Muller, Editor-in-Chief, puts it in his “Dear Readers”,

While Digital Humanities has been practiced in Japan for more than two decades, up to now, little is known outside of Japan regarding the content of Japan advancements in this field. We therefore aim to rectify this situation by initiating a first-tier peer reviewed international journal published in English. Although we hope to be able to shed light on projects in developments in Japan, we will be accepting article submissions from DH practitioners around the world on a broad range of topics.

Digital Pedagogy Institute

August 24th, 2015


Robert Jay Glickman and Geoffrey Rockwell

Last week I participated in the Digital Pedagogy Institute that was organized by the University of Toronto Scarborough, Brock University and Ryerson University. I kept my Conference Report here.

This Institute focused not only on technology in learning but also on important issues around the ethics of different learning models that involve technology. Ways of using technology to encourage active participation, rather than just broadcasting video, came up. Ways of thinking about students in collaborative projects also came up – we need to get beyond the apprentice model and think of them as “citizen scholars.”


Metropolis II by Chris Burden (the movie) – YouTube

August 18th, 2015

From the panopticonopolis tumblr I’ve discovered Metropolis II by Chris Burden. What an interesting take on the city.

Panopticonopolis (try saying it) by Misha Lepetic consists mostly of entries on cities, some of which appear in 3 Quarks Daily. Another article, on The Forgotten Archipelago, asks what happened to the Soviet ZATO cities – the special-purpose, closed and hidden cities set up for secret research. What happened when the Soviet Union collapsed and the federal government could no longer fund these single-purpose cities?

I was led to panopticonopolis by an article on Blob Justice, Part 1, which looks at the herd shaming taking place on the Internet, starting with Cecil the lion. I can’t help wondering if this sort of Internet stampede is related to gamergate and Anonymous.

Photos of Australia

August 11th, 2015


I’ve put up a number of photographs of Australia on my Flickr account (Geoffrey Rockwell’s Flickr account). Here are the albums if you want to look:

Spanish Cops Use New Law To Fine Facebook Commenter For Calling Them ‘Slackers’

August 10th, 2015

Heather tweeted me a link to a story from Techdirt on how Spanish Cops Use New Law To Fine Facebook Commenter For Calling Them ‘Slackers’. The police in Spain can now fine people for disrespecting them. This outrageous law was also reported on by The Telegraph in the story First victim of Spain’s 'gag law' fined for criticising 'lazy' police. Despite Snowden’s revelations, governments seem to be passing more and more laws to restrict speech and travel, often in the name of fighting terrorism. As Techdirt reports, the law is being defended with Orwellian arguments,

Defending the new law, the PP government has said that “demonstrations will become freer because they will be protected from violent elements”. (Quote from Telegraph article)

Medical Privacy Under Threat in the Age of Big Data

August 10th, 2015

The Intercept has a good introductory story about Medical Privacy Under Threat in the Age of Big Data. I was surprised by how valuable medical information is. Here is a quote:

[h]e found a bundle of 10 Medicare numbers selling for 22 bitcoin, or $4,700 at the time. General medical records sell for several times the amount that a stolen credit card number or a social security number alone does. The detailed level of information in medical records is valuable because it can stand up to even heightened security challenges used to verify identity; in some cases, the information is used to file false claims with insurers or even order drugs or medical equipment. Many of the biggest data breaches of late, from Anthem to the federal Office of Personnel Management, have seized health care records as the prize.

The story mentions Latanya Sweeney, who is the Director of the Data Privacy Lab at Harvard. She did important research on Discrimination in Online Ad Delivery and has a number of important papers on health records, like the recent Matching Known Patients to Health Records in Washington State Data, which showed how one could de-anonymize Washington State health data, which is for sale, by searching news databases. We are far more unique than we think we are.
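
To make the linkage idea concrete, here is a minimal, hypothetical sketch of that kind of re-identification: quasi-identifiers gleaned from public news stories are joined against “de-identified” discharge records. The column names and toy records are my own assumptions for illustration, not the actual Washington State fields or Sweeney’s code.

    # Hypothetical sketch of a linkage (re-identification) attack:
    # quasi-identifiers from news stories joined against "de-identified"
    # hospital discharge data. All columns and records are invented.
    import pandas as pd

    # "De-identified" discharge records: names removed, quasi-identifiers kept.
    discharges = pd.DataFrame([
        {"zip": "98101", "age": 34, "admit_month": "2011-06", "diagnosis": "fractured femur"},
        {"zip": "98052", "age": 61, "admit_month": "2011-07", "diagnosis": "cardiac arrest"},
    ])

    # Details gleaned from news reports, which do name the person involved.
    news = pd.DataFrame([
        {"name": "J. Doe", "zip": "98101", "age": 34, "admit_month": "2011-06"},
    ])

    # Joining on the shared quasi-identifiers re-attaches a name to a diagnosis.
    reidentified = news.merge(discharges, on=["zip", "age", "admit_month"])
    print(reidentified[["name", "diagnosis"]])

The point of the sketch is simply that a handful of ordinary attributes, each harmless on its own, is often enough to pick out one person in a “de-identified” dataset.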

I should add that I came across an interesting blog post by Dr. Sweeney on Tech@FTC arguing for an interdisciplinary field of Technology Science. (Sweeney was the Chief Technologist at the FTC.)

Towards monocultural (digital) Humanities?

July 19th, 2015

Domenico Fiormonte has written a nice essay on how the humanities (and digital humanities) run the risk of becoming monolingual, Towards monocultural (digital) Humanities?. The essay is a response to Greg Crane’s The Big Humanities, National Identity and the Digital Humanities in Germany, and Greg in turn responds to Domenico here. The numbers are depressing (see the graphs in Domenico’s essay). As he puts it (drawing on research with a colleague into DH journals):

These data show that the real problem is not that English is the dominant language of academic publications (and of DH), but that both Anglophone and a high percentage of non-Anglophone colleagues barely use/quote non-Anglophone sources in their research.
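
As a toy illustration of the kind of measurement behind that claim (my own sketch, not Fiormonte’s actual method or data), one could tabulate the language of each reference cited across a sample of DH articles and compute the non-Anglophone share:

    # Toy sketch: share of non-Anglophone references in a sample of articles.
    # The language codes below are invented placeholder data.
    from collections import Counter

    cited_languages = ["en", "en", "en", "de", "en", "it", "en", "en", "en", "en"]

    counts = Counter(cited_languages)
    total = sum(counts.values())
    non_english = total - counts["en"]
    print(f"Non-Anglophone references: {non_english}/{total} "
          f"({100 * non_english / total:.1f}%)")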

I can’t help thinking that the internet has allowed the big to get even bigger. The dominance of English in academic circles is exacerbated by the instant availability of English research. National languages don’t even have location as an advantage on the internet.

What can we do about it? Miran had a nice reply on Humanist (to the original posting by Greg Crane, which was also on Humanist). Domenico suggests that we all have to take some responsibility, especially those of us who have the “free ride” of being native English writers.

It is the responsibility of dominant languages and cultures to translate from marginal or less influential languages.