Matthew Wilkens has posted a nice blog essay about his short MLA paper on geography and memory, Literary Attention Lag. He looked at how some cities get far more literary attention than their population merits despite a general correlation between population and attention. For example, in 1860 Chicago and New Orleans had about the same population, but New Orleans gets a lot more attention.
What is particularly useful is that he provides an IPython notebook with a documented version of his code here. He also provides a link to his data so you can replicate and extend his study.
Stéfan Sinclair and I are experimenting with Mathematica and IPython notebooks as a way to share research thinking with code woven in.
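The kind of comparison Wilkens makes can be sketched as a simple log-log regression: fit literary mentions against population, then look at the residuals to see which cities get more (or less) attention than population alone predicts. The city figures below are invented purely for illustration; his notebook contains the actual code and data.

```python
# Illustrative sketch only (not Wilkens' code): regress log(mentions) on
# log(population) and compare each city's actual mentions with the prediction.
import math

# Invented numbers for illustration: (population, literary_mentions)
cities = {
    "Chicago":     (112_000, 50),
    "New Orleans": (117_000, 240),
    "Boston":      (178_000, 300),
    "St. Louis":   (161_000, 90),
}

# Ordinary least squares fit of log(mentions) = a + b * log(population).
xs = [math.log(p) for p, _ in cities.values()]
ys = [math.log(m) for _, m in cities.values()]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
a = my - b * mx

# Positive residuals = more attention than population predicts.
for name, (pop, mentions) in cities.items():
    predicted = math.exp(a + b * math.log(pop))
    print(f"{name}: {mentions} mentions vs {predicted:.0f} predicted")
```

On invented data like this, two cities of similar size can sit on opposite sides of the fitted line, which is exactly the Chicago/New Orleans contrast he describes.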
Yesterday I gave a talk over the internet on “What’s New is Old Again: Studying Interface with Perseus.” The talk was recorded and shared on the eHumanities Seminar YouTube channel.
The abstract I submitted for the talk was:
[P]aradoxically, the primary effect of visual forms of knowledge production in any medium – the codex book, digital interface, information visualizations, virtual renderings, or screen displays – is to mask the very fact of their visuality … (Johanna Drucker, Graphesis, p. 10)
Interfaces don’t get much scholarly attention because they are seen as an ephemeral presentation layer masking the real information the way the design of a book holds the content. This paper will discuss a series of projects that take interface seriously and historically. These projects were undertaken by the Interface Design team of the INKE project to find ways of studying the evolution of an interface. These projects used the Perseus project as a test case as it is one of the oldest continuous projects in the digital humanities. The presentation will argue that:
- There is a history to digital interfaces that is rich and interesting enough to study.
- We need to theorize about how to do the history of interface. Heroic design stories are not enough.
- We need to act now to preserve traces of interfaces for study and that there are better and worse ways of preparing for preservation.
The presentation will conclude by showing the architecture developed for an archive of Perseus interfaces designed for future study.
One of the outcomes of the Charlie Hebdo attack is that politicians are using the terrorist attacks to call for more intrusive surveillance legislation. For example, the BBC reports that UK Prime Minister David Cameron says new online data laws needed. Gibbs and Hern for the Guardian interpret Cameron as calling for “anti-terror laws to give the security services the ability to read encrypted communications in extreme circumstances.” (David Cameron in ‘cloud cuckoo land’ over encrypted messaging apps ban, Jan. 13, 2015) This would mean that either back doors are built into communications technologies with encryption or the technologies are banned in the UK.
Needless to say, all sorts of people are responding to these calls for new legislation by pointing out the dangers of deliberately crippling encryption. If there are back doors, they can be found and used by criminals, which in turn means that companies that need or offer strong encryption will move out of the UK. For that matter, what would this mean for the use of global systems that include encryption? (See James Ball’s article in the Guardian, Cameron wants to ban encryption – he can say goodbye to digital Britain, Jan. 13, 2015.)
What few people are commenting on is the effectiveness of SIGINT (signals intelligence) in cases like the attacks in Paris. Articles in The Globe and Mail and the Guardian suggest that a combination of human intelligence and early interventions would be more likely to make a difference. The alleged culprits were known to all sorts of people (neighbours, people at their mosque, police). The problem was how difficult it is to know what to do with that information and when to intervene. This is a human problem not a signals intelligence problem. SIGINT could just add to the noise without guiding authorities as to how to deal with people.
To be honest, I don’t know what would work. Perhaps predictive analytics, for all its problems, could be part of identifying at-risk youth early so that they are not thrown together in prison (as the Paris attackers were) and so interventions could be organized. Nonetheless, we clearly need more studies of the circumstances of those who are radicalized, and we need to seriously try to intervene in positive ways. The alternative is arresting people for intent, which is very hard to prove and raises all sorts of problems as an approach.
We also need research and discussion about the balance of approaches, something that is impossible as long as surveillance programs are closed to any oversight and accountability. Who would know if funding were better spent on human approaches? Who would dare cut the budget for nice, clean, modern digital intelligence in favour of a messy mix of human approaches? And how do we compare approaches that are hard to measure, given the thankfully small number of incidents?
- Angelique Chrisafis has one of the best articles on the backgrounds of the attackers in the Guardian titled Charlie Hebdo attackers: born, raised and radicalised in Paris, (Jan. 12, 2015)
- The Centre de prévention contre les dérives sectaires liées à l’islam (in French) is a research centre that has been studying radicalization in France. The Director, Dounia Bouzar has been quoted in various stories that look more deeply into the issue. See, for example, Remi Peit’s French jihadists in Syria and Cyber-indoctrination (Aljazeera, Apr. 24, 2014).
- Mark MacKinnon has an article in The Globe and Mail about how a Neighbour says suspects in Paris shooting had ‘cache of arms’ (Jan. 8, 2015).
- The International Centre for the Study of Radicalisation and Political Violence issued a study in 2012 on Countering Radicalization in Europe that details the approaches taken in different European countries. Check out Denmark for a comprehensive approach.
And … we need to be able to talk openly about the issues without fear – Je suis Charlie
The Toronto Star has a nice story, The computer program billed as unbeatable at poker, about Cepheus, a poker-playing program developed by the Computer Poker Research Group here at the University of Alberta. Michael Bowling is quoted to the effect that,
No matter what you do, no matter how strong a player you are, even if you look at our strategy in every detail . . . there is no way you are going to be able to have any realistic edge on us.
On average we are playing perfectly. And that’s kind of the average that really matters.
You can play Cepheus at their web site. You can read their paper “Heads-up limit hold’em poker is solved”, just published in Science here (DOI: 10.1126/science.1259433).
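The solution rests on counterfactual regret minimization, whose core idea, regret matching, can be shown with a toy rock-paper-scissors example. Everything below (the fixed opponent mixture, the iteration count) is illustrative only and bears no resemblance to the scale of the actual solver.

```python
# Toy regret-matching sketch: the strategy shifts probability toward actions
# with positive cumulative regret, converging on a best response to a fixed,
# exploitable opponent. This illustrates the idea behind CFR, not Cepheus itself.
import random

random.seed(0)
ACTIONS = 3  # 0 = rock, 1 = paper, 2 = scissors
regret_sum = [0.0] * ACTIONS
strategy_sum = [0.0] * ACTIONS

def get_strategy():
    """Current strategy: positive regrets normalized, else uniform."""
    positive = [max(r, 0.0) for r in regret_sum]
    total = sum(positive)
    return [p / total for p in positive] if total > 0 else [1.0 / ACTIONS] * ACTIONS

def utility(a, b):
    """+1 if action a beats b, 0 on a tie, -1 on a loss."""
    return 0 if a == b else (1 if (a - b) % 3 == 1 else -1)

opponent = [0.4, 0.3, 0.3]  # invented rock-heavy opponent mixture
for _ in range(10_000):
    strategy = get_strategy()
    for i in range(ACTIONS):
        strategy_sum[i] += strategy[i]
    my_action = random.choices(range(ACTIONS), strategy)[0]
    opp_action = random.choices(range(ACTIONS), opponent)[0]
    # Regret: how much better each alternative action would have done.
    for a in range(ACTIONS):
        regret_sum[a] += utility(a, opp_action) - utility(my_action, opp_action)

total = sum(strategy_sum)
avg = [s / total for s in strategy_sum]
print("average strategy (rock, paper, scissors):", [round(p, 2) for p in avg])
```

Against a rock-heavy opponent the average strategy drifts toward paper, the best response. Solving heads-up limit hold’em required running this family of updates over an astronomically larger game tree.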
Thanks to a note from Willard on Humanist I came across this essay in the London Review of Books, Andrew O’Hagan · The Lives of Ronald Pinn (LRB 8 January 2015). The author decided to develop a false identity and “legend” by using the name of a dead person (Ronald Pinn) who was born around the time he was. This was in response to stories about how UK police had been going undercover since 1968 to infiltrate political groups. The police had been bringing identities back to life so O’Hagan decided to try it. In the process he explored a lot of the dark web including ordering drugs from the Silk Road, ordering guns, getting false IDs and so on.
The essay, or biography, is well written and poignant. Just before he lays the legendary Pinn to rest, O’Hagan meets the original Ronald Pinn’s mother.
‘Oh, Ronnie,’ she said. ‘There was nobody like him.’
The Guardian has an essay by Terry Eagleton on The death of universities. The article asks (and answers),
Are the humanities about to disappear from our universities? The question is absurd. It would be like asking whether alcohol is about to disappear from pubs, or egoism from Hollywood. Just as there cannot be a pub without alcohol, so there cannot be a university without the humanities. If history, philosophy and so on vanish from academic life, what they leave in their wake may be a technical training facility or corporate research institute. But it will not be a university in the classical sense of the term, and it would be deceptive to call it one.
I wish I were as sure of this logical argument, but I fear that people are quite willing to call something a university even without many of the humanities, just as universities in centuries past were no less universities for lacking many of the fields now seen as essential (like Computer Science, Cognitive Science, Bioinformatics, even Engineering).
I can imagine a university where many of the humanities end up in the Faculty of Education (which does prepare people for jobs as teachers.) We would have the department of English Education, for example. Would people bemoan the loss of the humanities if many of its questions ended up housed elsewhere?
For that matter there are some that argue that preserving the humanities may be a cloak for preserving a particular idea of humanism. For example, here is Tony Davies at the end of his excellent short book Humanism:
All humanisms, until now, have been imperial. They speak of the human in the accents and the interests of a class, a sex, a race, a genome. Their embrace suffocates those whom it does not ignore. (p. 141; location 2372 in Kindle)
To claim that a university would not be a university if it didn’t maintain a particular collection of intellectual traditions would be begging the question (actually begging all sorts of questions). We simply can’t expect a historical definition to save what we care for. We must be part of the ongoing definition whether as collaborators or critics, which raises the question of how far to collaborate and when to dig in heels and yell like hell?
Last week I gave a talk for the UNIty in diVERSITY speaker series on “Big Data in the Humanities.” They have now put it up on Vimeo. The talk looked at the history of reading technologies and then at some of the research we are doing at the University of Alberta around what to do with all that big data.
A paper I co-authored just came out through Scholarly and Research Communication (Vol. 5, No. 4, 2014). It is titled The Provision of Digital Apparatus for Use in Experimental Interfaces; Stan Ruecker led the work. It is a nice article that shows a number of prototypes we have developed (actually I only contributed to a couple, but Stan led them).