Archive for the ‘Internet Culture and Technology’ Category
Today I deposited a Data Management Plan Recommendation for Social Science and Humanities Funding Agencies (http://hdl.handle.net/10402/era.42201) in our institutional repository ERA. This report/recommendation was written by Sonja Sapach with help from me and Catherine Middleton. We recommended that:
Agencies that fund social science and humanities (SSH) research should move towards requiring a Data Management Plan (DMP) as part of their application processes in cases where research data will be gathered, generated, or curated. In developing policies, funding agencies should consult the community on the values of stewardship and research that would be strengthened by requiring DMPs. Funding agencies should also gather examples and data about reuse of archived data in the social sciences and humanities and encourage due diligence among researchers to make themselves aware of reusable data.
On the surface the recommendation seems rather bland. SSHRC has required the deposit of research data it funds for decades. The problem, however, is that few of us pay attention, because it is one more thing to do, and because it means sharing hard-won data that you may want to continue milking for research. What we lack is a culture that treats the deposit of research data as a scholarly contribution, the way the translation or editing of important cultural texts is. We need a culture of stewardship, as a TC3+ (tri-council) document put it. See Capitalizing on Big Data: Toward a Policy Framework for Advancing Digital Scholarship in Canada (PDF).
Given the potential resistance of colleagues, it is important that we understand the arguments for requiring planning around data management, and that is one of the things we do in this report. Another issue is how to effectively require, at the funding proposal stage, something like a Data Management Plan that would show how the researchers are thinking through the issue. To that end we document the approaches of other funding bodies. The point is that this is not actually that new, and some research communities are further ahead.
At the end of the day, what we really need is a recognition that depositing data so that it can be used by other researchers is a form of scholarship. Such scholarship can be assessed like any other scholarship. What is the data deposited and what is its quality? How is the data deposited? How is it documented? Can it have an impact?
What Ever Happened to Project Bamboo? by Quinn Dombrowski is one of the few honest discussions about the end of a project. I’ve been meaning to blog this essay, which follows on her important conference paper at DH 2013 in Nebraska (see my conference report here, which comments on her paper). The issue of how projects fail or end rarely gets dealt with, and Dombrowski deserves credit for having the courage to document the end of a project that promised so much.
I blog about this now as I just finished a day-long meeting of the Leadership Council for Digital Infrastructure where we discussed a submission to Industry Canada that calls for coordinated digital research infrastructure. While the situation is different, we need to learn from projects like Bamboo when we imagine massive investment in research infrastructure. We all know it is important, but doing it right is not as easy as it sounds.
Which brings me back to failure. There are three types of failure:
- The simple type we are happy to talk about where you ran an experiment based on a hypothesis and didn’t get positive results. This type is based on a simplistic model of the scientific process which pretends to value negative results as much as positive ones. We all know the reality is not that simple and, for that matter, that the science model doesn’t really apply to the humanities.
- The messy type where you don’t know why you failed or what exactly failed. This is the type where you promised something in a research or infrastructure proposal and didn’t deliver. This type is harder to report because it reflects badly on you. It is an admission that you were confused or oversold your project.
- The third and bitter type is the project that succeeds on its own terms, but is surpassed by the disciplines. It is when you find your research isn’t current any longer and no one is interested in your results. It is when you find yourself ideologically stranded doing something that someone important has declared critically flawed. It is a failure of assumptions, or theory, or positioning and no one wants to hear about this failure, they just want to avoid it.
When people like Willard McCarty and John Unsworth call for a discussion of failure in the digital humanities they describe the first type, but often mean the second. The idea is to develop a form of failure reporting similar to negative results – or to encourage people to describe their failure as simply negative results. What we need, however, is honest description of the second and third types of failure, because those are expensive. To pretend that some expensive project which slowly disappeared in misunderstanding was simply an experiment is to miss what was at stake. This is doubly true of infrastructure, because infrastructure is not supposed to be experimental. No one pays for roads and their maintenance as an experiment to see if people will take the road. You should be sure the road is needed before building it.
Instead, I think we need to value research into infrastructure as something independent of the project owners. We need to do in Canada what the NSF did – bring together research on the history and theory of infrastructure.
Panopticonopolis (try saying it) by Misha Lepetic mostly has entries on cities, some of which appear in 3 Quarks Daily. Another article, The Forgotten Archipelago, asks what happened to the Soviet ZATO cities – the special-purpose, closed and hidden cities set up for secret research. What happened when the Soviet Union collapsed and the central government could no longer fund these single-purpose cities?
I was led to Panopticonopolis by an article on Blob Justice, Part 1, which looks at the herd shaming taking place on the Internet, starting with Cecil the lion. I can’t help wondering if this sort of Internet stampede is related to GamerGate and Anonymous.
Reading a paper by Lev Manovich I came across a reference to the web site WorldWideWebSize.com which graphs the size of the World Wide Web. The web site searches Google and Bing daily for different words from a corpus and then uses the total results to estimate the size of the web.
When you know, for example, that the word ‘the’ is present in 67.61% of all documents within the corpus, you can extrapolate the total size of the engine’s index from the document count it reports for ‘the’. If Google says that it found ‘the’ in 14,100,000,000 webpages, the estimated size of Google’s total index would be 23,633,010,000.
In the screen grab above you can see that the estimated size can change dramatically over time. It is hard to tell why.
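The extrapolation described in the quote can be sketched in a few lines: divide the document count an engine reports for a word by that word's known frequency in a reference corpus, and average over several words to smooth out engine noise. The frequencies and counts below are illustrative placeholders, not the site's actual corpus data (note that the numbers in the quoted example do not quite line up, presumably because the site averages over many words).

```python
def estimate_index_size(reported_count: int, corpus_frequency: float) -> float:
    """Estimate total index size from one word's reported document count.

    If a word occurs in corpus_frequency of all documents, and the engine
    reports reported_count hits for it, the index holds roughly
    reported_count / corpus_frequency documents.
    """
    if not 0 < corpus_frequency <= 1:
        raise ValueError("corpus_frequency must be in (0, 1]")
    return reported_count / corpus_frequency


def averaged_estimate(observations: dict) -> float:
    """Average per-word estimates; observations maps word -> (count, frequency)."""
    estimates = [estimate_index_size(count, freq)
                 for count, freq in observations.values()]
    return sum(estimates) / len(estimates)


# Illustrative numbers only: a word in 67.61% of documents,
# reported by the engine in 14.1 billion pages.
single = estimate_index_size(14_100_000_000, 0.6761)
print(f"single-word estimate: {single:,.0f} pages")
```

In practice one would query the engine daily for a whole set of sample words, which is roughly what WorldWideWebSize.com appears to do.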
CBC and others are reporting on a new Nintendo Creators Program through which Nintendo will take a percentage of the ad revenue associated with a YouTube channel or video with playthroughs (Let’s Play) of its games. See YouTube gaming stars blindsided by Nintendo’s ad revenue grab or Nintendo’s New Deal with Youtubers Is A Jungle Of Rights. Nintendo describes the program this way:
In the past, advertising proceeds that could be received for videos that included Nintendo-copyrighted content (such as gameplay videos) went to Nintendo, according to YouTube rules. Now, through this service, Nintendo will send you a share of these advertising proceeds for any YouTube videos or channels containing Nintendo-copyrighted content that you register.
This program is only for “copyrighted content related to game titles specified by Nintendo”. This is probably because Nintendo has to be careful not to be seen as making money off playthroughs of other publishers’ games.
This new policy/program raises interesting issues around:
- Fair use. Is a screen shot or a whole series of them that make up a playthrough covered by “fair use”? My read is that the publishers think not.
- Publicity from Playthroughs. YouTubers like PewDiePie who post Let’s Play videos (and make money off their popular channels) argue that these videos provide free exposure and publicity.
- New Economic Models for Gaming. Is Nintendo exploring new economic models tied to their copyright? Nintendo has been suffering so it makes sense that they would try to find ways to monetize their significant portfolio of popular game franchises and characters.
Ars Technica has a series of interesting articles about doxing, including an article about how the Islamic State doxes US soldiers, airmen, calls on supporters to kill them. How long before IS starts identifying the Canadian special forces sent to advise in the war in Iraq and Syria? Or … imagine the doxing of drone operators as a form of retaliation.
Doxing and other troll tactics seem to be entering the mainstream. Gabriella Coleman in Hacker, Hoaxer, Whistleblower, Spy writes about Anonymous and their use of various tactics for often admirable causes. She goes further and suggests that trolling may be a form of resistance suited to the emerging surveillance state:
Anonymous is emblematic of a particular geography of resistance. Composed of multiple competing groups, short-term power is achievable for brief durations, while long-term dominance by any single group or person is virtually impossible. In such a dynamic landscape, it may be “easy to co-opt, but impossible to be co-opted.” (location 5691 of 8131)
She also sees in Anonymous and trolling the tradition of the trickster. “Trickster tales are not didactic and moralizing but reveal their lessons playfully.” (Location 511 of 8131) It wasn’t long before the tricksters got attacked as the tactics spread. See Dox everywhere: LulzSec under attack from hackers, law enforcement.
The GamerGate controversy showed a much darker side to trolling and how these tactics could be used to bully as much as to resist. The people doxed were mostly women and so-called “social justice warriors” who annoyed certain gamers. Those doxed were hardly the powerful or Big Brother watching us. Now (women) academics who study gaming are being identified. How long before we have to train our graduate students in Anti-doxing strategy as part of preparation for research into games?
Geist’s point is that oversight is not enough. Those who now provide oversight have come out to say that they are on the job and that the CSE’s activities are legal, which suggests that oversight isn’t really working: the surveillance organizations and those tasked with oversight seem to be willfully ignoring the interpretation of experts that the gathering and sharing of metadata is the gathering and sharing of information about Canadians.
He talked about how C-51 affects privacy by allowing information sharing well beyond what is needed for counter-terrorism. C-51 puts in place a legal framework for which no amount of oversight will make a difference. It allows information to be shared between agencies about “activities that undermine the security of Canada.” An opinion piece in the Toronto Star by Craig Forcese and Kent Roach of antiterrorlaw.ca suggests that this could be interpreted as a license to spy on students protesting tuition fees without municipal permission, on eco-activists protesting illegally, and so on.
Ars Technica has a good article on Cybergeddon: Why the Internet could be the next “failed state”. The article starts by reminding us of all the abuse on the internet, from revenge porn to the theft of personal information. It then summarizes a paper by Jason Healey, The Five Futures of Cyber Conflict and Cooperation, that outlines five possible cyber futures, from the unlikely Paradise to Status Quo, Domain (where cyberspace is a domain like any other for conflict), Balkanization, and Cybergeddon.
One wonders what the futures for cyberspace for the academy are. Here are my speculative futures:
- Balkanization: universities create their own internets (intranets?) to keep out the great unwashed. Alumni get to keep their university email addresses if they behave. The elite universities (like the University of Alberta) then create an ivory tower subnet where only the important hang out.
- Cybergeddon: trolls drive academics off the internet, as we are all Social Justice Warriors who should be doxed, swatted, and watched. Risk management takes over and academics are not allowed on the internet without grant-funded insurance.
- Paradise: universities finally succeed in teaching ethics reliably and the world is made a better place. Philosopher rulers are put in charge. The internet becomes the nice safe place it was originally. Microsoft goes out of business, but wills Bob to the internet to be its AI policeperson.
The Intercept and CBC have been collaborating on stories based on documents leaked by Edward Snowden. One recent story is about how Canadian Spies Collect Domestic Emails in Secret Security Sweep. CSE is collecting email going to the government and flagging suspect emails for analysts.
An earlier story, CSE’s Levitation project: Expert says spy agencies ‘drowning in data’ and unable to follow leads, describes the LEVITATION project, which monitors file uploads to free file hosting sites. The idea is to identify questionable uploads and then to figure out who is uploading the materials.
Glenn Greenwald (see the embedded video) questions the value of this sort of mass surveillance. He suggests that mass surveillance actually impedes the ability to prevent terrorist attacks. The problem is not getting more information, but connecting the dots in what one has. In fact, the slides that you can get to from these stories both show that CSE is struggling with too much information and with analytical challenges.