Brianna Wu appalled at FBI’s #GamerGate investigative report

[Screenshot of text from the FBI #GamerGate report]

The FBI has released its report on #GamerGate in response to a Freedom of Information request, and it doesn’t seem that the Bureau took the threats very seriously. According to a VentureBeat story, Brianna Wu is appalled at the FBI’s #GamerGate investigative report.

Wu, who is running for Congress, said in an email that she is “fairly livid” because it appears the FBI didn’t check out many of her reports about death threats. Wu catalogued more than 180 death threats that she said she received because she spoke out against sexism in the game industry and the misogyny of #GamerGate, a movement that eventually morphed into the alt-right and carried into the U.S. presidential race.

It sounds like the FBI either couldn’t trace the threats or didn’t think they were serious enough, and eventually closed the investigation. In the aftermath of the shooting at the Québec City mosque we need to take the threats of trolls more seriously, as Anita Sarkeesian did when she was threatened with a “Montreal Massacre style attack” before speaking at Utah State University. Yes, only a few act on their threats, but threats piggy-back on the terror created by those few to achieve their end. Those making the threats may justify them as just for the lulz, but they do so knowing that some people do act on such threats.

On another point, having just given a paper on Palantir I was intrigued to read that the FBI used it in their investigation. The report says that “A search of social media logins using Palantir’s search around feature revealed a common User ID number for two of the above listed Twitter accounts, profiles [Redacted] … A copy of the Palantir chart created from the Twitter results will be uploaded to the case file under a separate serial.” One wonders how useful connecting two Twitter accounts to one ID is.
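To make the “search around” finding a little more concrete, here is a minimal sketch of the underlying idea: grouping accounts by a shared login user ID. This is my own illustration, not Palantir’s actual feature or API, and all the account names and IDs below are invented.

```python
# Hypothetical sketch of the kind of linkage the report describes:
# if two accounts were accessed under the same login user ID, group them.
# Account names and IDs are made up for illustration only.
from collections import defaultdict

login_records = [
    ("twitter_account_a", "user-12345"),
    ("twitter_account_b", "user-67890"),
    ("twitter_account_c", "user-12345"),  # same login ID as account_a
]

accounts_by_user_id = defaultdict(set)
for account, user_id in login_records:
    accounts_by_user_id[user_id].add(account)

# Any user ID linked to more than one account suggests common control.
for user_id, accounts in accounts_by_user_id.items():
    if len(accounts) > 1:
        print(f"{user_id} links accounts: {sorted(accounts)}")
```

Of course, a shared login ID only suggests that the same person (or device) accessed both accounts; it doesn’t by itself establish who sent a threat.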

Near the end of the report, which is really just a collection of redacted documents, there is a heavily redacted email from one of those harassed, where only a couple of lines are left for us to read, including:

We feel like we are sending endless emails into the void with you.

2016 Chicago Colloquium on Digital Humanities and Computer Science

I’ve just come back from the Chicago Colloquium on Digital Humanities and Computer Science at the University of Illinois at Chicago. The Colloquium is a great little conference where a lot of new projects get shown. I kept conference notes on the Colloquium here.

I was struck by the number of sessions of papers on mapping projects. I don’t know if I have ever seen so many geospatial projects. Many of the papers talked about how mapping is a different way of analyzing data, whether it is the location of eateries in Roman Pompeii or German construction projects before 1924.

I gave a paper on “Information Wants to Be Free, Or Does It? Ethics in the Digital Humanities.”

Making Algorithms Accountable

ProPublica has a great op-ed about Making Algorithms Accountable. The story starts from a Wisconsin Supreme Court decision on computer-generated risk (of recidivism) scores. The scores used in Wisconsin come from Northpointe, which provides them as a service based on a proprietary algorithm that appears to be biased against black defendants and not particularly accurate. The story highlights the lack of any legislation regarding algorithms that can affect our lives.

Update: ProPublica has responded to a Northpointe critique of their findings.
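For readers wondering what “biased and not that accurate” amounts to in this debate: ProPublica’s analysis centred on error rates broken down by race, notably the rate at which people who did not reoffend were nonetheless flagged as high risk. A minimal sketch of that kind of check, run on invented records rather than the actual COMPAS data, might look like this.

```python
# Sketch of a group-wise false positive rate check, in the spirit of
# ProPublica's analysis. The records below are invented, not COMPAS data.
from collections import defaultdict

# (group, predicted_high_risk, actually_reoffended)
records = [
    ("group_a", True, False), ("group_a", False, False), ("group_a", True, True),
    ("group_b", True, False), ("group_b", True, False), ("group_b", False, True),
]

false_pos = defaultdict(int)   # flagged high risk but did not reoffend
negatives = defaultdict(int)   # everyone who did not reoffend

for group, predicted_high, reoffended in records:
    if not reoffended:
        negatives[group] += 1
        if predicted_high:
            false_pos[group] += 1

for group in sorted(negatives):
    rate = false_pos[group] / negatives[group]
    print(f"{group}: false positive rate = {rate:.2f}")
```

The point of the op-ed is that when a proprietary score feeds into sentencing decisions, this sort of audit should not depend on journalists reverse-engineering the system.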

They know (on surveillance)

They Know is a must-see design project by Christian Gross from the Interface Design Programme at the University of Applied Sciences in Potsdam (FHP), Germany. The idea behind the project, described in the They Know showcase for FHP, is:

I could see in my daily work how difficult it was to inform people about their privacy issues. Nobody seemed to care. My hypothesis was that the whole subject was too complex. There were no examples, no images that could help the audience to understand the process behind the mass surveillance.

The answer is a design fiction: a mock-up of an NSA surveillance dashboard based on what we know, together with a video describing a fictional use of it to track an architecture student from Berlin. It seems to me the video and mock designs nicely bring together a number of things we can infer about the tools the NSA has.

Information Geographies

Thanks to a note from Domenico Fiormonte to Humanist, I came across the Information Geographies page at the Oxford Internet Institute. The OII has been producing interesting maps that show aspects of the internet. One of these maps shows the distribution of Geographic Knowledge in Freebase. Given the importance of Freebase to Google’s Knowledge Graph, it is important to understand the bias of its information towards certain locations.

Geographic content in Freebase is largely clustered in certain regions of the world. The United States accounts for over 45% of the overall number of place names in the collection, despite covering about 2% of the Earth, less than 7% of the land surface, and less than 5% of the world population, and about 10% of Internet users. This results in a US density of one Freebase place name for every 1500 people, and far more place names referring to Massachusetts than referring to China.
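A quick back-of-the-envelope reading of those figures, taking the US population at the time to be roughly 316 million (my assumption, not a number from the OII page):

```python
# Back-of-envelope check of the quoted Freebase figures.
# The US population is an assumption; 1500 and 45% come from the quote.
us_population = 316_000_000          # assumed circa-2013 US population
people_per_place_name = 1500         # "one Freebase place name for every 1500 people"
us_share_of_place_names = 0.45       # "over 45% of the overall number of place names"

implied_us_place_names = us_population / people_per_place_name
implied_total_place_names = implied_us_place_names / us_share_of_place_names

print(f"Implied US place names in Freebase: ~{implied_us_place_names:,.0f}")
print(f"Implied total place names in Freebase: ~{implied_total_place_names:,.0f}")
```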

Domenico Fiormonte’s email to Humanist (Humanist Discussion Group, Vol. 29, No. 824) argues that “It is our responsibility to preserve cultural diversity, and even relatively small players can make a difference by building more inclusive ‘representations’.” He goes on to argue that we need to be open about the cultural and linguistic biases of the tools and databases we build.

Paolo Sordi: I blog therefore I am

On the ethos of digital presence: I participated today in a panel launching the Italian version of Paolo Sordi’s book I Am: Remix Your Web Identity. (The Italian title is Bloggo Con WordPress Dunque Sono.) The panel included Domenico Fiormonte, Luisa Capelli, Daniela Guardamagna, Raul Mordenti, and, of course, Paolo Sordi.


Big computers, big hair: the women of Bell Labs in the 1960s

[Picture of Bea, from the collection]

The Guardian has posted a set of pictures by Larry Luckham, who took a camera into work in 1967 to capture life at Bell Labs; see Big computers, big hair: the women of Bell Labs in the 1960s. That the collection is entirely of women raises some questions. As the Slashdot post that pointed me to this collection puts it:

What’s noticeable about the pictures is that they are of women. I don’t think this is a result of the photographer just photographing “eye candy”. I think it’s because he was surrounded by women, whom, from his comments, he very much respected and hence photographed.

In those times, wrangling with a computer was very much seen as “clerical work” and therefore the domain of women. This can be seen as far back as Bletchley Park and, before that, Ada Lovelace.

Yet 50 years later, the IT industry has turned full-circle. Look at any IT company and the percentage of women doing software development or similar is woeful. Why and how has this happened? Discuss.