Making Algorithms Accountable

ProPublica has a great op-ed about Making Algorithms Accountable. The story starts with a Wisconsin Supreme Court decision on computer-generated risk (of recidivism) scores. The scores used in Wisconsin come from Northpointe, which provides them as a service based on a proprietary algorithm that appears to be biased against blacks and not particularly accurate. The story highlights the lack of any legislation regarding algorithms that can affect our lives.

Update: ProPublica has responded to a Northpointe critique of their findings.
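The dispute between ProPublica and Northpointe turns on how you measure bias: a score can look similarly "accurate" overall for two groups while producing far more false positives (people flagged high risk who never reoffend) for one of them. The sketch below illustrates that kind of check; the column layout and toy records are my own invention for illustration, not the actual COMPAS data or ProPublica's published analysis code.

```python
# A minimal sketch of comparing false positive rates across groups.
# The records and field layout are invented; the real analysis used
# the Broward County COMPAS dataset.
from collections import defaultdict

# Each record: (group, labeled_high_risk, actually_reoffended)
records = [
    ("A", True, False), ("A", True, True), ("A", False, False), ("A", True, False),
    ("B", False, False), ("B", True, True), ("B", False, False), ("B", False, True),
]

counts = defaultdict(lambda: {"fp": 0, "negatives": 0})
for group, high_risk, reoffended in records:
    if not reoffended:                 # only people who did not reoffend
        counts[group]["negatives"] += 1
        if high_risk:                  # ...but were still flagged high risk
            counts[group]["fp"] += 1

for group, c in sorted(counts.items()):
    fpr = c["fp"] / c["negatives"] if c["negatives"] else float("nan")
    print(f"Group {group}: false positive rate = {fpr:.2f}")
```

On toy data like this the disparity is manufactured, of course; the point is only that "accountability" here means being able to run exactly this sort of check against a scoring system, which a proprietary algorithm makes difficult.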

They know (on surveillance)

They Know is a must-see design project by Christian Gross from the Interface Design Programme at the University of Applied Sciences in Potsdam (FHP), Germany. The idea behind the project, described in the They Know showcase for FHP, is:

I could see in my daily work how difficult it was to inform people about their privacy issues. Nobody seemed to care. My hypothesis was that the whole subject was too complex. There were no examples, no images that could help the audience to understand the process behind the mass surveillance.

His answer is a design fiction: a mock-up of an NSA surveillance dashboard based on what we know, paired with a video describing a fictional use of it to track an architecture student from Berlin. It seems to me the video and mock designs nicely bring together a number of things we can infer about the tools they have.

Information Geographies

Thanks to a note from Domenico Fiormonte to Humanist I came across the Information Geographies page at the Oxford Internet Institute. The OII has been producing interesting maps that show aspects of the internet. The one pictured above shows the distribution of Geographic Knowledge in Freebase. Given the importance of Freebase to Google’s Knowledge Graph, it is important to understand the bias of its information toward certain locations.

Geographic content in Freebase is largely clustered in certain regions of the world. The United States accounts for over 45% of the overall number of place names in the collection, despite covering about 2% of the Earth, less than 7% of the land surface, and less than 5% of the world population, and about 10% of Internet users. This results in a US density of one Freebase place name for every 1500 people, and far more place names referring to Massachusetts than referring to China.

Domenico Fiormonte’s email to Humanist (Humanist Discussion Group, Vol. 29, No. 824) argues that “It is our responsibility to preserve cultural diversity, and even relatively small players can make a difference by building more inclusive ‘representations’.” He adds that we need to be open about the cultural and linguistic biases of the tools and databases we build.

Paolo Sordi: I blog therefore I am

On the ethos of digital presence: I participated today in a panel launching the Italian version of Paolo Sordi’s book I Am: Remix Your Web Identity. (The Italian title is Bloggo Con WordPress Dunque Sono.) The panel included people like Domenico Fiormonte, Luisa Capelli, Daniela Guardamangna, Raul Mordenti, and, of course, Paolo Sordi.


Big computers, big hair: the women of Bell Labs in the 1960s

Picture of Bea

The Guardian has posted a set of pictures by Larry Luckham, who took a camera into work in 1967 to capture life at Bell Labs: see Big computers, big hair: the women of Bell Labs in the 1960s. That the collection is entirely of women raises some questions. As the Slashdot post that pointed me to this collection puts it:

What’s noticeable about the pictures, is that they are of woman. I don’t think this is a result of the photographer just photographing “eye candy”. I think it’s because he was surrounded by women, whom from his comments he very much respected and hence photographed.

In those times, wrangling with a computer was very much seen as “clerical work” and therefore the domain of woman. This can be seen as far back as Bletchley Park and before that Ada Lovelace.

Yet 50 years later, the IT industry has turned full-circle. Look at any IT company and the percentage of women doing software development or similar is woeful. Why and how has this happened? Discuss.

When Women Stopped Coding

The NPR show Planet Money aired an episode in 2014, When Women Stopped Coding, that looks at why the participation of women in computer science turned downward in 1984 after rising for a decade. Unlike in other professional programs like medical school and law school, the percentage of women went from about 37% in 1984 down to under 20% today. The NPR story suggests that the problem was the promotion of the personal computer at the moment it became affordable. In the 1980s personal computers were heavily marketed to boys, which meant that far more men came to computer science in college with significant experience of computing, something that wasn’t true in the 70s, when there weren’t many computers in the home and math was what mattered. The story builds on research by Jane Margolis, in particular her book Unlocking the Clubhouse.

This fits with my memories of the time. I remember being jealous of the one or two kids who had Apple IIs in college (in the late 70s) and bought an Apple II clone (a Lemon?) as soon as I had a job, just to start playing with programming. At college I ended up getting 24/7 access to the computing lab in order to use the word processing available (a Pascal editor and a Diablo daisy wheel printer for final copy). I hated typing and retyping my papers and fell in love with the backspace key and the editing that word processing allowed. I also remember the sense of camaraderie among those who spent all night in the lab typing papers in the face of our teachers’ mistrust of processed text. Was it a coincidence that the two of us who shared the best senior thesis prize in philosophy in 1982 wrote our theses in the lab on computers? What the story doesn’t deal with, and Margolis does, is the homosocial, club-like atmosphere around computing. This still persists. I’m embarrassed to think of how much I’ve felt a sense of belonging to these informal clubs without asking who was excluded.

Speak Up & Stay Safe(r): A Guide to Protecting Yourself From Online Harassment

Feminist Frequency has posted an excellent guide, Speak Up & Stay Safe(r): A Guide to Protecting Yourself From Online Harassment. This is a clearly written and thorough discussion of how to protect yourself better from the sorts of harassment Anita Sarkeesian has documented in blog entries like Harassment Through Impersonation: The Creation of a Cyber Mob.

As the title suggests, the guide doesn’t guarantee complete protection – all you can do is get better at it. The guide is also clear that it is not for protection against government surveillance. For those worried about government harassment, it provides links to other resources like the Workbook on Security.

In her blog entry announcing the guide, Anita Sarkeesian explains the need for it and the costs of harassment:

Speak Up & Stay Safe(r): A Guide to Protecting Yourself From Online Harassment was made necessary by the failure of social media services to adequately prevent and deal with the hateful targeting of their more marginalized users. As this guide details, forcing individual victims or potential targets to shoulder the costs of digital security amounts to a disproportionate tax of in time, money, and emotional labor. It is a tax that is levied disproportionately against women, people of color, queer and trans people and other oppressed groups for daring to express an opinion in public.

How did we get to this point? What happened to the dreams of internet democracy and open discourse? What does it say about our society that such harassment has become commonplace? What can we do about it?

Medical Privacy Under Threat in the Age of Big Data

The Intercept has a good introductory story about Medical Privacy Under Threat in the Age of Big Data. I was surprised at how valuable medical information is. Here is a quote:

[h]e found a bundle of 10 Medicare numbers selling for 22 bitcoin, or $4,700 at the time. General medical records sell for several times the amount that a stolen credit card number or a social security number alone does. The detailed level of information in medical records is valuable because it can stand up to even heightened security challenges used to verify identity; in some cases, the information is used to file false claims with insurers or even order drugs or medical equipment. Many of the biggest data breaches of late, from Anthem to the federal Office of Personnel Management, have seized health care records as the prize.

The story mentions Latanya Sweeney, who is the Director of the Data Privacy Lab at Harvard. She did important research on Discrimination in Online Ad Delivery and has a number of important papers on health records, like a recent work on Matching Known Patients to Health Records in Washington State Data, which showed how one could de-anonymize Washington State health data that is for sale by searching news databases. We are far more unique than we think we are.
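The uniqueness point is easy to demonstrate: a few quasi-identifiers left in a “de-identified” dataset (birth date, gender, ZIP code) are often enough to single a person out, so the records can be re-linked to named individuals from some public source. The toy sketch below shows the kind of join involved; all the data and field names are made up, and this is only a simplified illustration of the linking idea, not Sweeney’s actual method for the Washington State study.

```python
# A minimal sketch of re-identification by joining on quasi-identifiers.
# All records and field names are invented for illustration.

# "De-identified" health records: no names, but quasi-identifiers remain.
health_records = [
    {"birth_date": "1974-03-02", "gender": "F", "zip": "98101", "diagnosis": "fracture"},
    {"birth_date": "1989-11-17", "gender": "M", "zip": "98052", "diagnosis": "asthma"},
]

# Public information that names individuals (e.g. a news story about an accident).
public_records = [
    {"name": "Jane Roe", "birth_date": "1974-03-02", "gender": "F", "zip": "98101"},
]

def quasi_id(record):
    """The combination of fields that together make a person nearly unique."""
    return (record["birth_date"], record["gender"], record["zip"])

# Index the public records by their quasi-identifiers, then link.
by_quasi_id = {quasi_id(p): p["name"] for p in public_records}

for rec in health_records:
    name = by_quasi_id.get(quasi_id(rec))
    if name:
        print(f"{name} matches a health record with diagnosis: {rec['diagnosis']}")
```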

I should add that I came across an interesting blog post by Dr. Sweeney on Tech@FTC arguing for an interdisciplinary field of Technology Science. (Sweeney was the Chief Technologist at the FTC.)