Artificial intelligence: Commission takes forward its work on ethics guidelines

The European Commission has announced the next step in its Artificial Intelligence strategy. See Artificial intelligence: Commission takes forward its work on ethics guidelines. The Commission appointed a High-Level Expert Group in June 2018. This group has now developed seven essentials for achieving trustworthy AI:

Trustworthy AI should respect all applicable laws and regulations, as well as a series of requirements; specific assessment lists aim to help verify the application of each of the key requirements:

  • Human agency and oversight: AI systems should enable equitable societies by supporting human agency and fundamental rights, and not decrease, limit or misguide human autonomy.
  • Robustness and safety: Trustworthy AI requires algorithms to be secure, reliable and robust enough to deal with errors or inconsistencies during all life cycle phases of AI systems.
  • Privacy and data governance: Citizens should have full control over their own data, while data concerning them will not be used to harm or discriminate against them.
  • Transparency: The traceability of AI systems should be ensured.
  • Diversity, non-discrimination and fairness: AI systems should consider the whole range of human abilities, skills and requirements, and ensure accessibility.
  • Societal and environmental well-being: AI systems should be used to enhance positive social change and enhance sustainability and ecological responsibility.
  • Accountability: Mechanisms should be put in place to ensure responsibility and accountability for AI systems and their outcomes.

The next step, now announced, is a pilot phase that will test these essentials with stakeholders. The Commission also wants to cooperate with “like-minded partners” like Canada.

What would it mean to participate in the pilot?

Ethicists are no more ethical than the rest of us, study finds

When it comes to the crucial ethical question of calling one’s mother, most people agreed that not doing so was a moral failing.

Quartz reports on a study in Philosophical Psychology: Ethicists are no more ethical than the rest of us, study finds. While one wonders how one can survey how ethical someone is, this is nonetheless a believable result. The contemporary university is deliberately structured not to be a place to change people’s morals, but to educate them. When we teach ethics we don’t assess or grade the morality of the student. Likewise, when we hire, promote, and assess the ethics of a philosophy professor we don’t assess their personal morality either. We assess their research, teaching and service record, all of which can be burnished without actually being ethical. There is, if you will, a professional ethic that research and teaching should not be personal, but detached.

A focus on the teaching and learning of ethics over personal morality is, despite the appearance of hypocrisy, a good thing. We try to create in the university, in the class, and in publications, an openness to ideas, whomever they come from. By avoiding discussing personal morality we try to create a space where people of different views can enter into dialogue about ethics. Imagine what it would be like if it were otherwise. Imagine if my ethics class were about converting students to some standard of behaviour. Who would decide what that standard was? The ethos of professional ethics emphasizes dialogue over action, history over behaviour, and ethical argumentation over disposition. Would it be ethical any other way?

The structure of recent philosophy (II) · Visualizations

In this codebook we will investigate the macro-structure of philosophical literature. As a base for our investigation I have collected about fifty-thousand records…

Stéfan sent me a link to this interesting post, The structure of recent philosophy (II) · Visualizations. Maximilian Noichl has done a fascinating job using the Web of Science to develop a model of the field of philosophy since the 1950s. In this post he describes his method and the resulting visualization of clusters (see above). In a later post (version III of the project) he gets a more nuanced visualization that seems truer to the breadth of what people do in philosophy. The version above is heavily weighted to Anglo-American analytic philosophy while version III has more history of philosophy and continental philosophy.

Here is the final poster (PDF) for version III.

I can’t help wondering if his snowball approach doesn’t bias the results. What if one used full text of major journals?
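To see why a snowball approach could bias the results, it helps to sketch what snowball sampling over a citation graph does: starting from seed records, you repeatedly collect everything they cite (or that cites them), so the final corpus can only contain what is reachable from the seeds. A minimal sketch, with an invented toy graph (the paper names and structure are purely illustrative, not Noichl’s actual method or data):

```python
from collections import deque

def snowball_sample(citations, seeds, max_rounds):
    """Collect records reachable from the seeds by following
    citation links, for up to max_rounds of expansion."""
    collected = set(seeds)
    frontier = deque(seeds)
    for _ in range(max_rounds):
        next_frontier = deque()
        while frontier:
            paper = frontier.popleft()
            for cited in citations.get(paper, []):
                if cited not in collected:
                    collected.add(cited)
                    next_frontier.append(cited)
        frontier = next_frontier
    return collected

# Toy citation graph: an "analytic" cluster cites itself, while a
# "continental" cluster is only reachable through one bridge paper.
graph = {
    "quine1951": ["carnap1950"],
    "carnap1950": ["quine1951"],
    "bridge": ["quine1951", "derrida1967"],
    "derrida1967": ["heidegger1927"],
}
print(sorted(snowball_sample(graph, ["quine1951"], 2)))
# → ['carnap1950', 'quine1951']
```

Seeded inside the self-citing cluster, the sample never reaches the other cluster at all; full-text harvesting of major journals would not have this reachability constraint, though it would introduce selection effects of its own.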

Google AI experiment has you talking to books

Google has announced some cool text projects. See Google AI experiment has you talking to books. One of them, Talk to Books, lets you ask questions or type statements and get answers that are passages from books. This strikes me as a useful research tool as it allows you to see some (book) references that might be useful for defining an issue. The project is somewhat similar to the Veliza tool that we built into Voyant. Veliza is given a particular text and then uses an Eliza-like algorithm to answer you with passages from that text. Needless to say, Talk to Books is far more sophisticated and is not based simply on word searches. Veliza, on the other hand, can be reprogrammed and you can specify the text to converse with.
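The general idea of an Eliza-like responder that answers with passages from a given text can be sketched in a few lines. This is not Veliza’s actual implementation (which lives in Voyant), just a toy illustration in which the function names and sample text are invented: split the text into sentences and reply with whichever sentence shares the most words with the user’s input.

```python
import re

def make_responder(text):
    """Split a text into sentences; respond to an utterance with the
    sentence sharing the most words with it (Eliza-like retrieval)."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    def respond(utterance):
        words = set(re.findall(r"\w+", utterance.lower()))
        def overlap(sentence):
            return len(words & set(re.findall(r"\w+", sentence.lower())))
        best = max(sentences, key=overlap)
        # Fall back to a generic Eliza-style prompt when nothing matches.
        return best if overlap(best) > 0 else "Tell me more."
    return respond

sample = ("The whale surfaced near the ship. "
          "The crew watched in silence. "
          "A storm gathered on the horizon.")
respond = make_responder(sample)
print(respond("What happened to the whale?"))
# → The whale surfaced near the ship.
```

Even this crude word-overlap matching shows why such a tool feels conversational while remaining, underneath, a search over passages.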


The Aggregate IQ Files, Part One: How a Political Engineering Firm Exposed Their Code Base

The Research Director for UpGuard, Chris Vickery (@VickerySec) has uncovered code repositories from AggregateIQ, the Canadian company that was building tools for/with SCL and Cambridge Analytica. See The Aggregate IQ Files, Part One: How a Political Engineering Firm Exposed Their Code Base and AggregateIQ Created Cambridge Analytica’s Election Software, and Here’s the Proof from Gizmodo.

The screenshots from the repository show one project called Ephemeral with the description “Because there is no such thing as THE TRUTH”. The “Primary Data Storage” of Ephemeral is called “Mamba Jamba”, presumably a joke on “mumbo jumbo”, which isn’t a good sign. What is more interesting is the description (see image above) of the data storage system as “The Database of Truth”. Here is a selection of that description:

The Database of Truth is a database system that integrates, obtains, and normalizes data from disparate sources including starting with the RNC data trust.  … This system will be created to make decisions based upon the data source and quality as to which data constitutes the accepted truth and connect via integrations or API to the source systems to acquire and update this data on a regular basis.

A robust front-end system will be built that allows an authrized user to query the Database of Truth to find data for a particular upcoming project, to see how current the data is, and to take a segment of that data and move it to the Escrow Database System. …

The Database of Truth is the Core source of data for the entire system. …

One wonders if there is a philosophical theory, of sorts, in Ephemeral. A theory where no truth is built on the mumbo jumbo of a database of truth(s).

Ephemeral would seem to be part of Project Ripon, the system that Cambridge Analytica never really delivered to the Cruz campaign. Perhaps the system was so ephemeral that it never worked and therefore the Database of Truth never held THE TRUTH. Ripon might be better called Ripoff.

When Women Stopped Coding

The NPR show Planet Money aired a segment in 2014, When Women Stopped Coding, that looks at why the participation of women in computer science changed in 1984 after rising for a decade. Unlike in other professional programs like medical school and law school, the percentage of women went from about 37% in 1984 down to under 20% today. The NPR story suggests that the problem was the promotion of the personal computer at the moment when it became affordable. In the 1980s personal computers were heavily marketed to boys, which meant that far more men came to computer science in college with significant experience of computing, something that wasn’t true in the 70s, when there weren’t many computers in the home and math was what mattered. The story builds on research by Jane Margolis and in particular her book Unlocking the Clubhouse.

This fits with my memories of the time. I remember being jealous of the one or two kids who had Apple IIs in college (in the late 70s) and bought an Apple II clone (a Lemon?) as soon as I had a job, just to start playing with programming. At college I ended up getting 24/7 access to the computing lab in order to be able to use the word processing available (a Pascal editor and a Diablo daisy-wheel printer for final copy). I hated typing and retyping my papers and fell in love with the backspace key and the editing of word processing. I also remember the sense of camaraderie among those who spent all night in the lab typing papers in the face of our teachers’ mistrust of processed text. Was it coincidence that the two of us who shared the best senior thesis prize in philosophy in 1982 wrote our theses in the lab on computers? What the story doesn’t deal with, and Margolis does, is the homosocial club-like atmosphere around computing. This still persists. I’m embarrassed to think of how much I’ve felt a sense of belonging to these informal clubs without asking who was excluded.

Silly Season for Eric Raymond

Eric Raymond, widely admired for his The Cathedral and the Bazaar, is now peddling social justice paranoia. See Why Hackers Must Eject the SJWs. He starts with the following,

The hacker culture, and STEM in general, are under ideological attack. Recently I blogged a safety warning that according to a source I consider reliable, a “women in tech” pressure group has made multiple efforts to set Linus Torvalds up for a sexual assault accusation. I interpreted this as an attempt to beat the hacker culture into political pliability, and advised anyone in a leadership position to beware of similar attempts.

See his “safety warning” at From kafkatrap to honeytrap. His evidence for this ideological attack seems to be gossip from trusted sources – gossip that confirms his views about “women in tech” and pressure groups and so on. This sort of war rhetoric closes any opportunity for discussion around the issues of women in technology. For Raymond it is now a (culture) war between those on the side of hacker culture and STEM, against “Social Justice Warriors” and what is at stake is the “entire civilization that we serve.”

Why are these important issues being militarized instead of aired respectfully? When did the people we live with and love become the other? Just how confident are we that we objectively know what merit is in the hurly-burly of life? What civilization is this really about?

Other reactions to this story include Linus Torvalds targeted by honeytraps, claims Eric S. Raymond in The Register and Is This the Perfect Insane Anti-Feminist Rumor? from New York Magazine.

Why empathy is the next big thing in video games

CBC Spark with Nora Young had a segment on Why empathy is the next big thing in video games. The category seems to map onto “persuasive games” or “art games.” Some of the games mentioned:

  • RIOT – a forthcoming game where you experience being in riots
  • Spirits of Spring – about a “young native in a mythical land”
  • Papo and Yo – about alcoholism

Ian Bogost talks on the segment and makes the argument that in empathy games one feels a different type of empathy than in narrative media. When you make the choices you have something at stake. He also made a point about empathy with systems that I didn’t quite get. He talked about systems oriented game design where you get exposed to a different system or environment and learn about it through playing. The idea is that by playing someone running a fast food chain you learn about the system of fast food. You learn to empathize with the fast food mogul in order to understand the constraints those systems are under.

HathiTrust Research Center Awards Three ACS Projects

An Advanced Collaborative Support project that I was part of was funded; see HathiTrust Research Center Awards Three ACS Projects. Our project, called The Trace of Theory, sets out first to see if we can identify subsets of the HathiTrust volumes that are “theoretical” and then to try to track “theory” through these subsets.

SSHRC Stories and Success 2014

Today we had our annual celebration of SSHRC funded researchers, SSHRC Stories and Success 2014. I introduced the speakers and the theme.

Thank you Associate Vice-President Johnston. Good afternoon colleagues, it is my pleasure to introduce the theme for this year’s event, which is:

Emerging Technologies: Competing Needs and Challenges

I should begin by confessing that as I was preparing for this, I had one of those Emperor’s-new-clothes research moments when I realized I had no idea what the emerging technologies really are, and no metric with which to evaluate my intuitions. It is easy to become convinced that the technologies one understands are emergent, but that doesn’t mean that there aren’t other more important trends, or that one isn’t blinded by one’s commitments.

Fortunately it turned out that Wikipedia actually has a list of emerging technologies to keep me honest, so I have chosen a few from that list as a way of introducing the theme and the researchers who will speak to it.

  • An area where a number of emerging technologies are now coming to market is display technology and virtual reality. From consumer 3-D televisions that may or may not take off to virtual reality headsets like the Oculus Rift, which was recently bought by Facebook for $2 billion – there is a lot of change on the horizon in how we watch. I recently had a chance to try out the Oculus Rift development headset with content from Canadian research and design teams, and it is a good news, bad news story. The technology works and is solid – it won’t be long before it is brought to market – but there is still a nausea problem. Any of you who remember the VR excitement of the 1990s will remember nausea was a problem then too. Maybe this is a re-emerging technology. More important than the technology, however, are the forms of engagement and immersion being imagined for the virtual, and Patricia Boechler will be talking about The Third Dimension: Immersive Virtual Environments in Educational Research and Practice.
  • Of particular interest to us here in Alberta is a second category of emerging technologies and those are the emerging energy technologies and resource extraction technologies. The Intergovernmental Panel on Climate Change recently released their 5th assessment report titled “Climate Change 2014: Mitigation of Climate Change.” One of their conclusions for which there is high evidence and high agreement is that “deep cuts in emissions will require a diverse portfolio of policies, institutions, and technologies as well as changes in human behaviour and consumption patterns.” That sounds like a call for social science and humanities research and partnerships. The University of Alberta is one of the places where badly needed interdisciplinary research is emerging around the challenges of the interaction of technologies, policies and human behavior. The story of climate change and how we mitigate its effects is ours to study and change and today we have Gordon Gow who will talk about Stewarding Technology for Inclusive Innovation.
  • The third emerging technology I want to mention goes under the rubric of the Internet of Things. The idea is that soon we will be able to afford to embed networked computing in everyday appliances like your refrigerator, and in associated consumer products like the milk that goes into the fridge. Then the fridge could keep track of the milk and automagically tell you when you needed to buy more. The conveniences are endless – I could use my smartphone to tell whether I had left the stove on or really locked the door every morning before I turn back to check. But there is another side to such a network of things. Bruce Sterling, a science fiction writer, has an essay on the Internet of Things arguing that its story has an overlooked history (smart appliances crop up regularly), and that this time the story of convenience is being harnessed for economic surveillance. He rightly points out that we are not the customers of companies like Google and Facebook – the customers who pay Facebook for a service are the advertisers and those who buy data about us. If all our appliances are capable of transmitting even more data about us, who will gather that data, who will mine it, benefit from it, and sell the analysis? Kevin Haggerty, our second speaker, thinks about surveillance issues broadly, going beyond the hi-tech concerns I have, and he will be talking about Technologies of Nature: Surveillance at the Limits of the Human.

In closing I want to say a few words about how the social sciences and humanities are turning to think through emerging technologies. The Canadian science fiction writer William Gibson, who coined the term cyberspace and helped us imagine that emerging technology in his 1984 novel Neuromancer, has famously said that “The future is already here — it’s just not very evenly distributed.” Too often people outside the social sciences, arts and humanities think we are the last in the academy to whom it is or should be distributed, but that is not the case, historically or today. If anything it is the social sciences, arts and humanities community that asks what technology is, how it emerges, how it is distributed, and how it can be used creatively. We are already deeply involved in studying the emergence of technologies in the imagination and in use. We teach students to beware of the hype around technology and we teach them to use technologies creatively. What is emergent is a multifaceted and interdisciplinary engagement in research and teaching with technologies and their very idea.