CBC TV: The Artists

The Artists is the story of the creators who were at the forefront of the early video game revolution. It explores the first three decades of video game history.

I just finished watching the CBC TV series The Artists – Season 1. This is a series of short (9–12 minute) video essays that you can watch on the web. The series focuses on the question of game designers as artists and starts and ends with an early and influential EA (Electronic Arts) ad, "We see farther," which showcased their developers, something other companies (like Atari) didn’t do.

The CBC series is well done, though I find the shorts too short. I wish they lingered a bit more over the clips of games and other historic materials. The kinetic style of the shorts may suit the medium, but not the history.

My other gripe is their choice of game designers to feature. There are no Japanese game designers. In fact, it is as if no one outside of the US and Canada designed games at all. They could have also covered some influential women designers like Brenda Laurel.

What is great is episode 9 on Bioware (and Edmonton!). I didn’t realize that Greg Zeschuk, one of the founders of Bioware, started the Blind Enthusiasm Brewing Company, which has a brewery and restaurant near my house.

Dan Hett’s game “c ya laterrrr”

c ya laterrrr is a text “game” by Dan Hett documenting his experience of the Manchester terror attack, in which he lost his brother. “c ya laterrrr” was the last message he got from his brother. I found the game through an interview with the Guardian that talks about the games he is making. Another of his games, one that is less narration and more 8-bit graphics, is The Loss Levels, made with Pico-8.

As both games deal with the same event, they make for an interesting comparison of genres. I find the text adventure much more effective for this subject, as you feel the event unfold and the decisions give you a feeling for the experience.

Google AI experiment has you talking to books

Google has announced some cool text projects. See Google AI experiment has you talking to books. One of them, Talk to Books, lets you ask questions or type statements and get answers in the form of passages from books. This strikes me as a useful research tool, as it lets you see some (book) references that might be useful for defining an issue. The project is somewhat similar to the Veliza tool that we built into Voyant. Veliza is given a particular text and then uses an Eliza-like algorithm to answer you with passages from that text. Needless to say, Talk to Books is far more sophisticated and is not based simply on word searches. Veliza, on the other hand, can be reprogrammed and you can specify the text to converse with.
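To give a sense of the difference, here is a minimal sketch, in Python and not Veliza’s actual code, of the kind of simple word-overlap matching an Eliza-like passage responder might do; the function names and the sample text are hypothetical illustrations.

```python
# A minimal sketch (not Veliza's real implementation): answer a question
# with the passage from a fixed text that shares the most words with it.
import re

def split_passages(text):
    """Split a text into rough sentence-level passages."""
    return [p.strip() for p in re.split(r"(?<=[.!?])\s+", text) if p.strip()]

def respond(question, passages):
    """Return the passage with the largest word overlap with the question."""
    q_words = set(re.findall(r"\w+", question.lower()))
    def overlap(passage):
        return len(q_words & set(re.findall(r"\w+", passage.lower())))
    return max(passages, key=overlap)

sample = "The whale swam on. Ishmael watched the sea. The ship sailed south."
print(respond("What did Ishmael watch?", split_passages(sample)))
# -> "Ishmael watched the sea."
```

Talk to Books, by contrast, matches on meaning rather than shared words, which is why it can surface passages that never contain your search terms.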


The Ethics of Datafication


Information Wants to Be Free, Or Does It? The Ethics of Datafication has just come out in the Electronic Book Review. The article, written with Bettina Berendt at KU Leuven, is about thinking through the ethics of digitization. It first looks at the cliché phrase “information wants to be free” and then surveys a number of arguments for why some things should be digitized.

The Aggregate IQ Files, Part One: How a Political Engineering Firm Exposed Their Code Base

The Research Director for UpGuard, Chris Vickery (@VickerySec), has uncovered code repositories from AggregateIQ, the Canadian company that was building tools for/with SCL and Cambridge Analytica. See The Aggregate IQ Files, Part One: How a Political Engineering Firm Exposed Their Code Base and AggregateIQ Created Cambridge Analytica’s Election Software, and Here’s the Proof from Gizmodo.

The screenshots from the repository show a project called Ephemeral with the description “Because there is no such thing as THE TRUTH”. The “Primary Data Storage” of Ephemeral is called “Mamba Jamba”, presumably a joke on “mumbo jumbo”, which isn’t a good sign. What is more interesting is the description of the data storage system as “The Database of Truth”. Here is a selection from that description:

The Database of Truth is a database system that integrates, obtains, and normalizes data from disparate sources including starting with the RNC data trust.  … This system will be created to make decisions based upon the data source and quality as to which data constitutes the accepted truth and connect via integrations or API to the source systems to acquire and update this data on a regular basis.

A robust front-end system will be built that allows an authrized user to query the Database of Truth to find data for a particular upcoming project, to see how current the data is, and to take a segment of that data and move it to the Escrow Database System. …

The Database of Truth is the Core source of data for the entire system. …

One wonders if there is a philosophical theory, of sorts, in Ephemeral: a theory where no truth is built on the mumbo jumbo of a database of truth(s).

Ephemeral would seem to be part of Project Ripon, the system that Cambridge Analytica never really delivered to the Cruz campaign. Perhaps the system was so ephemeral that it never worked and therefore the Database of Truth never held THE TRUTH. Ripon might be better called Ripoff.

After the Facebook scandal it’s time to base the digital economy on public v private ownership of data

In a nutshell, instead of letting Facebook get away with charging us for its services or continuing to exploit our data for advertising, we must find a way to get companies like Facebook to pay for accessing our data – conceptualised, for the most part, as something we own in common, not as something we own as individuals.

Evgeny Morozov has a great essay in The Guardian on how After the Facebook scandal it’s time to base the digital economy on public v private ownership of data. He argues that better data protection is not enough. We need “to articulate a truly decentralised, emancipatory politics, whereby the institutions of the state (from the national to the municipal level) will be deployed to recognise, create, and foster the creation of social rights to data.” In Alberta that may start with a centralized clinical information system called Connect Care managed by the Province. The Province will presumably control access to our data, granting it to those researchers and health-care practitioners who commit to using it appropriately. Can we imagine a model where Connect Care is expanded to include social data that we can then control and give others (businesses) access to?

Research Team Security

One of the researchers in the GamerGate Reactions team has created a fabulous set of recommendations for team members doing dangerous research. See Security_Recommendations_2018_v2.0. This document brings together in one place a lot of information and links on how to secure your identity and research. The researcher put this together in support of a panel on Risky Research that I am chairing this afternoon, part of a day of panels/workshops following the Edward Snowden talk yesterday evening. (You can see my blog entry on Snowden’s talk here.) The key topics covered include:

  • Basic Security Measures
  • Use End-to-End Encryption for Communications
  • Encrypt Your Computer
  • Destroy All Information
  • Secure Browsing
  • Encrypt all Web Traffic
  • Avoiding Attacks
  • On Preventing Doxing
  • Dealing with Harassment

An Evening with Edward Snowden on Security, Public Life and Research

This evening we are hosting a video conferencing talk by Edward Snowden at the University of Alberta. These are some live notes taken during the talk for which I was one of the moderators. Like all live notes they will be full of misunderstandings.

Joseph Wiebe of Augustana College gave the introduction. Wiebe asked: what is the place of cybersecurity in public life?

“What an incredible time?” is how Snowden started, talking about the Cambridge Analytica and Facebook story. Technology is changing and connecting across borders. We are in the midst of the greatest redistribution of power in the history of humankind without anyone being asked for their vote or opinion. Large platforms take advantage of our need for human connection and turn our desires into a weakness. They have perfected the most effective system of control.

The revelations of 2013 were never about just surveillance, they were about democracy. We feel something has been neglected in the news and in politics. It is the death of influence. It is a system of manipulation that robs us of power by a cadre of the unaccountable. It works because it is largely invisible and is all connected to the use and abuse of our data. We are talking about power that comes from information.

He told us to learn from the mistake of 5 years ago and not focus too much on surveillance, but to look beyond the lever to those putting their weight on it.

Back to the problem of illiberal technologies. Information and control is meant to be distributed among the people. Surveillance technology change has outstripped democratic institutions. Powerful institutions are trying to get as much control of these technologies as they can before there is a backlash. It will be very hard to take control back once everyone gets used to it.

Snowden talked about how Facebook was gathering all sorts of information from our phones. They (Facebook and Google) operate on our ignorance because there is no way we can keep up with changes in privacy policies. Governments are even worse with laws that allow mass surveillance.

There is an interesting interaction between governments, with China modelling its surveillance laws on those of the US. Governments seem to experiment with clearly illegal technologies and the courts don’t do anything. Everything is secret, so we can’t even know enough to make a decision.

What can we do when ordinary oversight breaks down and our checks and balances are bypassed? The public is left to rely on public resources like journalism and academia. We then depend on public facts. Governments can manipulate those facts.

This is the tragedy of our times. We are being forced to rely on the press. This press is being captured and controlled and attacked. And how does the press know what is happening? They depend on whistleblowers who have no protection. Governments see the press as a threat.  Journalists rank in the hierarchy of danger between hackers and terrorists.

What sort of world will we face when governments figure out how to manage the press? What will we not know without the press?

One can argue that extraordinary times call for extraordinary measures, but who gets to decide? We don’t seem to have a voice even through our elected officials.

National security is a euphemism. We are witnessing the construction of a world where the most common political value is fear. Everyone argues we are living in danger and using that to control us. What is really happening is that morality has been replaced with legalisms. Rights have become a vulnerability.

Snowden disagrees. If we all disagree then things can change. Even in the face of real danger, there are limits to what should be allowed. Following Thoreau we need to resist. We don’t need a respect for the law, but for the right. The law is no substitute for justice or conscience.

Snowden would not be surprised if Facebook’s final defense is that “it’s legal.” But we need to ask if it is right. A wrong should not be turned into a right. We should be skeptical of those in power and the powers that shape our future. There are times in history and in our lives when the only possible decision is to break the law.

More on Cambridge Analytica

More stories are coming out about Cambridge Analytica and the scraping of Facebook data. The Guardian has published some important new articles.

Perhaps the most interesting article is in The Conversation and argues that Claims about Cambridge Analytica’s role in Africa should be taken with a pinch of salt. The article carefully sets out evidence that CA didn’t have the effect they were hired to have in either the Nigerian election (where they failed to get Goodluck Jonathan re-elected) or the Kenyan election (where they may have helped Uhuru Kenyatta stay in power). The authors (Gabrielle Lynch, Justin Willis, and Nic Cheeseman) talk about how,

Ahead of the elections, and as part of a comparative research project on elections in Africa, we set up multiple profiles on Facebook to track social media and political adverts, and found no evidence that different messages were directed at different voters. Instead, a consistent negative line was pushed on all profiles, no matter what their background.

They also point out that the majority of Kenyans are not on Facebook and that negative advertising has a long history. They conclude that exaggerating what it can do is what CA does.

Mother Jones has another story, one of the best summaries around, Cloak and Data, that questions the effectiveness of Cambridge Analytica when it comes to the Trump election. They point out how CA’s earlier work in Virginia, and for Cruz at the beginning of the primaries, doesn’t seem to have worked. They go on to suggest that CA had little to do with the Trump victory, which Parscale, the head of digital operations, instead ascribed to investing heavily in Facebook advertising.

During an interview with 60 Minutes last fall, Parscale dismissed the company’s psychographic methods: “I just don’t think it works.” Trump’s secret strategy, he said, wasn’t secret at all: The campaign went all-in on Facebook, making full use of the platform’s advertising tools. “Donald Trump won,” Parscale said, “but I think Facebook was the method.”

The irony may be that Cambridge Analytica is brought down by its boasting, not by what it actually did. A further irony is that it may bring down Facebook and finally draw attention to how our data is used to manipulate us, even though the manipulation didn’t work.

The story of Cambridge Analytica’s rise—and its rapid fall—in some ways parallels the ascendance of the candidate it claims it helped elevate to the presidency. It reached the apex of American politics through a mix of bluffing, luck, failing upward, and—yes—psychological manipulation. Sound familiar?

More Conversation, Less Carbon

Today the Kule Institute for Advanced Study (KIAS) hosted a panel discussion on More Conferencing, Less Carbon. The discussion took place on site and online on our YouTube channel.

At this panel discussion Trevor Chow-Fraser of the Office of Sustainability announced the release of Moving Ideas Without Moving People, a toolkit on running e-conferences at the University of Alberta. The toolkit was co-authored by Trevor Chow-Fraser, Chelsea Miya, and Oliver Rossier and was based on the KIAS experience organizing our Around the World e-conferences.

What is at stake is the greening of research. We need to try to adapt different forms of video conferencing and live streaming to our conference/workshop needs in research. We need to depend less on F2F (face-to-face) conferences where everyone flies in. We need to confront the carbon costs of flights and how habituated we are to flying for research.