Watch Andy Warhol “Paint” On A Commodore Computer: Gothamist

At the Novel Worlds conference, Eric Hayot showed a slide with an image of Debbie Harry of Blondie painted by Andy Warhol on the Amiga. There is a video of Warhol painting her at the premiere of the Commodore Amiga.

This is discussed in the documentary The Invisible Photograph: Part 2 (Trapped), which also talks about recovering other images from Warhol’s original Amiga, preserved by The Andy Warhol Museum.

Technologizer has a nice retrospective on the Amiga, Amiga: 25 Years Later. I remember when it came out in 1985. I had a Mac by then, but was intrigued by the colour Amiga and the video work people were doing with it.

CSDH and CGSA 2018

This year we had busy CSDH and CGSA meetings at Congress 2018 in Regina. My conference notes are here. Some of the papers I was involved in include:

CSDH-SCHN:

  • “Code Notebooks: New Tools for Digital Humanists” was presented by Kynan Ly and made the case for notebook-style programming in the digital humanities.
  • “Absorbing DiRT: Tool Discovery in the Digital Age” was presented by Kaitlyn Grant. The paper made the case for tool discovery registries and explained the merger of DiRT and TAPoR.
  • “Splendid Isolation: Big Data, Correspondence Analysis and Visualization in France” was presented by me. The paper talked about FRANTEXT and correspondence analysis in France in the 1970s and 1980s. I made the case that the French were doing big data and text mining long before the Anglophone world was.
  • “TATR: Using Content Analysis to Study Twitter Data” was a poster presented by Kynan Ly, Robert Budac, Jason Bradshaw and Anthony Owino. It showed IPython notebooks for analyzing Twitter data.
  • “Climate Change and Academia – Joint Panel with ESAC” was a panel I was on that focused on alternatives to flying for academics.

CGSA:

  • “Archiving an Untold History” was presented by Greg Whistance-Smith. He talked about our project to archive John Szczepaniak’s collection of interviews with Japanese game designers.
  • “Using Salience to Study Twitter Corpora” was presented by Robert Budac, who talked about different algorithms for finding salient words in a Twitter corpus (a minimal sketch of one such measure follows this list).
  • “Political Mobilization in the GG Community” was presented by ZP, who talked about a study of a Twitter corpus that looked at the politics of the community.
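
As a minimal illustration of one such salience measure, here is a TF-IDF sketch in Python. TF-IDF is just one common approach and may or may not be among the algorithms Budac compared; the tweets are invented.

```python
# Hypothetical sketch: ranking salient words in a tiny Twitter corpus with TF-IDF.
# TF-IDF is only one possible salience measure; the tweets below are invented.
from sklearn.feature_extraction.text import TfidfVectorizer

tweets = [
    "the new game patch broke the servers again",
    "loving the new expansion, the art is gorgeous",
    "servers down for maintenance, patch notes soon",
]

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform(tweets)        # rows = tweets, columns = terms
terms = vectorizer.get_feature_names_out()

# Average TF-IDF weight per term across the corpus as a crude salience score.
scores = matrix.mean(axis=0).A1
for term, score in sorted(zip(terms, scores), key=lambda pair: -pair[1])[:5]:
    print(f"{term}\t{score:.3f}")
```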

Also, a PhD student I’m supervising, Sonja Sapach, won the CSDH-SCHN (Canadian Society for Digital Humanities) Ian Lancashire Award for Graduate Student Promise at CSDH-SCHN 2018 at Congress. The award “recognizes an outstanding presentation at our annual conference of original research in DH by a graduate student.” She won it for her paper “Tagging my Tears and Fears: Text-Mining the Autoethnography.” She is completing an interdisciplinary PhD in Sociology and Digital Humanities. Bravo Sonja!

Too Much Information and the KWIC

A paper that Stéfan Sinclair and I wrote about Peter Luhn and the Keyword-in-Context (KWIC) has just been published in the Fudan Journal of the Humanities and Social Sciences: Too Much Information and the KWIC. The paper is part of a series that replicates important innovations in text technology, in this case the development of the KWIC by Peter Luhn at IBM. We use that as a moment to reflect on the datafication of knowledge after WWII, drawing on Lyotard.
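
For readers unfamiliar with the format, a KWIC concordance lists every occurrence of a keyword centred in a window of surrounding words. Here is a minimal Python sketch of the idea; the sample sentence and window size are invented for illustration, and this is not the code discussed in the paper.

```python
# Minimal Keyword-in-Context (KWIC) sketch: print each occurrence of a keyword
# centred in a window of surrounding words. The sample text is invented.
def kwic(text, keyword, window=4):
    words = text.split()
    lines = []
    for i, word in enumerate(words):
        if word.lower().strip(".,;:!?") == keyword.lower():
            left = " ".join(words[max(0, i - window):i])
            right = " ".join(words[i + 1:i + 1 + window])
            lines.append(f"{left:>30} | {word} | {right}")
    return lines

sample = ("Luhn proposed the keyword in context index so that readers "
          "could scan each keyword along with its immediate context.")
for line in kwic(sample, "keyword"):
    print(line)
```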

An Evening with Edward Snowden on Security, Public Life and Research

This evening we are hosting a videoconference talk by Edward Snowden at the University of Alberta. These are some live notes taken during the talk, for which I was one of the moderators. Like all live notes, they will be full of misunderstandings.

Joseph Wiebe of Augustana College gave the introduction. Wiebe asked: what is the place of cybersecurity in public life?

“What an incredible time?” is how Snowden started, talking about the Cambridge Analytica and Facebook story. Technology is changing and connecting across borders. We are in the midst of the greatest redistribution of power in the history of humankind without anyone being asked for their vote or opinion. Large platforms take advantage of our need for human connection and turn our desires into a weakness. They have perfected the most effective system of control.

The revelations of 2013 were never just about surveillance; they were about democracy. We feel something has been neglected in the news and in politics. It is the death of influence. It is a system of manipulation by a cadre of the unaccountable that robs us of power. It works because it is largely invisible and is all connected to the use and abuse of our data. We are talking about power that comes from information.

He told us to learn from the mistake of 5 years ago and not focus too much on surveillance, but to look beyond the lever to those putting their weight on it.

Back to the problem of illiberal technologies. Information and control are meant to be distributed among the people. Change in surveillance technology has outstripped our democratic institutions. Powerful institutions are trying to get as much control of these technologies as they can before there is a backlash. It will be very hard to take that control back once everyone gets used to it.

Snowden talked about how Facebook was gathering all sorts of information from our phones. They (Facebook and Google) operate on our ignorance because there is no way we can keep up with changes in privacy policies. Governments are even worse with laws that allow mass surveillance.

There is an interesting interaction between governments, with China modelling its surveillance laws on those of the US. Governments seem to experiment with clearly illegal technologies and the courts don’t do anything. Everything is secret, so we can’t even know what is happening and make a decision.

What can we do when ordinary oversight breaks down and our checks and balances are bypassed? The public is left to rely on public resources like journalism and academia. We then depend on public facts, and governments can manipulate those facts.

This is the tragedy of our times. We are being forced to rely on the press. This press is being captured and controlled and attacked. And how does the press know what is happening? They depend on whistleblowers who have no protection. Governments see the press as a threat.  Journalists rank in the hierarchy of danger between hackers and terrorists.

What sort of world will we face when governments figure out how to manage the press? What will we not know without the press?

One can argue that extraordinary times call for extraordinary measures, but who gets to decide? We don’t seem to have a voice even through our elected officials.

National security is a euphemism. We are witnessing the construction of a world where the most common political value is fear. Everyone argues that we are living in danger, and that fear is used to control us. What is really happening is that morality has been replaced with legalisms. Rights have become a vulnerability.

Snowden disagrees. If we all disagree then things can change. Even in the face of real danger, there are limits to what should be allowed. Following Thoreau we need to resist. We don’t need a respect for the law, but for the right. The law is no substitute for justice or conscience.

Snowden would not be surprised if Facebook’s final defense is that “it’s legal.” But we need to ask if it is right. A wrong should not be turned into a right. We should be skeptical of those in power and the powers that shape our future. There are times in history and in our lives when the only possible decision is to break the law.

Cooking Up Literature: Talk at U of South Florida

Last week I presented a paper at the University of South Florida based on work that Stéfan Sinclair and I are doing. The talk, titled “Cooking Up Literature: Theorizing Statistical Approaches to Texts,” looked at a neglected period of French innovation in the 1970s and 1980s. During this period the French were developing a national corpus, FRANTEXT, while a school of exploratory statistics was developing around Jean-Paul Benzécri. While Anglophone humanities computing was concerned with hypertext, the French were using statistical methods like correspondence analysis to explore large corpora. This was long before Moretti and “distant reading.”
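
To give a sense of what correspondence analysis does, here is a minimal sketch applied to an invented word-by-text count table. This is not FRANTEXT data or code from the talk, just the standard computation (an SVD of the standardized residuals of a contingency table).

```python
# Minimal correspondence analysis sketch on an invented word-by-text count table.
# Illustration only; not the data or code from the talk.
import numpy as np

counts = np.array([        # rows = words, columns = texts (made-up frequencies)
    [12,  2,  5],
    [ 3, 15,  4],
    [ 7,  6, 14],
], dtype=float)

P = counts / counts.sum()                    # correspondence matrix
r = P.sum(axis=1)                            # row (word) masses
c = P.sum(axis=0)                            # column (text) masses
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))   # standardized residuals
U, sv, Vt = np.linalg.svd(S, full_matrices=False)

row_coords = (U * sv) / np.sqrt(r)[:, None]      # principal coordinates for words
col_coords = (Vt.T * sv) / np.sqrt(c)[:, None]   # principal coordinates for texts
print("word coordinates (dims 1-2):\n", np.round(row_coords[:, :2], 3))
print("text coordinates (dims 1-2):\n", np.round(col_coords[:, :2], 3))
```

Plotting the first two dimensions of the row and column coordinates together gives the familiar biplot in which words and texts that attract each other appear close together.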

The talk was organized by Steven Jones, who holds the DeBartolo Chair in Liberal Arts and is a Professor of Digital Humanities. Jones leads an NEH-funded project called RECALL that Stéfan and I are consulting on. He and colleagues at USF are creating a 3D model of Father Busa’s original factory/laboratory.

Social networks are creating a global crisis of democracy

[N]etworks themselves offer ways in which bad actors – and not only the Russian government – can undermine democracy by disseminating fake news and extreme views. “These social platforms are all invented by very liberal people on the west and east coasts,” said Brad Parscale, Mr. Trump’s digital-media director, in an interview last year. “And we figure out how to use it to push conservative values. I don’t think they thought that would ever happen.” Too right.

The Globe and Mail this weekend had an essay by Niall Ferguson on how Social networks are creating a global crisis of democracy. The article is based on Ferguson’s new book The Square and the Tower: Networks and Power from the Freemasons to Facebook. It points out that manipulation is not just an American problem, and argues that the real problem is our dependence on social networks in the first place.


Are Algorithms Building the New Infrastructure of Racism?

[Image: Robert Moses]

3quarksdaily, one of the better web sites for extracts of interesting essays, pointed me to an essay, Are Algorithms Building the New Infrastructure of Racism?, by Aaron M. Bornstein in Nautilus (Dec. 21, 2017). The article reviews some of the terrain covered by Cathy O’Neil’s book Weapons of Math Destruction, but it also points out how AIs are becoming infrastructure, and infrastructure with bias baked in is very hard to change, like the low bridges Robert Moses built to keep public transit out of certain areas of NYC. Algorithmic decisions that are biased but visible can be studied and corrected; decisions that get built into infrastructure disappear and become much harder to fix.

a fundamental question in algorithmic fairness is the degree to which algorithms can be made to understand the social and historical context of the data they use …

Just as important is paying attention to the data used to train the AIs in the first place. Historical data carries the biases of past generations, and those biases need to be questioned as they get woven into our infrastructure.

Plato’s Virtual Reality

From a note on Humanist I came across a fine essay on virtual reality, The Promise and Disappointment of Virtual Reality. It starts and ends with Plato’s cave and the responsibility of those freed from the cave to go back in and help others. Alas, the state of VR technology doesn’t yet seem good enough to free us from reality, and in this case the reality of VR is its commercialism.

But Plato’s Cave presupposes that those freeing the prisoner from their chains to reveal the true nature of “reality” are altruistic in their intent—that the world being shown the freed prisoners is indeed the truth. It is an allegory that does not allow for the world as it is today, or the pervasive desire to escape it.

The continued commercial failure of VR may represent an unconscious resistance to jettisoning our connection to the real. Maybe we are waiting for that blockbuster game to drive mass-market appeal. Perhaps the technology simply is not good enough yet to simulate a truly authentic—and profitable—experience. In this sense we are trapped. We crave authenticity of experience but, despite the efforts of philosophers, authors and auteurs, our imaginations appear limited to what we can individually consume and identify with. While capitalism lumbers on, we cannot see anything but the shadows on the wall.

What is nice about this essay by Mark Riboldi is the tour through the history of virtual reality technologies and dreams. What he doesn’t talk about is the sense of disappointment when the first generation of VR didn’t live up to the hype. I remember believing in VR in the 1990s (and lecturing on it). When it proved clunky and nausea-inducing, I felt let down by the technology. Perhaps I and others had dreamed too much into VR, led on by novels like Neuromancer. I was convinced VR was the logical next thing after the GUI. We had gone from a one-dimensional calligraphic screen to a two-dimensional desktop … wasn’t the three-dimensional virtual world next?

It is also worth mentioning that there have been a number of people writing about gender differences in how VR technology affects us. See Closing the Gender Gap in Virtual Reality. The technology seems to have been designed for men and calibrated to the male experience of reality.

Alice and Bob: the World’s Most Famous Cryptocouple

Alice and Bob is a web site and paper by Quinn DuPont and Alana Cattapan that nicely tells the history of the famous virtual couple used to explain cryptology.

While Alice, Bob, and their extended family were originally used to explain how public key cryptography works, they have since become widely used across other science and engineering domains. Their influence continues to grow outside of academia as well: Alice and Bob are now a part of geek lore, and subject to narratives and visual depictions that combine pedagogy with in-jokes, often reflecting of the sexist and heteronormative environments in which they were born and continue to be used. More than just the world’s most famous cryptographic couple, Alice and Bob have become an archetype of digital exchange, and a lens through which to view broader digital culture.
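
To give a sense of the kind of textbook exchange the couple was invented to narrate, here is a toy RSA sketch. The numbers are the standard tiny illustrative ones, deliberately insecure, and the code is mine, not from the paper or the site.

```python
# Toy RSA sketch: the sort of exchange Alice and Bob are used to explain.
# The numbers are tiny and insecure on purpose; this is pedagogy, not cryptography.
p, q = 61, 53                # Alice's secret primes
n = p * q                    # 3233, the public modulus
e = 17                       # public exponent; (n, e) is Alice's public key
d = 2753                     # private exponent: (e * d) % ((p - 1) * (q - 1)) == 1

message = 65                             # Bob's message, encoded as a number
ciphertext = pow(message, e, n)          # Bob encrypts with Alice's public key
decrypted = pow(ciphertext, d, n)        # Alice decrypts with her private key

assert decrypted == message
print("ciphertext:", ciphertext, "decrypted:", decrypted)
```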

The web site provides a timeline going back to 1978. The history is then explained more fully in the full paper (PDF). They end by talking about the gendered history of cryptography, mentioning other examples where images of women serve as standard test images, like the image of Lena from Playboy.

The design of the site nicely shows how a paper can be remediated as an interactive web site. It isn’t that fancy, but you can navigate the timeline and follow links to get a sense of this “couple”.

What It’s Like to Use an Original Macintosh in 2017 – The Atlantic

The Internet Archive’s new software emulator will take you back to 1984.

From Twitter again (channelled from Justin Trudeau) is a story in the Atlantic about the Internet Archive’s early Macintosh emulator: What It’s Like to Use an Original Macintosh in 2017. The emulator comes with a curated set of apps and games, including Dark Castle, which I remember my mother liking. (I was more fond of Déjà Vu.) Here is what MacPaint 2.0 looked like back then.

I’m amazed they can emulate the Mac OS in JavaScript. I’m also amazed at the community of people coming together to share old Mac software, manuals, and books with the IA.