Adventures in Science Fiction Cover Art: Disembodied Brains, Part I | Science Fiction and Other Suspect Ruminations

Gerard Quinn’s cover for the December 1956 issue of New Worlds

Thanks to Ali I came across this compilation of Adventures in Science Fiction Cover Art: Disembodied Brains. Joachim Boaz has assembled a number of pulp sci-fi covers showing giant brains. The giant brain was often the way computing was imagined; in fact, early computers were called giant brains.

Disembodied brains — in large metal womb-like containers, floating in space or levitating in the air (you know, implying PSYCHIC POWER), pulsating in glass chambers, planets with brain-like undulations, pasted in the sky (GOD!, surprise) above the Garden of Eden replete with mechanical contrivances among the flowers and butterflies and naked people… The possibilities are endless, and more often than not, taken in rather absurd directions.

I wonder if we can trace some of the early beliefs about computers through these images and stories of giant brains. What did we think the brain/mind was, such that a big one would have exaggerated powers? The reasoning would go something like this:

  • A brain is the seat of intelligence
  • The bigger the brain, the more intelligent
  • In big brains we might see emergent properties (like telepathy)
  • Scaling up the brain will give us an effective artificial intelligence

This is what science fiction does so well: it takes some aspect of current science or culture and scales it up to imagine the consequences. Scaling brains seems a bit literal, but the imagined futures are nonetheless important.

COVID-19 Notice on YouTube

COVID-19 Popup Notice on YouTube

When you go to YouTube now in Canada, a notice from the Public Health Agency of Canada pops up inviting you to Learn More from a reliable source. This strikes me as a great way to encourage people to get their information from a reliable source rather than wallow in fake news online. This is particularly true of YouTube, which is one of the facilitators of fake news.

More generally, it shows an alternative way that social media platforms can fight fake news on key issues: they can work with governments to put appropriate information before people.

Further, the Learn More link leads to a government site with a wealth of information and links. Had it just been a short feel-good message with a bit of advice, the site probably wouldn’t work to draw people towards reliable information. Instead the site has enough depth that one could spend a lot of time there and get a satisfying picture. This is what one needs to fight fake news in a time of obsession: plenty of true news for the obsessed.

Here’s the File Clearview AI Has Been Keeping on Me, and Probably on You Too – VICE

We used the California Consumer Privacy Act to see what information the controversial facial recognition company has collected on me.

Anna Merlan has an important story on VICE, Here’s the File Clearview AI Has Been Keeping on Me, and Probably on You Too (Feb. 28, 2020). She used the California Consumer Privacy Act to ask Clearview AI what information they kept on her and then to have it deleted. They asked her for a photo and proof of identification and eventually sent her a set of images and an index of where they came from. What is interesting is that they aren’t just scraping social media; they are also scraping other scrapers like Insta Stalkers and various right-wing sources that presumably have photos and stories about “dangerous intellectuals” like Merlan.

This brings back up the question of what is so unethical about face recognition and the storage of biometrics. We all have pictures of people in our photo collections, and Clearview AI was scraping public photos. Is it then the use of the images that is the problem? Is it the recognition and search capability?

When Coronavirus Quarantine Is Class Warfare

A pandemic offers a great way to examine American class inequities.

There have been a couple of important stories about the quarantine as symbolic of our emerging class structure. The New York Times has an opinion piece by Charlie Warzel, When Coronavirus Quarantine Is Class Warfare (March 6th, 2020).

That pleasantness is heavily underwritten by a “vast digital underclass.” Many services that allow you to stay at home work only when others have to be out in the world on your behalf.

The quarantine shows how many services are available to those of us whose intellectual work can be done online; it is as if we had been planning to be quarantined for years. It also shows how one class can isolate itself, but at the expense of a different class that handles all the inconveniences of material stuff and the physical encounters of living. We have the permanent jobs with benefits; they deliver the food and deal with the trash. We can isolate ourselves from disease; they have to risk disease to work. The gig economy has expanded the class of precarious workers that supports the rest of us.


The Prodigal Techbro

The journey feels fake. These ‘I was lost but now I’m found, please come to my TED talk’ accounts typically miss most of the actual journey, yet claim the moral authority of one who’s ‘been there’ but came back. It’s a teleportation machine, but for ethics.


Maria Farrell, a technology policy critic, has written a nice essay, The Prodigal Techbro. She sympathizes with tech bros who have changed their minds, in the sense of wishing them well, but feels that they shouldn’t get so much attention. Instead we need to care for those who were critics from the beginning and who really need the attention and care. She maps this onto the parable of the Prodigal Son: why does the son who was lost get all the attention? She makes it an ethical issue, which is interesting, and one I imagine fits an ethics of care.

She ends the essay with this advice to techies who are changing their minds:

So, if you’re a prodigal tech bro, do us all a favour and, as Rebecca Solnit says, help “turn down the volume a little on the people who always got heard”:

  • Do the reading and do the work. Familiarize yourself with the research and what we’ve already tried, on your own time. Go join the digital rights and inequality-focused organizations that have been working to limit the harms of your previous employers and – this is key – sit quietly at the back and listen.
  • Use your privilege and status and the 80 percent of your network that’s still talking to you to big up activists who have been in the trenches for years already—especially women and people of colour. Say ‘thanks but no thanks’ to that invitation and pass it along to someone who’s done the work and paid the price.
  • Understand that if you are doing this for the next phase of your career, you are doing it wrong. If you are doing this to explain away the increasingly toxic names on your resumé, you are doing it wrong. If you are doing it because you want to ‘give back,’ you are doing it wrong.

Do this only because you recognize and can say out loud that you are not ‘giving back’, you are making amends for having already taken far, far too much.

Reflecting on one very, very strange year at Uber

As most of you know, I left Uber in December and joined Stripe in January. I’ve gotten a lot of questions over the past couple of months about why I left and what my time at Uber was like. It’s a strange, fascinating, and slightly horrifying story that deserves to be told while it is still fresh in my mind, so here we go.

The New York Times has a short review of Susan Fowler’s memoir, Her Blog Post About Uber Upended Big Tech. Now She’s Written a Memoir. Susan Fowler is the courageous engineer who documented the sexism at Uber in a blog post, Reflecting on one very, very strange year at Uber — Susan Fowler. Her blog post from 2017 (the opening of which is quoted above) was important in that it drew attention to the bro culture in Silicon Valley. It also led to investigations within Uber and eventually to the ousting of co-founder and CEO Travis Kalanick.


It’s the (Democracy-Poisoning) Golden Age of Free Speech

And sure, it is a golden age of free speech—if you can believe your lying eyes. Is that footage you’re watching real? Was it really filmed where and when it says it was? Is it being shared by alt-right trolls or a swarm of Russian bots? Was it maybe even generated with the help of artificial intelligence?

There have been a number of stories bemoaning what has become of free speech. For example, WIRED has one titled It’s the (Democracy-Poisoning) Golden Age of Free Speech by Zeynep Tufekci (Jan. 16, 2020). In it she argues that access to an audience for your speech is no longer a matter of getting into centralized media; it is now a matter of getting attention. The world’s attention is managed by a very small number of platforms (Facebook, Google and Twitter) using algorithms that maximize their profits by keeping us engaged so they can sell our attention for targeted ads.


Eyal Weizman: The algorithm is watching you

The London Review of Books has a blog entry by Eyal Weizman, The algorithm is watching you (Feb. 19, 2020). Weizman, the founding director of Forensic Architecture, writes that he was denied entry into the USA because an algorithm had identified a security issue. He was going to the US for a show in Miami titled True to Scale.

Setting aside the issue of how the US government now seems to be denying entry to people who do inconvenient investigations, something a country that prides itself on human rights shouldn’t do, the use of an algorithm as the grounds for refusal is disturbing for a number of reasons:

  • As Weizman tells the story, the embassy officer couldn’t tell him what had triggered the algorithm. That would seem to violate important principles in the use of AI, namely that an AI used in making decisions should be transparent and able to explain why it made the decision. Perhaps the agency involved doesn’t want to reveal the nasty logic behind its algorithms.
  • Further, there is no recourse, which violates another principle for AI, namely that such systems should be accountable and that there should be mechanisms to challenge a decision.
  • The officer then asked Weizman to provide more information, like his travel over the last 15 years and his contacts, which he understandably declined to do. In effect the system was asking him to surveil himself and share the results with a foreign government. Are we going to be put in the situation where we have to surrender our privacy in order to get access to government services? We already do that for commercial services.
  • As Weizman points out, this shows the “arbitrary logic of the border” that is imposed on migrants. Borders have become grey zones where the laws inside a country don’t apply and the worst of a nation’s phobias are manifest.

PETER ROCKWELL Obituary

ROCKWELL, Peter Barstow. Sculptor, Scholar and Teacher, dies at 83. Died peacefully on February 6, 2020 in Danvers, MA.

My father passed away last Thursday, Feb. 6th. I’ve been gathering information and writing a short and a longer obituary. I’ve also been going through my father’s email, writing to people he was in touch with. In a strange way I feel I am rolling up his life.

My sister posted an obituary in the Boston Globe: PETER ROCKWELL Obituary – Boston, MA | Boston Globe. Interestingly, the Globe ran its own short article, Peter Rockwell, a sculptor and a son of Norman Rockwell, dies at 83.

What is touching are all the heartfelt condolences coming in from students, friends and colleagues who enjoyed his company and work.

Show and Tell at CRIHN


Stéphane Pouyllau’s photo of me presenting

Michael Sinatra invited me to a “show and tell” workshop at the new Université de Montréal campus where they have a long data wall. Sinatra is the Director of CRIHN (Centre de recherche interuniversitaire sur les humanités numériques) and kindly invited me to show what I am doing with Stéfan Sinclair and to see what others at CRIHN and in France are doing.
