I finally finished watching the BBC documentary series Can’t Get You Out of My Head by Adam Curtis. The series is hard to describe: it is cut entirely from archival footage, with Curtis’ voice interpreting and linking the diverse clips. The subtitle, “An Emotional History of the Modern World”, is apt in that the clips are often strangely affecting, but it doesn’t convey the broad social-political connections Curtis makes in the narration. He is trying out a set of theses about recent history in China, the US, the UK, and Russia leading up to Brexit and Trump. I’m still digesting the six-part series, but here are some of the threads of his theses:
Conspiracies. He traces our fascination with, and now belief in, conspiracies back to a 1967 memo by Jim Garrison about the JFK assassination. The memo, Time and Propinquity: Factors in Phase I, presents the results of an investigative technique built on finding patterns of linkage between fragments of information. When you find strange coincidences, you then weave a story (conspiracy) to join them, rather than starting with a theory and checking the facts. This reminds me of what software like Palantir does – it makes (often coincidental) connections easy to find so you can tell stories. Curtis later follows the evolution of conspiracies as a political force, leading to liberal conspiracies about Trump (that he was a Russian agent) and alt-right conspiracies like QAnon. We are all willing to surrender our independence of thought for the joys of conspiracy.
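The “propinquity” style of investigation can be sketched in a few lines of Python. This is purely illustrative (the fragments, people, and places are invented): it surfaces every coincidental link between fragments of information, leaving the investigator to weave a story around whatever turns up.

```python
from itertools import combinations

# Invented "fragments" of information: who, where, when.
fragments = [
    {"id": 1, "person": "A", "place": "New Orleans", "year": 1963},
    {"id": 2, "person": "B", "place": "New Orleans", "year": 1963},
    {"id": 3, "person": "C", "place": "Dallas", "year": 1963},
    {"id": 4, "person": "B", "place": "Dallas", "year": 1955},
]

def propinquity_links(fragments):
    """Return every pair of fragments that shares an attribute value."""
    links = []
    for a, b in combinations(fragments, 2):
        shared = {k for k in ("person", "place", "year") if a[k] == b[k]}
        if shared:
            links.append((a["id"], b["id"], shared))
    return links

for a_id, b_id, shared in propinquity_links(fragments):
    print(f"fragments {a_id} and {b_id} coincide on {shared}")
```

Even these four made-up fragments yield five “links”; with thousands of fragments, almost everything connects to something, which is exactly the seduction of the technique.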
Big Data Surveillance and AI. Curtis connects this new mode of investigation to what big data platforms like Google now do with AI. They gather lots of fragments of information about us and then a) use it to train AIs, and b) sell inferences drawn from the data to advertisers, all while keeping us anxious through the promotion of emotional content. Big data can deal with the complexity of a world we have given up trying to control. It promises to manage the complexity of the fragments by finding patterns in them. This reminds me of discussions around the End of Theory and the shift from theories to correlations.
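The shift from theories to correlations has a well-known catch that a small, self-contained Python experiment can make concrete (illustrative only, no real data): generate enough unrelated random series and you will reliably find pairs that correlate strongly by pure chance – patterns with no theory behind them.

```python
import random

random.seed(0)

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# 200 completely unrelated random series of 10 points each
series = [[random.random() for _ in range(10)] for _ in range(200)]

# Scan all ~20,000 pairs for the strongest "pattern"
best = max(
    (abs(pearson(series[i], series[j])), i, j)
    for i in range(len(series))
    for j in range(i + 1, len(series))
)
print(f"strongest 'pattern': |r| = {best[0]:.2f} "
      f"between series {best[1]} and {best[2]}")
```

The strongest pair will typically correlate at |r| above 0.8 despite being noise, which is why mining fragments for patterns, without a theory to check them against, so easily produces convincing stories.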
Psychology. Curtis also connects this to emerging psychological theories about how our minds may be fragmented, with different unconscious urges moving us. Psychology then offers ways to figure out what people really want and to nudge or prime them. This is what Cambridge Analytica promised – an ability we believed in partly because of our taste for conspiracy theories. Curtis argues at the end that behavioural psychologists have been unable to replicate many of the experiments undergirding nudging. He suggests that all this big data manipulation doesn’t actually work, though the platforms can heighten our anxiety and emotional stress. A particularly disturbing section of the last part discusses how, after 9/11, the US developed “enhanced” torture techniques based on these ideas to create “learned helplessness” in prisoners. The idea was to fragment their consciousness so that they would release a flood of fragments, some of which might be useful intelligence.
Individualism. A major theme is the rise of individualism since the war and how individuals are controlled. China’s social credit model of explicit control through surveillance is contrasted with the West’s consumer-driven platform surveillance. Either way, Curtis’ conclusion seems to be that we need to regain confidence in our own individual power to choose our future and strive for it. We need to stop letting others control us with fear or distract us with consumption. We need to choose our future.
In some ways the series is a plea for everyone to make up their own stories from their fragmentary experience. The series starts with this quote:
The ultimate hidden truth of the world is that it is something we make, and could just as easily make differently. (David Graeber)
Of course, Curtis’ series could just be a conspiracy story that he wove out of the fragments he found in the BBC archives.
I recently finished listening to James Gleick’s Time Travel: A History. Gleick wrote the best book there is on information, The Information, and this book is almost as good. He weaves the science together with the fictions about time travel, starting with H.G. Wells’ The Time Machine and using it to look at how science began treating time as a dimension, which in turn allowed us to seriously talk about travelling along that dimension. It is historical ontology done really well.
Near the end he talks about the brilliant Doctor Who episode Blink, in which Carey Mulligan’s character has a conversation with the Doctor (played by David Tennant) mediated by Easter eggs on DVDs, which she transcribes onto paper. She hands that transcript to the Doctor at the end of the episode so that, in the past, he can record the video onto the DVDs she will later converse with. It is brilliant.
Part of what I like about Gleick is that he shows the connections between science and how we imagine ideas like time through literature and film. He ends by suggesting that we already have time travel, in our stories and imagination.
It might be fair to say that all we perceive is change—that any sense of stasis is a constructed illusion. Every moment alters what came before. We reach across layers of time for the memories of our memories.
“Live in the now,” certain sages advise. They mean: focus; immerse yourself in your sensory experience; bask in the incoming sunshine, without the shadows of regret or expectation. But why should we toss away our hard-won insight into time’s possibilities and paradoxes? We lose ourselves that way. (Gleick, James. Time Travel, p. 308)
Mitsuhiro Asakawa finally convinced Tsuge and his son to let the work be translated into English. Asakawa is the unsung hero of Japanese comics translation: he has written the most about the Garo era, and he is the go-to person for connecting with these great authors and their families. Most of the collections D+Q have done wouldn’t exist without his help.
Part of what is interesting in the essay is how Salvador documents the different views about what such simulations were good for. SimRefinery was not an accurate simulation that could cover the complexity of the chemical engineering of a refinery, so what was it good for? Chevron apparently wanted something to help staff who weren’t engineers understand some of the connectedness of a refinery – how problems in one area could impact others. Will Wright, the genius behind Maxis, didn’t think serious simulations were possible, or something Maxis wanted to do. He saw SimCity as a caricature that was fun; at best it might give people a “mental model” of the issues around city management. It was for that reason that MBS was spun off, designed to contract with businesses that felt serious simulations were feasible and useful.
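That kind of connectedness – an upset in one unit rippling through the others – is simple to sketch. Here is a toy Python model (the unit names and their links are invented, not taken from SimRefinery) of the sort of “mental model” such a simulation might give non-engineers:

```python
# Invented refinery units and who feeds whom (downstream links)
downstream = {
    "crude_unit": ["reformer", "cracker"],
    "reformer": ["blender"],
    "cracker": ["blender"],
    "blender": [],
}

def affected_by(unit, graph):
    """Return every unit reachable downstream of an upset at `unit`."""
    seen, stack = set(), [unit]
    while stack:
        u = stack.pop()
        for v in graph[u]:
            if v not in seen:
                seen.add(v)
                stack.append(v)
    return seen

print(affected_by("crude_unit", downstream))
```

The point of such a model is not engineering accuracy but intuition: a problem at the crude unit touches everything downstream, while a problem at the blender touches nothing else.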
The era of peak globalisation is over. For those of us not on the front line, clearing the mind and thinking how to live in an altered world is the task at hand.
John Gray has written an essay in the New Statesman on Why this crisis is a turning point in history. He argues that the era of hyperglobalism is at an end and that many systems may not survive the shift to something different. Many may think that, after a bit of isolated pain, we will return to the good old days of expanding wealth, but the economic crisis now emerging may break that dream. Governments and nations may be broken by collapsing systems.
The academic study of literature is no longer on the verge of field collapse. It’s in the midst of it. Preliminary data suggest that hiring is at an all-time low. Entire subfields (modernism, Victorian poetry) have essentially ceased to exist. In some years, top-tier departments are failing to place a single student in a tenure-track job.
The Chronicle Review has released a free collection, Endgame: Can Literary Studies Survive? (PDF). Endgame is a collection of short essays about the collapse of literary studies in the US. The same is probably true of the other fields in the interpretative humanities and social sciences. This collection gives a human face to the important (and depressing) article Benjamin Schmidt wrote in The Atlantic about the decline in humanities majors since 2008, The Humanities Are In Crisis.
The Computer Literacy Project, on the other hand, is what a bunch of producers and civil servants at the BBC thought would be the best way to educate the nation about computing. I admit that it is a bit elitist to suggest we should laud this group of people for teaching the masses what they were incapable of seeking out on their own. But I can’t help but think they got it right. Lots of people first learned about computing using a BBC Micro, and many of these people went on to become successful software developers or game designers.
I’ve just discovered Two-Bit History (0b10), a series of long and thorough blog essays on the history of computing by Sinclair Target. One essay, Codecademy vs. The BBC Micro, gives the background of the BBC Computer Literacy Project that led the BBC to commission a suitable microcomputer, the BBC Micro. Target uses this history to compare the way the BBC literacy project taught a nation (the UK) computing with the way Codecademy does now. The BBC project comes out better because it doesn’t drop immediately into programming without explanation, something Codecademy does.
I should add that the early 1980s was a period when many constituencies developed their own computer systems, not just the BBC. In Ontario the Ministry of Education launched a process that led to the ICON which was used in Ontario schools in the mid to late 1980s.
Fifty years ago, on October 29th, 1969, the first two nodes of the ARPANET are supposed to have connected. There are, of course, all sorts of caveats, but it seems to have been one of the first times someone remotely logged in from one location to another on what became the internet. Gizmodo has an interview with Bradley Fidler on the history that is worth reading.
Remote access was one of the reasons the internet was funded by the US government. They didn’t want to give everyone their own computer; instead, the internet (ARPANET) would let people use others’ computers remotely (see Hafner & Lyon 1996).
Like many, I learned to program multimedia in HyperCard. I even ended up teaching it to faculty and teachers at the University of Toronto. It was a great starting development environment, with its mix of graphical tools, hypertext tools, and a verbose programming language. Its only (and major) flaw was that it wasn’t designed to create networked information: HyperCard stacks had to be passed around on disks. The web made possible a networked hypertext environment that solved the distribution problems of the 1980s. One wonders why Apple (or someone else) doesn’t bring it back in an updated, networked form. I guess that is what the Internet Archive is doing.