The Mind Tool: Edward Vanhoutte’s Blog

Edward Vanhoutte, who has done some of the best work on the history of humanities computing (though much of it is not yet published), has started a blog. In his first entry, The Mind Tool: Edward Vanhoutte’s Blog, he summarizes the early textbooks used to teach humanities computing. It would be interesting to compare how these 1970s and 80s books conceive of the computer with the 1950s and 60s work of people like Booth.

Recipes and Generative Codes

Christopher Alexander of Pattern Language fame has developed a second generation of pattern language that focuses on process rather than outcome.

A generative sequence may be thought of as a second generation pattern language. (From the Pattern Language Website)

Reading around the web sites I am struck by how Alexander has woven computing into his second generation ideas. Could it be that Alexander was influenced by the way computer scientists responded to his pattern language ideas?

Now a sequence is something that looks very very simple and is actually very very difficult. It’s more than a pattern; it’s an algorithm about process. But what is possible is to write sequences so that they are easy. You follow the steps in a sequence like you follow the steps in a cooking recipe. (From A Just So Story)

Reading A Just So Story (subtitle “How got its name”) suggests to me that the way the computing community took to his ideas led Alexander to think about processes and code. In his 1996 OOPSLA address, The Origins of Pattern Theory, he calls the overlap of ideas “a deeper coincidence in what you are doing in software design and what I am doing in architectural design”.

It is also worth noting how Alexander describes generative sequences as recipes (which we have been using to help people understand text analysis):

After all, every recipe is a sequence of steps. Is a generative sequence anything more than a series of steps like a recipe for cake or omlets. (From Uniqueness of Generative Sequences)

Elsewhere he talks about unfolding and recipes synonymously.

I think there is an interesting thread to pursue: criticizing Alexander’s sometimes naive mysticism while also experimenting with applying his ideas to methods in the textual disciplines. Patterns and recipes are evocative and useful; I’m just not sure I buy the Heideggerian philosophy that Alexander thinks they are grounded in.

The web materials are, to make matters worse, confusing to browse. (They are filed under “method”, to begin with.) Alexander has another site, Building Living Neighborhoods, which is much better organized and is aimed at the neighborhood activist. It illustrates what he is talking about better than the home site does.

I need to say something about Alexander’s site and books. They are poorly designed and undermine his message. If he followed a process for developing his web site, I would call it “use frames when you don’t need them, use tables within tables so that people can’t help but see them, and randomly add things when you think of them.” Consistent navigation or design is not a priority. The second series of books, The Nature of Order, published by The Center for Environmental Structure, also suffers. At CAD $100 a book it feels cheap, and the images reproduced often look like they were scanned from newspapers. If you look at the nested boxes on the cover (click on image) of this second series you can see how attached he is to coloured tables and boxes. If you look closely at the cover you can also see how sloppy it is. I wish Alexander and the Center would practice in web and book design what they preach for architecture. These ideas are too important (and too close to mysticism) to be tainted by cheap and amateur design.

As Alexander puts it in a strange misspelling about the shift from patterns to sequences,

In fact, both A Pattern Language and The Timeless Way Of Building say that the pattern language is to be used sequentially. In practice, however, this feature dropped out of site, and was not emphasized in use. (From The Origins of Pattern Theory)

Did it drop out of “sight” or out of the “site”?

The Walrus » Driven to Distraction

The Walrus has a story, Driven to Distraction, by John Lorinc (April 2007), about new research showing how always-on communications technology is distracting us. The article references a research site with links to papers on interruption, which seems to be emerging as a focus for cognitive studies and HCI. See, for example, the reports by Daniel McFarlane, who defines interruption thus:

Human interruption is the process of coordinating abrupt change in people’s activities. (Interruption of People in Human-Computer Interaction: A General Unifying Definition of Human Interruption and Taxonomy, PDF)

Interruptions are not necessarily bad; there is evidence (the Zeigarnik Effect) that we are better at remembering incomplete or interrupted tasks than finished ones. But when interruption becomes the normal state we lose the ability to finish anything and become addicted to perpetual interruption. Interruption is the fundamental possibility of interactivity; it is the cutting into the continuum that surprises us.

Bemer and the History of Computing

The History of Computing Project is another collection of timelines and biographies, sponsored by computer museums in Holland, Poland, and elsewhere. There are some gaps, like the empty biography of Bill Atkinson and a history of Apple that is “withdrawn for revision”. It is, however, cleanly designed and covers a lot.

Some of the information is useful, like the biography of Bob Bemer, who, among other things, contributed the ESCape key and worked on ASCII at IBM. (See CNN – 1963: The debut of ASCII – July 6, 1999, or the archive of Bob Bemer‘s personal site – he has passed away.) Thanks to Matt for this.

RAMAC and Interactivity: Pictorial History of Media Technology

Pictorial History of Media Technology is a slide show history of computing and media, especially video technology. It is on a site dedicated to “Capacitance Electronic Discs or CED’s, a consumer video format on grooved vinyl discs that was marketed by RCA in the 1980’s.” The slide show has pictures of the IBM 305 RAMAC Computer with what was the first disk drive in production. What’s so important about the RAMAC?

Matthew G. Kirschenbaum, in a blog entry, An Excerpt from Mechanisms, Professor RAMAC, and in an article for Text Technology, Extreme Inscription: Towards a Grammatology of the Hard Drive, argues that,

Magnetic disk media, more specifically the hard disk drive, was to become that technology and, as much as bitmapped-GUIs and the mouse, usher in a new era of interactive, real-time computing.

Kirschenbaum is right that interactivity wouldn’t be possible without random access storage, and he takes this in an interesting direction around inscription. I look forward to his book.

High Performance Computing

What is high performance computing?

On Wednesday I was at a meeting to discuss the National Platforms program, which is part of the new CFI programs. Here are the details of the proposed program:

National Platforms Fund (NPF)

The National Platforms Fund provides generic research infrastructure, resources, services, and facilities that serve the needs of many research subjects and disciplines, and that require periodic reinvestments because of the nature of the technologies. The Fund is established to deal first with High Performance Computing, but may be applicable in other cases.

Working with the HPC folks raises interesting questions about what HPC is and whether it has applications in the Humanities.

Mahoney on the History of Theory in Computer Science

I’m reading an article by Michael S. Mahoney in The First Computers. It is titled “The Structures of Computation” and Mahoney (who is one of the best historians of computing I have read – see my previous entry, History of Computing) makes a closing point,

The history of science has until recently tended to ignore the role of technology in scientific thought, … The situation has begun to change with recent work on the role and nature of the instruments that have mediated between scientists and the objects of their study, … But, outside of the narrow circle of people who think of themselves as historians of computing, historians of science (and indeed of technology) have ignored the instrument that by now so pervades science and technology as to be indispensable to their practice. Increasingly, computers not only mediate between practitioners and their subjects but also replace the subjects with computed models. … Some time soon, historians are going to have to take the computer seriously as an object of study, and it will be important, when they do, that they understand the ambiguous status of the computer itself. (p. 31)

I would go further and say that not only historians, but philosophers, and for that matter other humanities disciplines, are going to have to take seriously the ambiguous nature of the computer as instrument and extension in all knowledge disciplines.

Women in Computer Science

Undergraduate Women in Computer Science: Experience, Motivation and Culture is a report on a study of women in computer science at Carnegie Mellon. While it is only a preliminary report, it strikes me as balanced and interesting. Their initial findings include some reflections on what got men and women into CS: a number of male students talked about the computer as a toy or game that they got caught up playing with in an undirected way. Female students, by contrast, commented on what they wanted to do with computing.

Error Correction

In a paper I gave in Georgia I picked up on a comment by Negroponte in Being Digital to the effect that error correction is one of the fundamental advantages of digital (vs analog) data. Automatic error correction makes lossless copying and transmission possible. Digital Revolution (III) – Error Correction Codes is the third in a set of Feature Column essays on the “Digital Revolution.” (The other two are on Barcodes and Compression Codes and Technologies.)
To exaggerate, we can say that error correction makes computing possible. Without error correction we could not automate computing reliably enough to use it outside the lab. Something as simple as moving data off a hard-drive across the bus to the CPU can only happen at high speeds and repeatedly if we can build systems that guarantee what-was-sent-is-what-was-got.
There are exceptions, and here is where it gets interesting. Certain types of data – media data such as images, audio, video, and text – can still be useful when corrupted, while other data becomes useless if corrupted. Data that is meant for output to a human for interpretation needs less error correction (and can be compressed using lossy compression) while still remaining usable. Could such media have a surplus of information from which we can correct for loss – an analog equivalent to symbolic error correction?
Another way to put this is that there is always noise. Data is susceptible to noise when transmitted, when stored, and when copied.
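To make the idea concrete, here is a minimal sketch (my own illustration, not drawn from the essays linked above) of a Hamming(7,4) code, one of the simplest error-correcting codes: three parity bits are added to four data bits, and the receiver can locate and flip back any single bit corrupted by noise.

```python
def encode(data):
    """Encode 4 data bits as a 7-bit Hamming(7,4) codeword."""
    d1, d2, d3, d4 = data
    p1 = d1 ^ d2 ^ d4          # parity over positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # parity over positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # parity over positions 5, 6, 7
    # codeword layout, positions 1..7: p1 p2 d1 p3 d2 d3 d4
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(word):
    """Correct up to one flipped bit, then return the 4 data bits."""
    p1, p2, d1, p3, d2, d3, d4 = word
    s1 = p1 ^ d1 ^ d2 ^ d4
    s2 = p2 ^ d1 ^ d3 ^ d4
    s3 = p3 ^ d2 ^ d3 ^ d4
    error_pos = s1 + 2 * s2 + 4 * s3   # syndrome: 0 means no single-bit error
    if error_pos:
        word = word[:]
        word[error_pos - 1] ^= 1       # flip the corrupted bit back
    return [word[2], word[4], word[5], word[6]]

data = [1, 0, 1, 1]
sent = encode(data)
received = sent[:]
received[4] ^= 1                       # noise flips one bit in transit
assert decode(received) == data        # the receiver still recovers the data
```

This is the sense in which what-was-sent-is-what-was-got can be guaranteed: as long as no more than one bit per codeword is corrupted, the copy is provably identical to the original, which is exactly what lossless digital copying depends on.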