Rebooting Computing Manifesto

On the subject of manifestos, one of my students pointed me to a project Peter Denning is leading that has published a Rebooting Computing Manifesto. The project is sponsored by the (US) National Science Foundation and aims to reinvigorate computer science in the face of dramatic drops in enrollment.

It is a time of challenges for the computing field. We are tired of hearing that a computing professional is little more than a program coder or a system administrator; or that a college or graduate education is unnecessary; or that entering the computing field is a social death. We are dismayed that K-12 students, especially girls, have such a negative perception of computing. We are alarmed by reports that the innovation rate in our field has been declining and that enrollments in our degree programs have dropped 50% since 2001. Instead of the solo voice of the programmer, we would like to hear from the choir of mathematicians, engineers, and scientists who make up the bulk of our field.

I like how this is articulated as a challenge. I also like the can-do approach of gathering and coming up with ideas.

The Pool

Screen shot of The Pool

The Pool is a project from the University of Maine new media group Still Water, who also developed ThoughtMesh. It is a collaboration between faculty and students that provides a visual space where ideas can be described (intent), approached, and released. (The metaphor is catching and releasing fish.) It encourages sharing, rating, and redevelopment of ideas. They have pools for code and for art.

The Pool offers a very different message. This online environment is an experiment in sharing art, text, and code – not just sharing digital files themselves, but sharing the process of making them. In place of the single-artist, single-artwork paradigm favored by the overwhelming majority of studio art programs and collection management systems, The Pool stimulates and documents collaboration in a variety of forms, including multi-author, asynchronous, and cross-medium projects.

The Chronicle of Higher Education, in “New-Media Scholars’ Place in ‘the Pool’ Could Lead to Tenure” (Andrea L. Foster, May 30, 2008, Volume 54, Issue 38, Page A10), discusses The Pool as an alternative form of peer review for getting tenure, which rather misses the point for me. What impresses me about this is the collaboration between students and faculty in experimentation around structured collaboration. The Pool could dry up, and some of the code pools seem rather poorly stocked, but that wouldn’t detract from what seems like thoughtful and sustained experimentation with social collaboration. The wiki, Flickr, Facebook and blog models of Web 2.0 social knowledge dominate our thinking about what is possible. The Pool reminds me that we don’t have to adapt successful models; there is room for new ideas. Catch this and release it.

NiCHE: The Programming Historian

NiCHE logo

NiCHE (Network in Canadian History & Environment) has a useful wiki called The Programming Historian by William Turkel and Alan MacEachern. The wiki is a “tutorial-style introduction to programming for practicing historians”, but it could also be used by textual scholars who want to be able to program their own tools. It takes you through learning and using Python for text processing, for things like word frequencies and KWIC (keyword-in-context) displays. It reminds me of Susan Hockey’s book Snobol Programming for the Humanities (Oxford: Oxford University Press, 1985), which I loved at the time, even if I couldn’t find a Snobol interpreter for the Mac.
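To give a taste of the kind of exercise the wiki works through, here is a minimal sketch of my own (not code from the tutorial) that counts word frequencies and prints a simple KWIC display:

import re
from collections import Counter

def tokenize(text):
    # Lowercase the text and split it into word tokens.
    return re.findall(r"[a-z']+", text.lower())

def kwic(tokens, keyword, context=5):
    # Print each occurrence of keyword with a few words on either side.
    for i, token in enumerate(tokens):
        if token == keyword:
            left = " ".join(tokens[max(0, i - context):i])
            right = " ".join(tokens[i + 1:i + 1 + context])
            print(f"{left:>40}  {token}  {right}")

tokens = tokenize(open("text.txt").read())  # "text.txt" stands in for your own file
print(Counter(tokens).most_common(10))      # the ten most frequent words
kwic(tokens, "history")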

We need more of such books/wikis.

Now, Analyze That: An Experiment in Text Analysis

Image from Visual Collocator

Stéfan Sinclair and I have just finished writing up an essay from an extreme text analysis session, Now, Analyze That. It is, first of all, a short essay comparing Obama’s and Wright’s recent speeches on race. The essay reports on what we found in a two-day experiment using our own tools, and it has interactive handles woven in that let you recapitulate our experiment.

The essay was written in order to find a way of writing interpretative essays that are based on computer-assisted text analysis and that exhibit their evidence appropriately without ending up being all about the tools. We are striving for a rhetoric that doesn’t hide text analysis methods and tools, but is still about interpretation. Having both taught text analysis, we have found that there are few examples of short, accessible essays that are about something other than text analysis and still show how text analysis can help. The analysis either colonizes the interpretation or it is hidden and hard for students and others to recapitulate. Our experiments are therefore attempts to write such essays and document the process from conception (coming up with what we want to analyze) to online publication.
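To give a concrete sense of what such an analysis can involve, here is a back-of-the-envelope sketch in Python (my own illustration, not the TAPoR tools we actually used; the file names are hypothetical) that lists the words whose relative frequencies differ most between two speeches:

import re
from collections import Counter

def relative_frequencies(path):
    # Each word's share of all the tokens in the file.
    tokens = re.findall(r"[a-z']+", open(path).read().lower())
    counts = Counter(tokens)
    return {word: n / len(tokens) for word, n in counts.items()}

obama = relative_frequencies("obama_speech.txt")    # hypothetical file name
wright = relative_frequencies("wright_speech.txt")  # hypothetical file name

# Sort the combined vocabulary by the gap in relative frequency.
words = sorted(set(obama) | set(wright),
               key=lambda w: abs(obama.get(w, 0) - wright.get(w, 0)),
               reverse=True)
for word in words[:10]:
    print(word, obama.get(word, 0), wright.get(word, 0))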

Doing the analysis in a pair, where one of us did the analysis and the other documented and directed, was a discovery for me. You really do learn more when you work in a pair and force yourself to take roles. I’m intrigued by how agile programming practices can be applied to humanities research.

This essay comes out of our second experiment. The first wasn’t finished because we didn’t devote enough time together to it (we really need about two days, and that doesn’t include writing up the essay). There will be more experiments, as the practice of working together has proven a very useful way to test the TAPoR Portal and to think through how tools can support research through the whole life of a project, from conceptualization to publication. I suspect that as we try different experiments we will be changing the portal and the tools. Too often tools are designed for the exploratory stage of research instead of the whole cycle, right to where you write an essay.

You can, of course, use the same tools we used on the essay itself. At the bottom of the left-hand column there is an Analysis Tool bar that gives you tools that will run on the page.

T-REX: TADA Research Evaluation Exchange

T-REX logo

Stéfan Sinclair of TADA has put together an exciting evaluation exchange competition, T-REX | TADA Research Evaluation Exchange. This came out of discussions with Steve Downie about MIREX (Music Information Retrieval Evaluation eXchange), with the SHARCNET folk, and then with DHQ. The initial idea is to run a competition for ideas for TAPoR tools, and then to migrate to a community evaluation exchange where we agree on challenges and then compare and evaluate different solutions. We hope this will be a way to move tool development forward and get recognition for it.

Thanks to Open Sky Solutions for supporting it.

Brooks: The Mythical Man-Month

Adding manpower to a late software project makes it later. (Brooks’s Law, p. 25)
The Mythical Man-Month by Frederick P. Brooks Jr. was first published in 1975, but it still reads easily and wisely. At its heart is an understanding of programming as a human task, one that requires communication when done in teams. As Brooks puts it, “Men and months are interchangeable commodities only when a task can be partitioned among many workers with no communication among them.” (p. 16) For this reason you can’t just throw more people at a project to get it finished faster. Some other quotes:

The bearing of a child takes nine months, no matter how many women are assigned. Many software tasks have this characteristic because of the sequential nature of debugging. (p. 17)

Even at this late date, many programming projects are still operated like machine shops so far as tools are concerned. Each master mechanic has his own personal set, collected over a lifetime and carefully locked and guarded – the visible evidence of personal skills. Just so, the programmer keeps little editors, sorts, binary dumps, disk space utilities, etc., stashed away in his file.

Such an approach, however, is foolish for the programming project. (p. 128)

A computer program is a message from a man to a machine. The rigidly marshaled syntax and the scrupulous definitions all exist to make intention clear to the dumb engine.

But a written program has another face, that which tells its story to the human user. (p. 164)

building software will always be hard. There is inherently no silver bullet. (p. 182)

The complexity of software is an essential property, not an accidental one. Hence descriptions of a software entity that abstract away its complexity often abstract away its essence. (p. 183)
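The arithmetic behind these warnings is worth spelling out: Brooks observes that if each part of a task must be coordinated with each other part, the intercommunication effort grows as n(n-1)/2 for n workers, so the coordination cost grows much faster than the labour added. A toy illustration in Python (my sketch, obviously; Brooks gives no code):

def communication_paths(n):
    # Pairwise communication channels in a team of n people (n choose 2).
    return n * (n - 1) // 2

for n in (2, 5, 10, 20):
    print(n, "people:", communication_paths(n), "channels")
# 2 people share 1 channel; 20 people need 190.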

In the chapter “The Surgical Team” he proposes that programming be done in a small team, organized like a surgical team around an experienced programmer. The team would have a Chief Programmer, Copilot, Administrator, Editor, Secretaries, Program Clerk, Toolsmith, Tester and Language Lawyer. The Editor would help document code. The Copilot would work closely with the programmer, discussing design with her – which sounds like extreme programming. The Toolsmith would make sure the computers and tools are running right. The Language Lawyer would be the resource for the programming language.

Bubbles, Fluid and Prism: Site Specific Browsers

Bridging Desktop And Web Applications, Part 2 is a longish post about Site Specific Browser technologies like Prism, Bubbles and Fluid. I blogged about Prism before; Peter O sent me this new link and one to Why We Need Web Apps on the Desktop.

It would seem that with always-on networking it becomes feasible to create applications that combine a customized browser with a server application.

The Spectator’s View of Web Standards

One of my favourite software writers/bloggers is Joel Spolsky: he is thoughtful, funny, and knows how to tell a story. Yesterday he posted a longer-than-usual disquisition on the upcoming web-standards smackdown that will follow on the heels of the release of Internet Explorer 8.

My sympathies tend to fall with the standards purists (though the need to deliver a product forces me to appreciate compromise): I find the elegance of good abstraction irresistible, and standards-compliant design makes for more stable, comprehensible, editable and elegant sites (from the perspective of the developer, that is: I’m saying nothing about how anything looks to the actual eye…). And there’s a large and vocal community that shares this attitude. The nagging voice of reason, however (and I am only assuming it is the voice of reason; I haven’t mentioned this to a psychiatric professional), does frequently ask “Is this semantic markup?” The practical distinction between ‘presentation’ and ‘logic’ only looks clear from the periphery; the middle ground is big and grey and muddy.

So, Joel’s remarks on the casual meaning of ‘standards’ when applied to web development are, I think, appropriate, and his story illustrating the history of incremental standards compromise in the service of progress is undeniable (except, perhaps, to a fanatical idealist). His pragmatic arguments – 1) that there is no practical web-standard benchmark against which to measure browser compliance, 2) that the expression of standards specifications in W3C documentation is frequently impenetrable, and 3) that Microsoft, like any other company, has to maintain the good will of its existing customers by supporting legacy products and document formats in new products – are all well argued and substantially acceptable. It is almost enough to make me feel some sympathy for Microsoft. Almost.

Of course, talking about IE is not quite like talking about Word, where the evolution of the document format is bound to the product alone; any web developer will ask why so many fewer discrepancies are found on a first test of a site architecture between Firefox, Safari, and Opera than between any of these and IE6 (indeed, a measure of the improvement in standards compliance of IE7 is that there may now be more discrepancy between IE6 and IE7 than between IE7 and the other major browsers). Surely at least some of the blame for the whole fracas with respect to IE and the rise of web standards fanaticism rests with Microsoft’s historical unwillingness to accept any general standards not of its own making. (Witness ODF vs OOXML as just one example.)

I’ll stop there and leave the flaming for other, more capable participants. In the end, one can’t really disagree with Joel’s point that the demand by compliance fanatics within Microsoft (I know, the very idea of their existence leaves me a little breathless) that IE8 be so rigid in its adherence to standards-based code that only 37% (or whatever number…) of existing web pages will render accurately is just silly. The plea one wants to make is for the middle path: too much unpredictability in a platform will hinder development, and so will too much inflexibility; the question is “how much is too much?”. We complain about caprice in the rendering decisions of various browsers (some more than others), but it is almost certainly a good thing that we are required to reinvent from time to time; the human impulse is to improvise and the best measure of our ingenuity is our capacity to swede the world. (Well, I liked the “be kind, rewind” site so much I had to work it in somewhere.)

Extjs

JavaScript frameworks have been with us for a while now, and anyone developing standards-based web interfaces has probably learned to love one or another of them. Beyond abstracting common chores like native JavaScript object prototype improvements, DOM tinkering, Ajax object management and so on (which, I daresay, is pretty darn appealing on its own), they shield the developer from the maddening caprice of cross-platform/cross-browser compatibility issues.

My favourite from the growing list of libraries has recently been mootools; I like it for its consistency, simplicity, and style. Lately, however, I have been checking out Extjs. Created by Jack Slocum, Extjs was originally an extension to the Yahoo UI, but has been a stand-alone library since version 1.1. Ext 2.0 was recently released.

While Ext does pretty much everything the other libraries do, a bit of poking around reveals an astonishing wealth of features. Ext is designed as an application development library, where most of its competitors are better described as utility libraries. Though Ext features a bunch of impressive application management classes, like internal namespace management and garbage collection, as well as a vast range of function, object, and DOM extension classes, what draws most developers to Ext is its collection of exquisite controls, the most popular of which are probably Ext’s beautiful data grids.

Ext panels, tree and grid

Ext’s grids, trees, form panels and window layout panels all have themable styles included so they look great out of the box. The control classes also feature powerful configuration parameters, like XHR URL fields (where applicable), data store record object reference, data field formatting and so on.

For casual developers, getting past “Hello world” with Ext is intimidating, and it takes some persistence to get comfortable with the library, but the payoff is a serious arsenal of high-performance development tools for producing powerful, stable, good-looking web applications. The Extjs site has numerous tutorials and excellent API documentation. Check out Jack’s description of building an Ext app using Aptana, Adobe AIR and Google Gears.

In Search of Stupidity: Over 20 Years of High-Tech Marketing Disasters

In Search of Stupidity: Over 20 Years of High-Tech Marketing Disasters is an amusing book by Merrill R. Chapman about the marketing and development of commercial software. Some of the chapters deal with poor decisions by word-processing companies like MicroPro, which ended up with two competing and completely different products (WordStar 3.3 and WordStar 2000). MicroPro International, according to Chapman, was in 1983 the largest microcomputer software company, with close to $70 million in sales. The problem was that the WordStar programming team was fired (or quit), and the new team that was bought up had a different word-processor in development.

One thing this book documents well is the battles between the management/marketing folk, on the one hand, and the developers, on the other. The fault does not always lie with the marketing folk. Chapman describes situations where the developers decide to redevelop a product from the ground up when the market is expecting a timely upgrade. Philippe Kahn of Borland, for example, decided to redevelop Paradox completely in object-oriented code and ended up alienating his users just when Microsoft released Access.

The one company that stands out as consistently avoiding fatal, stupid mistakes is Microsoft, which may explain why it is now so much bigger than any other software company. That Microsoft had an experienced programmer in the lead probably meant there was never the sort of disconnect that doomed other software companies.

The book is partly a response to In Search of Excellence, which lauded a number of high-tech companies as having excellent corporate cultures. Unfortunately many of the “excellent” companies didn’t last … hence the search for stupidity.

Check out their Museum Exhibits of stupid marketing.