Singularity University: Exponential Silliness 2.0?

Ray Kurzweil, who has been predicting “spiritual machines” (AI) for a while now, has been appointed Chancellor of the Singularity University. The Singularity University is based at the NASA Ames Research Center and supported by Google (and Moses Znaimer, another visionary wannabe). Its mission is to focus on exponential advances leading to singularities, points at which a paradigm shift occurs. The Overview describes the aims of the University thus:

Singularity University aims to assemble, educate and inspire a cadre of leaders who strive to understand and facilitate the development of exponentially advancing technologies and apply, focus and guide these tools to address humanity’s grand challenges.

The University thus seems dedicated to a particular, and questionable, view of technological development, one that looks to a future of dramatic paradigm shifts triggered by these singularities. For example, the goal of the Academic Track “Future Studies & Forecasting” is “cultivating the student’s ‘exponential intuition’ — the ability to fully grasp the magnitude of possible outcomes likely to arise in specific domains.” No room here for modesty or skepticism.

The University is not really a university. It is more of an institute funded by commercial partners that provides intensive programs to graduate students and, importantly, executives. I’m surprised NASA is supporting it and legitimating something that seems a mix of science and science fiction – maybe they have too much room at their Ames campus and need some paying tenants. Perhaps in California such future speculation doesn’t seem so silly. I guess we will have to wait until about 2045, when the intelligence singularity is supposed to happen, and see.

But what is the Singularity? The Wikipedia article on Technological Singularity quotes I. J. Good as describing the “intelligence explosion” that would constitute the singularity thus:

Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an ‘intelligence explosion,’ and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make.

The key to an intelligence singularity (as opposed to other types) is the recursive effect of the feedback loop that arises when a machine is smart enough to improve itself. That is when we go from change driven by us (whether accelerating exponentially or not) to the independent evolution of intelligent machines. That is when they won’t need us to get better, and we could become redundant. Such dramatic shifts are what the Singularity University prepares paying executives for and trains graduate students to accelerate.
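To see why that feedback loop matters, here is a deliberately crude sketch (my own toy model, not anything Good or Kurzweil published, with made-up parameters throughout): a machine improved from outside at a fixed rate grows linearly, while a machine whose rate of improvement depends on its own current capability can, under the right assumptions, run away in finite time.

```python
# Toy model: external improvement vs. recursive self-improvement.
# All parameter values are arbitrary and purely illustrative.

def externally_improved(capability=1.0, rate=0.5, years=20):
    """Humans add a fixed increment each year: linear growth."""
    history = [capability]
    for _ in range(years):
        capability += rate
        history.append(capability)
    return history

def self_improving(capability=1.0, gain=0.5, exponent=1.1,
                   years=20.0, step=0.01):
    """The machine's improvement rate grows with its own capability
    (dC/dt = gain * C**exponent). With exponent > 1 the curve blows
    up in finite time -- the 'intelligence explosion' in caricature."""
    history = [capability]
    for _ in range(int(years / step)):
        capability += step * gain * capability ** exponent
        if capability > 1e12:   # treat runaway growth as the 'singularity'
            break
        history.append(capability)
    return history

print("external: ", externally_improved()[-1])  # modest linear gain
print("recursive:", self_improving()[-1])       # runaway growth
```

The point of the toy is only that the qualitative shape of the curve changes once improvement feeds back on itself; none of the numbers mean anything.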

It is easy to make fun of these ideas, but we should be careful not to end up confidently predicting that they can’t happen. Kurzweil is no fool, and he bases his predictions on extrapolations of Moore’s law. Futurology will always be risky, but everyone has to do it to some degree. For that matter, there do seem to be moments of accelerating technological change leading to dramatic paradigm shifts, so we shouldn’t be so sure Kurzweil is wrong about the next one.

I should add that I like the proposed interdisciplinarity of the Singularity University – the idea that dramatic change or new knowledge can come from ideas that cross disciplines. This second organizing principle of the University has legs in this time of new and shifting disciplines. We need experiments like this. I just wish the Singularity University had had the courage to include academic tracks with the potential for critical engagement with the idea of an intelligence singularity. Why not a “History and Philosophy of Futurology” track that could call into question the very premise the University is named for? After all, a real university should be built on the openness of mind we would call intelligence, not on dogmatic certainty about a prediction.