I'm amused to see the extent to which the buzzword "singularity" has gained ground in recent years. When I was a student, singularity was a rather ordinary mathematical concept. Roughly speaking, if a mathematical function behaved normally except at certain values of its arguments, those special cases were designated as singularities. Later the word was applied to theoretical situations in which the normal laws of physics break down. The most famous case of a so-called space-time singularity occurs within black holes.
More recently, the word "singularity" has been used to designate an advanced case of AI [artificial intelligence], namely an ultraintelligent machine. If AI researchers were indeed capable of designing a machine more intelligent, in general, than the brightest humans [a situation that has never yet arisen in practice], then we might expect this machine to outperform humans at various engineering tasks. Among other challenges, it could turn out to be extremely talented in the art of designing even smarter machines... which might give rise to a snowball effect. And the end result could well be a vastly intelligent machine of the kind referred to as a singularity.
A colloquium on this theme, called the Singularity Summit, has just been organized by the Singularity Institute for Artificial Intelligence in Palo Alto, California [location of Stanford University].
Many singularity believers predict that technological progress is accelerating at such a rate that ultraintelligent machines are just around the corner. Detractors, on the other hand, claim that the AI singularity concept is no more than harmless garden-variety science fiction. As for me, although I have the retrospective impression that AI research [which once interested me greatly] ran into a brick wall a couple of decades ago, I must say that the power of computing amazes me today in ways I could never have imagined not so long ago. Consequently, I'm ready for anything.