Several times during the last few weeks I've encountered articles about a new book by Ray Kurzweil entitled "The Singularity Is Near: When Humans Transcend Biology". Mr. Kurzweil is a noted futurist who has done much work since the early '80s harnessing digital technology in the service of music and text-to-speech synthesis. His new book presents the provocative idea that computer processing technology is accelerating at such a rate that sometime between 2030 and 2045 mankind will have created machines that are more intelligent than we are. At that point in time (the "Singularity" of his book's title) our relationship with our machines will be radically and irrevocably altered, ultimately allowing us to, as he puts it, "transcend our biological limitations and amplify our creativity". Although he wasn't the person who originated the idea of the Singularity or coined the term (SF writer and academic Vernor Vinge was first -- see his article here), Kurzweil appears to be the first to explore the specific technological trends he believes will lead to that event and publish them in popular book form. His Web site can be found here, and an interesting interview with Kurzweil discussing the book and the idea of the Singularity can be found here.
Now up front I have to admit that I have not read his book yet (although I plan to), but his ideas are so interesting that I didn't want to wait until then to write down my impressions and thoughts. Although his thoughts on nutrition are pretty bizarre, I think his positions regarding technological progress are credible. The key ideas that he believes will lead mankind to the Singularity are his "law of accelerating returns" (see his article here), a continued extrapolation of Moore's Law into the future, continued improvement in the application of genetic engineering, and the blossoming of nanotechnology. He takes the position that the melding of machine-assisted intelligence with our own will create a species of unrecognizably high intelligence, creativity and memory. While outwardly he presents this as neither utopian nor dystopian (because what comes after the Singularity is "unrecognizable," there's no way to tell), his tone definitely suggests he thinks this is a good thing.
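To get a feel for what that Moore's Law extrapolation implies, here's a back-of-the-envelope sketch. The ~2-year doubling period and the 2005 baseline are my own illustrative assumptions, not figures from Kurzweil's book:

```python
# Rough sketch of exponential growth in processing power under Moore's Law.
# The 2-year doubling period and 2005 starting point are assumptions made
# for illustration only.
def compute_multiplier(start_year, end_year, doubling_years=2.0):
    """Factor by which processing power grows between two years,
    assuming it doubles every `doubling_years` years."""
    return 2 ** ((end_year - start_year) / doubling_years)

for year in (2030, 2045):
    print(f"2005 -> {year}: ~{compute_multiplier(2005, year):,.0f}x")
```

Under those assumptions, hardware would be thousands of times faster by 2030 and about a million times faster by 2045 -- which is exactly why the raw-horsepower side of his argument is easy to believe.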
I don't know enough about the entire book to comment on most of his contentions. But one thing struck me as I first began reading about the book. As any programmer will tell you, ultra-fast hardware means nothing without software to power it. To make computers do anything, we have to have an algorithm. And in the case of machines that enhance or transcend our own brainpower, I think we are a long, long way from having an algorithm that emulates "intelligence". Sure, we have algorithms that allow computers to perform very specific tasks that have the appearance of intelligence (such as playing chess), but most of the time these simply apply brute-force computing speed to the problem. We don't yet have a deep enough understanding of how our minds work to define an algorithm that is a reasonable facsimile of intelligence, and I frankly think we are a long way off from finding one. And that's in spite of studying human intelligence intensely for over 50 years.
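The chess example is worth making concrete. At the heart of a classic chess engine is minimax search -- an exhaustive scoring of possible futures, nothing more. Here's a minimal sketch on an invented toy game tree (the tree and scores are illustrative, not from any real engine):

```python
# A minimal minimax search, illustrating the "brute force" point: classic
# chess engines look intelligent largely by exhaustively scoring game trees
# like this one. Leaves are scores from the maximizing player's viewpoint.
def minimax(node, maximizing=True):
    """Exhaustively search a game tree given as nested lists of int leaves."""
    if isinstance(node, int):          # leaf: just report its score
        return node
    pick = max if maximizing else min  # players alternate goals each ply
    return pick(minimax(child, not maximizing) for child in node)

# A two-ply toy game: our three possible moves, then the opponent's replies.
tree = [[3, 5], [2, 9], [0, 7]]
print(minimax(tree))  # prints 3: the opponent minimizes each branch first
```

There's no understanding anywhere in that loop -- just speed and memory. Scaling it up gets you Deep Blue, not a mind, which is exactly the gap between fast hardware and an algorithm for intelligence.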
While I can see the Singularity happening some time in the future, I think that the 2030-2045 time frame is too soon. My intuition tells me that it will take some sort of revolutionary idea or new application of mathematics (along the lines of chaos theory) to allow us to discover such an algorithm.
Scary or not, I believe the event Ray Kurzweil calls the Singularity isn't something we need to be concerned about for a long time.
Music that got this post out: "Gimme That" -- The Resource featuring Jimmy Napes