I've been trying to figure out why "transhumanists" and their predictions irritate me so much. Perhaps it's seeing humanity reduced to a simplistic engineering problem to be solved. Or maybe it's the love affair that appears to be happening between them and technology. Or maybe it's the fact that they all sound like psychics, except that their subject is the human race instead of an individual. Back in July, Massimo Pigliucci addressed some of the problems with transhumanism; however, it seems to be a rather persistent idea. Most of the quotes below come from a paper by Ray Kurzweil in which he lays out his vision of the coming "Singularity"(1).

Whatever the reason, I can't help but feel that transhumanists aren't taking their subject very seriously, and that whatever insights they may legitimately have are overshadowed by a complete lack of respect for life. This is aptly exemplified by Ray Kurzweil and his thoughts on the coming Singularity.

"For inventor and futurist Ray Kurzweil, being human with limited intelligence and doomed biology was never good enough."
http://www.newscientist.com/article/mg20227076.200-ray-kurzweil-a-singular-view-of-the-future.html

Doomed biology?  Limited intelligence?  Based on what?

It is pronouncements like these that lead me to conclude that the transhumanist view of human biology is that of an engineering problem in need of fixing. Even the basic problem of defining intelligence isn't addressed. But without knowing what intelligence actually is, how can we know what it means to have more of it? It's always presumed that more would be a good thing, but how can we even know?

What is all of this based on? Chip design and manufacture. Apparently computer chips are the ultimate salvation of humanity, and it is this technology which will usher in a "golden age" of human existence that we can't begin to imagine. Fortunately for us, there are some who can imagine it and are prepared to dive headlong into predictions about how it will provide future solutions (to a problem that has yet to be described).

"Within a few decades, machine intelligence will surpass human intelligence, leading to The Singularity -- technological change so rapid and profound it represents a rupture in the fabric of human history. The implications include the merger of biological and nonbiological intelligence, immortal software-based humans, and ultra-high levels of intelligence that expand outward in the universe at the speed of light."
http://www.kurzweilai.net/articles/art0134.html?printable=1

I find statements like this lacking the most fundamental understanding of biology and of what it means to be human. Surely these "transhumanists" can't be so naive as to believe that a truly intelligent machine with superior intellect would be content simply to serve humans? I'm not invoking some classic Hollywood imagery; the reality of biological evolution is that no species sacrifices itself for the survival of another. If true machine intelligence should ever be achieved, then make no mistake, it would be the creation of a new non-biological species, with everything that entails.

Even the implications of merging biological and nonbiological intelligence and of "immortal software-based humans" bring to mind all manner of horrific outcomes, and frankly reek of a technology-based eugenics program. What would it mean to have a million, or ten million, or a hundred million people who have had such a "conversion", contrasted with the billions who haven't? Are we to assume that we'll all just wait patiently in line for our "upgrade" before life continues? I suspect most of us already have a strong inkling of how such scenarios have played out historically.

The response to these concerns is equally telling:

"My view is that the likely outcome is that on the one hand, from the perspective of biological humanity, these superhuman intelligences will appear to be their transcendent servants, satisfying their needs and desires. On the other hand, fulfilling the wishes of a revered biological legacy will occupy only a trivial portion of the intellectual power that the Singularity will bring."
http://www.kurzweilai.net/articles/art0134.html?printable=1

So apparently it's going to be fine that these super-intelligent machines will be quite content to be our servants, and that our "revered biological legacy" is so trivially unimportant that it will occupy only a tiny portion of these vast new intellects.

One problem that is already apparent is the complete lack of understanding regarding intelligence:

"Before addressing this issue, it is important to note that once a computer achieves a human level of intelligence, it will necessarily soar past it. A key advantage of nonbiological intelligence is that machines can easily share their knowledge. If I learn French, or read War and Peace, I can't readily download that learning to you. You have to acquire that scholarship the same painstaking way that I did. My knowledge, embedded in a vast pattern of neurotransmitter concentrations and interneuronal connections, cannot be quickly accessed or transmitted. But we won't leave out quick downloading ports in our nonbiological equivalents of human neuron clusters. When one computer learns a skill or gains an insight, it can immediately share that wisdom with billions of other machines."
http://www.kurzweilai.net/articles/art0134.html?printable=1

The author clearly can't distinguish between intelligence and data storage. How does the process of learning and developing ideas and opinions occur when everything is simply treated as a data storage problem? One doesn't read War and Peace as an act of memorization, but rather to absorb the ideas and to formulate thoughts about the story. To treat learning as if it were simply an act of downloading information is foolishness.

"A computer can also remember billions or even trillions of facts perfectly, while we are hard pressed to remember a handful of phone numbers. The combination of human level intelligence in a machine with a computer's inherent superiority in the speed, accuracy, and sharing ability of its memory will be formidable."
http://www.kurzweilai.net/articles/art0134.html?printable=1

This is where things get really out of hand, with the suggestion that computers can "remember" anything. A computer "remembers" phone numbers in the same way a phone book does. To suggest otherwise is either disingenuous or ignorant. As for accuracy, someone of Mr. Kurzweil's background should know better. Since computers are incapable of assessing ANYTHING they process, they can only produce results consistent with what was stored in the first place.
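To make the phone book analogy concrete, here's a minimal sketch (in Python, with names and numbers invented purely for illustration) of what this kind of machine "memory" amounts to: a lookup table that hands back exactly what was put into it, with no ability to assess whether an entry is right, wrong, or meaningful.

    # A plain lookup table: this is all that "remembering" a phone number amounts to.
    # The names and numbers below are made up for illustration.
    phone_book = {
        "Alice": "555-0100",
        "Bob": "555-0199",
    }

    # "Recall" is just retrieval: the stored value comes back verbatim.
    print(phone_book["Alice"])   # 555-0100

    # Store a wrong number and it is recalled just as "accurately";
    # nothing in the system can judge the entry it returns.
    phone_book["Bob"] = "000-0000"
    print(phone_book["Bob"])     # 000-0000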

Of course, this leads to the next abuse of technology: the suggestion that such machine intelligence can be derived by reverse engineering the brain. This treats the brain as nothing more than a piece of hardware, without regard for anything it actually contains.

This finally degenerates into the ultimate fantasy, what to do about our physical bodies:

"There are a variety of bodies that we will provide for our machines, and that they will provide for themselves: bodies built through nanotechnology (i.e., building highly complex physical systems atom by atom), virtual bodies (that exist only in virtual reality), bodies comprised of swarms of nanobots, and other technologies."

"Is this really me? For one thing, old biological Ray (that's me) still exists. I'll still be here in my carbon-cell-based brain. Alas, I will have to sit back and watch the new Ray succeed in endeavors that I could only dream of."
http://www.kurzweilai.net/articles/art0134.html?printable=1

One can't help but wonder where the reliability in all this software is going to magically come from. The mind boggles at the incentives created by the possibility of "hacking" into someone's brain. Of course, this raises a more fundamental question: will such superhuman intelligences, paired with humans, be content with the jobs that still need to get done? After all, not every job involves quantum mechanics. And why bother going to school when we can just implant our 5-year-olds with all the knowledge they'll ever need?

I guess we'll all end up sitting around making fun of Einstein.

Welcome to the future.

(1) The singularity is what will happen when an explosive advance in technology unexpectedly leaves us humans behind.
http://www.newscientist.com/blog/technology/2008/06/how-to-spot-technological-singularity.html