After reading Sascha's excellent article [Robopocalypse Now] regarding the effect and direction of robotic/AI development and its coevolutionary influences, it occurred to me that perhaps a shift in how we view such developments could promote a more intuitive understanding of what is occurring.

Essentially, my point is that we can regard "tools" as coevolutionary developments, in much the same way that symbiotic relationships are regarded in biology. It is important to note that I'm not suggesting we suddenly imbue inanimate objects or human inventions with all manner of traits or attributes unique to biological organisms. Rather, we can view them as elements that simply can't evolve on their own. Their "evolution" is directed by the species that benefits from their use.

Symbiosis is the biological term for a relationship in which two species each exploit the other for their own benefit while simultaneously conveying a benefit in return. It's the classic "win-win" scenario, and through coevolution, each is influenced in its future biological direction by the other.

From this perspective, any organism capable of exploiting tools can be seen as following an evolutionary path in which the tool conveys an advantage. As a result, such tool use becomes a requisite condition for future evolution, since those that don't use tools are at a decided disadvantage compared to other members of the same species. It can be argued that, because it provides a direct evolutionary advantage, such usage also indirectly influences the genome, ensuring that future generations acquire similar skills and habits.

For many species, such tool use is simply opportunistic. However, when humans [or any other organism] are capable of creating specific tools, we witness a coevolutionary influence on the tool itself. In other words, it changes along with the species wielding it. Gradual improvements in such tools confer greater advantage on the user, and the tool undergoes changes in turn. In short: it evolves.

This effect is significant because it bears directly on biological fitness; the historical advantage conferred in hunting, obtaining shelter, and the like can hardly be overstated. Even something as simple as the use of fire can be argued to be a coevolutionary force, one that has carried humans from sitting around a lit campfire to sipping coffee in a heated house.

In this respect, we can consider all the arguments about human invention and technological development as coevolutionary events that have improved human fitness, while simultaneously influencing the direction in which humans "evolve".  Consequently, it isn't merely a contrivance to suggest that human technology is as influential on the development of the species as the genome itself is [including determining which elements of the genome are selected for].

In effect, technology becomes another selection pressure, and it coevolves with us.

Computing technology is such a variation because it interacts with the most significant biological trait humans possess: our minds and intellect. Just as our brain is responsible for compensating for our physical deficiencies [e.g. the absence of claws, sharp teeth, etc.], so does our technology compensate for other deficiencies, real or perceived.

In the case of computing technology, we are utilizing tools that allow us to quickly reference information that isn't present in our brains. We can tap into new skills and knowledge without having to acquire them through experience or mentoring alone. Therefore, the more adept we become at exploiting such technology, the more decided an advantage we gain over those who fail to. We already have numerous examples demonstrating how our access to information influences our belief systems, reinforcing previously held ideas or forming new ones. If the development of culture has shaped the evolutionary path of human development, then it follows that our technology is even more influential, because it is shaping the future direction our culture takes.

It is no coincidence that many of the concerns we envision for the future are a direct result of ready access to such technologies. After all, why even discuss issues like the publication of data for converting the H1N1 virus into a weapon? Why concern ourselves with instructions for making bombs or even nuclear weapons?

In fact, we readily recognize the coevolutionary issues involved, because as the technology has evolved, so has the human ability to exploit it.  

It can also be argued that, coincident with this technical evolution, humans have acquired a level of dependence that has displaced the survival knowledge of earlier generations. While humans have gained a decided fitness advantage through such developments, we can readily acknowledge that we've long since lost the fundamental skills and knowledge held by our ancestors. In short, we have pursued a biological path of technological dependence from which there is no return.

As a result, we tend to take technology for granted, and as it continues to evolve, it will be subject to the direction of human intent as much as human activities will be governed by the technology. In short, we've achieved a true level of technological or artificial symbiosis. We do not exist without our technology.

From this, we can see how Sascha's article accurately describes the evolutionary trajectory we are on, because it doesn't require our technology to accurately replicate "intelligence", nor does it even require that the technology be independent. After all, our technology is exactly as powerful as the individual wielding it. The two are indistinguishable.

Once we shift our perspective in this way, we can see that the discussion regarding future "artificial intelligence" is fundamentally flawed, because we have failed to grasp that WE are the "artificial intelligence". With the technological extensions provided by our machines, it isn't necessary that we become "transhuman" any more than it is necessary for our computers to achieve actual "intelligence". It is the two together that constitute the artificial intelligence ... and in that regard, Sascha's article points to a serious problem we have overlooked.

The issues being discussed are often framed as cautions about things we will need to be wary of at some future point in our evolution or our technology's. They are here today ... and we are completely unprepared to recognize them.