A meeting of the San Diego Software Industry Council. The subject: What will Web 3.0 look like?

“Why don’t we know?” asked one venture capitalist. “Are we idiots?”

The VC was pandering to an in-group audience. The implied answer was, No, we’re not idiots, we are successful, sophisticated investors, entrepreneurs, and scholars of the web!

But y’know what? The truth is, yes, we are idiots. We are idiots because of technology colonization, and we fall for it every time.

Fifteen years ago the World Wide Web came along, and what did we do with it? We used it for push-publishing, for banner ads, and to sell stuff from web storefronts. In other words, we treated the WWW like an electronic magazine, or another television channel.

When we finally figured out the really new stuff the web could do for us – social networking, crowdsourcing, tweeting, distance learning, scientific collaboration, flashmobs and blogging – we were so pleased with ourselves that we gave these a new, collective name: Web 2.0.

We should have called it YoucantdothatwithnewspapersorTV. It took a long time to figure it out. But in retrospect it looks, well, idiotic to have used the web to push text and photos in just one direction, with the only possible feedback being the “Buy Now” button. Why were we idiots? Because the magazine, TV, and newspaper memes had colonized our perception of the WWW.

I first saw the phrase technology colonization in a 1995 Wired article[1] authored by Barry Diller. Media mogul Diller, then CEO of QVC (and now head of IAC/InterActiveCorp), urged us to resist “media imperialism,” the tendency to define a new technology in terms of the old.[2]  After all, he said, the automobile proved to be much more than a horseless carriage!

Those sneaky agents of technology imperialism – Texaco Star Theater, Milton Berle, and Ed Sullivan – held television back. Their shows were nothing but kinescoped stage plays and vaudeville. (Good plays and vaudeville, but that’s not the point.) Edward R. Murrow, George and Gracie, and Sid Caesar and Imogene Coca cut TV loose from the old model, showing us what TV could really do: serve as an eye that took the viewer into other people’s homes, a vehicle for special effects, and a brave new world in which we watched ourselves watching each other.

The first automobiles were called horseless carriages because the carriage concept colonized the automobile concept, in the minds of producers and customers, limiting the auto’s early usage modalities. By and by, users discovered that autos could perform functions that horse-drawn carriages could not, and society and infrastructure evolved accordingly. In fact, everything changed, from war (guns mounted on motor vehicles) to love (babies conceived in the back seats of ’56 Chevys parked at Lookout Point).

The colonization metaphor is apt. Think of the Jamestown colonists, who tried to build a little England in Virginia, an ill-conceived and ultimately fatal idea. In contrast, the Plymouth colonists – who actually called their place New England – mustered a bit of adaptability, shared a meal with the locals, thought “This isn’t England, what can we make of it, hey let’s build a park for the Red Sox, maybe recruit some eggheads and start a university or two.” Theirs was America 2.0.

The Internet can do things that the post, the phone system, and town hall meetings cannot. But in the Net’s early days, the post, the phones, and in-person meetings had colonized the Internet in the mind of the market at large. I conjecture that this was one reason for the dotcom bust. The only online businesses to survive the bust helped people perform familiar, cozy functions – auctions, romantic introductions, job searches – more easily and quickly. Of course new net businesses sprang up in the post-bust recovery, but few people are using the net to do truly new things; to this day, everybody loves joining LinkedIn, but nobody really knows what to do with it.

Newsweek columnist Daniel Lyons agrees that journalists have used the Net “to do the same old thing. We take stories from newspapers and magazines and put them on Web sites. We publish books on Kindle. We put TV shows on Hulu.” Lyons believes Apple’s rumored-to-be-forthcoming tablet computer, with its always-on Internet, will usher in “phase two of media on the Internet,” mashing together the styles of print and video journalism – but not, he says, until someone born and raised with the medium does the mashing. “Somewhere out there,” Lyons writes, “the Orson Welles of the digital age is [now] in grade school.”

I hope to see research that sheds light on when Internet users (and users of any revolutionary technology) will throw out the colonizers. The colonizers are not malevolent people but, rather, old concepts of usage and functionality!

Liebowitz[3] thinks the Internet bubble burst because we ignored basic economic principles. I think it burst because we ignored colonization. If we can develop a rigorous understanding of colonization, it will be a better leading indicator of the Internet’s direction than the fevered gossip in Silicon Valley bars; we can answer the question (notwithstanding that some consider it already answered) of whether Meg Whitman, formerly of eBay, paid too much for Skype.

Taking the topic further afield, Davis[4] echoes Heidegger and Ellul in maintaining that our primeval myth-stories and images are the primary drivers of our technology choices. That is, our myths colonize our technologies, too – not just vice versa.

Among Davis’ most powerful points is that members of pre-technological cultures perceived themselves as integral parts of an animated world, in which each stone and tree harbored its own benevolent, mischievous, or malicious spirit. Nanotechnology and advanced electronics will once again complete the interpenetration of the magical and the technical, as today we move toward a techno-animated world where, as in a Disney cartoon, every teakettle will dance on mechatronic legs, sense with silicon/DNA circuits, and speak in a synthesized voice. We can already buy Internet-connected smart refrigerators that monitor milk and order orange juice.[5]

Medical and biotech advances lead us to create Frankenstein monsters, simulacra of life that echo the mythical Golem.

It gets weirder: unfamiliar technology, in turn, colonizes the mythic imagination, perhaps in the guise of demons and the like. Alien-abduction hallucinations are an example; they are anxiety dreams about our own mutation. Davis quotes cybertheorist Michael Heim: “We experience our full technological selves as alien visitors, as threatening beings who are mutants of ourselves and who are immersed and transformed by technology…”

Twilight Zone stuff aside, Diller believes we should be “convergence contrarians”: willing to challenge conventional wisdom, able to explore other possibilities, and ready to treat a new medium on its own terms. Diller has been spectacularly right (Laverne & Shirley; Cheers) and spectacularly wrong (in 1995 he said the arrival of cable did not signal the end of the networks; this month, a much-weakened NBC was sold to Comcast). Is he right about colonization? I believe so.

Most of us suspect Web 3.0 will be the “semantic web.” How long will we be hobbled by our understanding of Web 2.0, unable to see the true potential of the semantic web? What kinds of dippy, unimaginative apps will characterize the early semantic web – and more to the point, how soon will we let ourselves say goodbye to them, and move on to the good stuff?



[1] Diller, B. (1995) Don’t Repackage – Redefine! Wired, February.

[2] Sadeg M. Faris pushes a different definition of “technology colonization.” He uses the term to mean a severe trade imbalance caused by the technological superiority of one of the trading partners. His is an important concept, but it is quite unrelated to Diller’s.

[3] Liebowitz, S.J. (2002) Re-Thinking the Network Economy. New York: AMACOM.

[4] Davis, E. (1998) Techgnosis: Myth, Magic and Mysticism in the Age of Information. New York: Three Rivers Press.

[5] Phillips, F. (2005) Technology and the Management Imagination. Pragmatics & Cognition 13(3): 533-565.