Computer trends are interesting to follow[1]; they keep changing, and, as with clothing, the chic trends this year soon become passé, replaced by newer ones. It often seems that it’s really the words that change, while the actual trends continue pretty much intact.

Some years ago, we liked “e-Utilities”, then “autonomic computing”, later “on-demand computing”, and now “software as a service” (SaaS or SAAS, depending upon who’s abbreviating it). To be sure, at some level these aren’t all the same thing. And yet, when it comes to describing a way of providing computer services as needed, in a sort of plug-and-play manner, it’s easy to make your project or product fit them all.

In that sense, they become buzzwords, and as the operative buzzwords change, we spin our project proposals or our product advertising to take maximum advantage of the “new trend”.

So it is with “distributed computing”, “cluster computing”, “grid computing”, and “cloud computing”, terms that have developed over the last several years. Each is distinct from the others in some ways, but there’s a great deal of overlap. A turn-of-the-century distributed computing application with the right profile could easily have morphed through the series, proudly calling itself a cloud computing application today. There’s a lot of fluff here.[2]

Eric Rescorla gives an opinion on cloud computing over at Educated Guesswork, and I agree with him that it’s a mixed bag. All of the mechanisms in the list above have some of the characteristics Eric talks about, such as the ability to draw on more resources only when they’re needed, so that you avoid over-provisioning the system all the time. You could actually say that it works autonomically, or on-demand... but never mind.

What I think is interesting about the emphasis on cloud computing, and putting your data and services “in the cloud”, is that we’ve come close to completing a circle. In the 1970s, we used “dumb terminals” that talked to “mainframe computers”, behemoths that sat in large data centers.

The terminal was an input/output device, but was not itself a computer... so all the programs and the data lived and ran on the mainframe. We had central management of everything, and the only way to distribute the cost was to charge for use of mainframe resources — processor cycles, data storage, and so on.

In the 1980s, we developed personal computers and started using them seriously. The computer on your desk would run a “terminal emulator” that accessed the mainframe, but it also ran its own programs, starting to pull away from the central management. We did spreadsheets and word processing and that sort of thing without ever touching someone else’s computer.

We still stored data in the data center — it had far more capacity, of course — but we no longer stored everything there. And, too, some of the cost was distributed to the users, who paid for their own computers and software.

In the 1990s, as the World Wide Web developed, we did more and more on our own computers and relied far less on the data center, to the point that many people in offices — and pretty much everyone at home — made no use of it at all.

Of course, no one ran everything on her own computer, either. The whole point of the web is to make it easy to find and retrieve things from other computers on the Internet, and over time, more and more services became available to us.

But we ran our own browsers and office software and email programs and lots of other programs. And, as a result, we had to manage all that software ourselves. Be sure to update all your software regularly, we’ve been reminded, to make sure long-fixed program bugs don’t bite you. Upgrade periodically to get new features, keep your anti-virus definitions up to date, and remember to back up your hard drive regularly, lest you have a disk crash and lose it all.

Now, in the 2000s, we’re moving back. Keep your backups at someone’s Internet data center — they’ll give you lots of free space, and you can pay for more storage and features. Next, keep your data somewhere else in the first place, using webmail and “virtual hard drives” on the Internet. Then run your software somewhere else, with things like Google Docs — they’ll take care of storing your data, making sure it’s backed up, scanning it for viruses, making sure the software that uses it is properly updated....

What, now, is the real difference between computing in the cloud — or on the grid or whatever, in what we’ve come to call “federated” systems — and computing in the data centers of the 1970s? With its announced operating system that ties heavily into the cloud, Google is talking about moving your PC even further back toward a not-so-dumb terminal that, through a web browser, gets all of its data and services from what amounts to a data center.

30+ years ago, the data center was a large room with many large, noisy boxes; today, it lives in smaller, probably quieter chunks all over the world. And the circle is very close to being closed.

[1] Well, for some value of “interesting”, but bear with me here.

[2] Yes, “cloud”, “fluff”... sorry.