Can We Prove That a Large System is Self-Organizing?

In my 2014 article about large systems I wrote that "what makes a system large is our inability...

How Consumer Computational Search is Changing the Internet

I don't like using the term "consumer" because it implies an economic function of the searcher...

Resonating Euler Spirals and Prolate Spheroids

You might call it a two-tone football.  If you're a real mathematician you may be able to...

Spinning Objects: A Process for Controlling Self-modifying Systems

I used to work with a programming language called Business Basic.  It was descended through...

Michael Martinez

Michael Martinez has a Bachelor of Science degree in Computer Science, an Associate of Science degree in Data Processing Technology, and a few certifications in long-forgotten 2nd millennium technologies...

I have set out to write about "Naturality" for this column three times since the beginning of 2012 and each time I have abandoned the article.  It's a complex topic that easily gets away from me.  I have been measuring Naturality in search engine results for several years now.  Some years ago I devised a simple formula to help me explain Naturality to people:

1 = N_y + T_y + O_y
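As a rough sketch of what the identity says (the component labels in the code below are my hypothetical placeholders, since the letters are not defined in this excerpt), the formula simply states that for a given year y, three mutually exclusive shares of the measured results add up to the whole:

```python
# A minimal sketch of the Naturality identity 1 = N_y + T_y + O_y.
# The component labels (natural / targeted / other) are assumptions for
# illustration; the excerpt does not spell out what N, T, and O stand for.

def naturality_shares(natural: int, targeted: int, other: int) -> dict:
    """Normalize raw result counts for one year into shares that sum to 1."""
    total = natural + targeted + other
    if total == 0:
        raise ValueError("no results to classify")
    return {
        "N_y": natural / total,
        "T_y": targeted / total,
        "O_y": other / total,
    }

shares = naturality_shares(natural=620, targeted=280, other=100)
assert abs(sum(shares.values()) - 1.0) < 1e-9  # the identity 1 = N_y + T_y + O_y
print(shares)  # {'N_y': 0.62, 'T_y': 0.28, 'O_y': 0.1}
```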
Not a day goes by when rain doesn't fall somewhere on the Earth.  Our weather patterns have changed through the eons and there have been wetter periods and drier periods, but so far as I know the rain keeps falling and will continue to fall until Earth can no longer sustain a liquid water environment.

The World Wide Web works much the same way.  As we add content to the Web we "hook it up" and link to something, and something often links back to us.  Even on the dark Web there are a lot of links.  But do links come into existence at a steady rate?  Is there any predictability to how much linkage we create with our new content?  I'm sure a few people have tried to estimate stuff like this.  But there's always a catch.
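For what it's worth, here is the kind of back-of-the-envelope estimate I mean, a sketch only, with invented crawl numbers: fit a links-per-new-page rate from periodic samples and see whether it holds steady.

```python
# A back-of-the-envelope check on link-creation rate: take monthly samples of
# (new pages crawled, new outbound links found) and look at how steady the
# links-per-page ratio is. All numbers here are made up for illustration.
from statistics import fmean, stdev

# (new_pages, new_links) per monthly crawl sample -- illustrative values only
samples = [(1_000, 8_200), (1_150, 9_600), (980, 7_900), (1_300, 11_100)]

rates = [links / pages for pages, links in samples]
mean_rate = fmean(rates)
spread = stdev(rates)

print(f"links per new page: {mean_rate:.2f} +/- {spread:.2f}")
# If the spread is small relative to the mean, link creation looks steady;
# a large spread is the "catch" -- the rate isn't predictable from samples.
```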

I am sure it is old news to most if not all of Science 2.0's readers that NASA recruited Vint Cerf to help adapt Internet technology to space mission communications in 2000; and that they successfully tested a new protocol in 2008. I have no doubt that the engineers can put together a pretty reliable interplanetary network.

"Search.  The final frontier.  These are the voyages of the frustrated Web surfer.  Its five-year mission: To explore strange new content, to seek out new ideas and new expressions.  To boldly know when someone is pulling our leg or being sincere." I'm not waiting for William Shatner to record that monologue but there are days when I can almost hear it rolling about in my head.  Search is such a universal thing for people -- we were born to it.  We resonate with memories of failed searches every time we hear someone gasp, "What did I do with my keys?"

Deep Web Interferometry compares curves in trendlines from multiple synchronized data sources.  Interferometric analysis of Web metrics data increases the clarity of meaningful data points by isolating events.  For example, given two Websites that track a popular sport, one Website may experience a weekly peak in traffic on Monday and the other site may have two smaller peaks on Wednesday and Saturday.  However, during a major tournament both sites experience sharp peaks during the games.  These game-driven data spikes appear in both sites' trendlines.
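Here is a toy illustration of the idea, assuming we already have synchronized daily visit counts for both sites (the numbers and the z-score trick are mine, for demonstration, not a description of any particular analytics tool): only the days where both series spike together survive the comparison.

```python
# A toy version of "Deep Web Interferometry": z-score two synchronized daily
# traffic series and multiply them pointwise. Days where BOTH sites spike
# produce large positive products; each site's private weekly peaks do not.
# The traffic numbers are invented for illustration.
from statistics import fmean, pstdev

def zscores(series):
    mu, sigma = fmean(series), pstdev(series)
    return [(x - mu) / sigma for x in series]

# 14 days of visits. Site A peaks on day 1 (its Monday rhythm); site B peaks
# on days 3 and 6. Both sites spike on days 9 and 10 (a shared tournament).
site_a = [500, 900, 520, 480, 510, 490, 505, 495, 510, 2000, 2100, 500, 515, 490]
site_b = [300, 310, 295, 650, 305, 290, 700, 300, 310, 1500, 1600, 305, 295, 310]

product = [a * b for a, b in zip(zscores(site_a), zscores(site_b))]
shared_events = [day for day, v in enumerate(product) if v > 2.0]
print(shared_events)  # -> [9, 10]: only the tournament days survive
```
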
The universe I see when I sit on my porch and look up into the sky is very different from the universe a professional astronomer sees with all of today's available advanced technology.  On a clear night I may be able to see a few galaxies.  On any night astronomers around the world may be counting billions of galaxies.