In my October 4, 2011 Science 2.0 article “A Search In Time Is A Memorable Path” I defined Chronocity as “the measurement of the distance in Time between where we are (Now) and where we were at some point in the past (or where we will be at some time in the future) with respect to a specific object, or document.”  In other words, Chronocity is the degree of change across a series of sequential states determined through discrete time slices.

Like many others analyzing large data sets, Internet marketers use time-slicing to look for trends. I use time-slicing to look for trends in Web data, particularly in the composition of Websites, methods of link placement, and the interaction of disparate Web services. For example, in the 1990s people hand-coded static HTML files and wove them together with hyperlinks to build Websites. In the early 2000s we saw the rise of simple content management platforms, which gave way to full-featured blogging platforms, Web forum software, and, more recently, social media sites. Whereas in 1995 an enthusiast might have constructed a “fan” Website by hand, by 2005 these were being published as blogs, and today they are more commonly built on Tumblr and Facebook.

Although in that article I deplored the loss, through the passage of time, of the value conferred by the Web That Was, I have also been forced by demand and circumstance to look ahead – to attempt to predict what the Web That Will Be may look like. That is obviously not easy, but happily I developed my theory of Naturality (cf. "Naturality: If We Fight the Power Laws, Who Will Win?", May 31, 2012) to help me compartmentalize the Web That Is. By integrating Chronocity with Naturality I can observe transformations in sets of data divided by states, a practice from which I developed my ideas about Deep Web Interferometry (the comparison of trend lines from multiple disparate data sources).

My ideas are hardly original.  For example, Clay Shirky wrote about the Power Law phenomenon (that I described in my article) years ago.  I learned about the prevalence of power laws in Internet data from an article by Mike Grehan called “Filthy Linking Rich” in which he described the Law of Preferential Attachment (the more search visibility a Website possesses, the more links it will attract, thus enhancing its search visibility).  I have borrowed freely from Percolation Theory, Swarm Theory, and just about every theory I can find in order to refine my models of the evolving complexity of the Web.

It is the transformations of Naturality that have proved to be the most productive tools for me, although I must admit that my skills as a computer model analyst leave much to be desired. By isolating states of Naturality, Transparency, and Opacity (from my formula 1 = Ny + Ty + Oy) I am able to plot trend lines for changes in sets of objects (Chronocity Sets). Those Chronocity Sets might be search results for specific queries, collections of Websites, types of Web publishing platforms, features of Websites, features of Web publishing platforms, Web publishing services, backlink profiles, or anything else you can imagine.
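For readers who prefer to see the arithmetic, here is a minimal sketch of that bookkeeping in Python. The function name, labels, and counts are invented for illustration, and I assume each object has already been classified as Natural, Transparent, or Opaque by whatever means suits the data:

```python
from collections import Counter

def state_shares(labels):
    """Given a label ('N', 'T', or 'O') for every object in one time slice,
    return the Naturality, Transparency, and Opacity shares.
    By construction the three shares sum to 1."""
    counts = Counter(labels)
    total = len(labels)
    return {state: counts.get(state, 0) / total for state in ("N", "T", "O")}

# Hypothetical time slice: 6 Natural, 3 Transparent, and 1 Opaque object.
slice_y = ["N"] * 6 + ["T"] * 3 + ["O"]
print(state_shares(slice_y))  # {'N': 0.6, 'T': 0.3, 'O': 0.1} -- so 1 = Ny + Ty + Oy
```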

A Chronocity Set consists of a collection of objects together with a vector of discrete states for each object as measured from Time 0 (the initial state of the set) to Time S (the final state of the set).  (Alternatively, you could describe it as a vector of discrete states of a set of objects.) In other words I’m drawing curves on a chart divided into time slices (hours, days, weeks, months, years, etc.).
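A Chronocity Set does not require anything elaborate to represent. A sketch like the following captures the idea; the class name, field names, and example objects are my own inventions for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class ChronocitySet:
    """A collection of objects, each carrying a vector of discrete states
    indexed from Time 0 (the initial state) to Time S (the final state)."""
    slice_labels: list                            # the time slices, e.g. months or years
    states: dict = field(default_factory=dict)    # object id -> one state per time slice

    @property
    def S(self):
        # Index of the final time slice.
        return len(self.slice_labels) - 1

# Hypothetical example: two Websites tracked across three monthly slices.
cs = ChronocitySet(
    slice_labels=["2012-01", "2012-02", "2012-03"],
    states={"site-a": ["N", "N", "T"], "site-b": ["N", "O", "O"]},
)
print(cs.S)  # 2
```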

Over time any collection of objects passes through transitions that can be measured discretely, snapshot by snapshot (although I am well aware of the challenge that discrete analysis faces with transitions that are not themselves discrete). These transitions – when strung together as simple vectors – create the trend lines that all show me essentially the same story: Naturality decreases as Optimization increases. When I first wrote about this concept several years ago I named it the Theorem of Search Engine Optimization, but I have since found that it applies well outside of measuring the natural-to-unnatural ratios of search results.

Although I divide Optimization into Transparency and Opacity subsets, it can simply be defined as “Whatever was not part of the Natural collection at Time Slice 0 in the Vector defined by Then-to-Now.” In other words, any change in the Naturality of an object makes the object unnatural with respect to the initial state of the set.
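As a sketch, and under the simplifying assumption that an object's state at any time slice can be compared directly with its state at Time Slice 0, the Optimized share is just the complement of the unchanged share (the function and object names below are invented):

```python
def optimization_share(states_now, states_at_time_0):
    """Optimization is whatever is no longer Natural with respect to the
    initial state of the set: Opt = T + O = 1 - N."""
    unchanged = sum(1 for obj, state in states_now.items()
                    if states_at_time_0.get(obj) == state)
    return 1 - unchanged / len(states_now)

# Hypothetical set of 10 objects, 4 of which have drifted from their Time Slice 0 state.
time_0 = {f"obj{i}": "baseline" for i in range(10)}
now    = {f"obj{i}": ("baseline" if i < 6 else "changed") for i in range(10)}
print(optimization_share(now, time_0))  # 0.4
```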

As best I can determine, my theorem only works for closed systems. That is, in an open system you could have an endlessly growing supply of Natural objects that are gradually transformed into unnatural objects (though there may be a derivative property I have not discovered which sustains the principle of declining Naturality). The Internet – no matter how much it grows and changes – remains a finite thing; it always has been and always shall be (because of the constraints of our technology) a closed system. Not surprisingly, the theorem holds for every measurable set to which I have applied it on the Internet. (Note: One might reasonably conclude I think otherwise based on my August 16, 2011 Science 2.0 article "Seeking Shape and Cardinality in the 0-dimensional Web," but I must defer that argument for another time.)

I know that’s hardly a scientific proof of anything, but building upon this idea I realized that if I measure Naturality across discrete transitions of a set I eventually run out of Natural objects. And that left me facing the difficult question of “What is Natural?” Blogs were unnatural in 1995, natural in 2005, and may be obsolete by 2025. So are blogs really natural to the Internet or not? Obviously the answer has to be bounded by definitions that make the question relevant to something – hence, I turned to Chronocity to provide those definitions and boundaries.

In other words, Naturality is really determined by whatever exists at Time Slice 0 in a finite sequence of S time slices. You can construct a timeline where you reset the definition of Naturality after S iterations, such that Naturality always starts out at 1 and declines to 0. Of course, this transformation does not always happen at the same rate, so S may annoyingly vary across multiple segments of a timeline. Nonetheless, S is always (in my experience) finite and terminating. (Note: I have not explored the dependence of segment S+1 upon segment S, but there is obviously an evolutionary path in the transitions.)
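A sketch of that reset logic, with an invented trend line and an arbitrary near-zero threshold (the function name and the numbers are mine, chosen only to show the mechanism):

```python
def segment_timeline(naturality, floor=0.05):
    """Split a Naturality trend line into segments, starting a new segment
    (i.e. resetting the definition of Naturality back to 1) whenever the
    current value decays to roughly zero. Each segment is one iteration of S."""
    segments, current = [], []
    for value in naturality:
        current.append(value)
        if value <= floor:            # Naturality has effectively run out...
            segments.append(current)  # ...so close this iteration of S
            current = []              # ...and the next slice starts a fresh baseline
    if current:
        segments.append(current)
    return segments

# Hypothetical trend line with two resets; note the segments need not be equal in length.
trend = [1.0, 0.7, 0.4, 0.1, 0.03, 1.0, 0.5, 0.02, 1.0, 0.8]
print(segment_timeline(trend))
# [[1.0, 0.7, 0.4, 0.1, 0.03], [1.0, 0.5, 0.02], [1.0, 0.8]]
```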

Think of how life has changed over geological time: we believe we had single-celled life for an immensely long period of time, then multi-celled life emerged and prevailed, then complex multi-celled life emerged and prevailed, and then we had plants and animals, and those plants and animals evolved, etc. Each successive stage of dominant life in the ecosystem seems to be shorter than the previous stage – and that, I think, is because each time the ecosystem resets it has more complexity to work with than it did previously. In other words, complexity is Nature’s form of optimization. The universe is evolving into ever more complex superstructures as it organizes itself.

Human accumulation of knowledge has tracked a very similar course over the millennia: At the beginning of human existence (which probably did not have a discrete initial state) we mastered a small number of very simple tool-making techniques, food acquisition methods, and cultural traits; in successive iterations of human development – as our complexity increased – we acquired more knowledge, more skills, and more sophisticated cultural traits; eventually we reached a point where we could construct large permanent communities, which evolved into civilizations; those civilizations have since passed through stages of development and complexity that led to the Industrial Revolution, the Technology Revolution, and finally to the civilization and culture we have today.

At every stage in these timelines Naturality decreased as Optimization increased; but eventually whatever had once been Natural became obsolete or vanished altogether and essentially Nature reset its definition of Naturality.  Or, rather, if we want to analyze these progressive stages in discrete sequences then WE must reset our definitions of Naturality.  That might mean that the pollution we create today transforms the world into an environment where we cannot exist but which supports a different type of life – and that altered environment and different life would be “natural” for that time frame.

I have sometimes thought I should call my theorem the Theorem That States the Obvious but it does help to think in terms of Transparency and Opacity rather than in terms of Optimization.  I have found that as Naturality declines Transparency and Opacity may bounce up and down, fighting for dominance.  By the time Naturality is reduced to 0 (or near-zero) either Transparency or Opacity is usually dominant (so there is a power law at work even in this model because they are never equal when an iteration of S completes).

In Web marketing Transparency usually wins out in query spaces, Website design preferences, and backlinking practices simply because serious marketers don’t want to invest resources in being opaque; however, Opacity usually wins out in social media participation and search reputation management because people don’t want to be “caught” or “visible”.  These are very real divisions in the transformations of Naturality although I have no better explanation for why the dominance plays out this way than what I have written here.

Optimization may be defined as the collective organization of resources and strategies with the goal of changing the natural state of the system. A hunter sharpens its hunting ability or finds a better hunting ground to improve its chances of survival, thus assuring it an active role in the system; competing hunters duel each other until one leaves the system or one assumes a new role in the system. Prey, in order to survive, may turn to hunting – thus competing with the hunter(s) already in the system. That’s a very simple metaphor, but it describes market economics rather succinctly, in my opinion.

Chronocity Sets, and the objects within them, have distinct properties and limits. For example, a Chronocity Set never reverts to a previous state: Websites and links might change, search engines might revert their algorithms to earlier versions, and so on; nonetheless, Naturality does not return to a prior state, though S may reassemble itself into something that looks like a prior state (but with different components). Hence, Chronocity Sets are Morphing Sets – that is, objects can leave the sets or enter them at any Time Slice, and the sets can change in size. Any object that joins the collection is Natural if and only if it would have been similar to one or more objects in the set at Time Slice 0. Otherwise the new object is unnatural and must be classified as either Transparent or Opaque.
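Here is a sketch of that membership rule. The function name is mine, and the similarity test and transparency test are deliberately naive stand-ins supplied by the caller, since in practice those tests depend on what kind of object is being measured:

```python
def classify_newcomer(new_obj, baseline, similar, is_transparent):
    """Classify an object that enters a Morphing Set after Time Slice 0.
    It counts as Natural only if it resembles at least one object that was
    in the set at Time Slice 0; otherwise it is Transparent or Opaque."""
    if any(similar(new_obj, old) for old in baseline):
        return "Natural"
    return "Transparent" if is_transparent(new_obj) else "Opaque"

# Hypothetical use: the Time Slice 0 objects are hand-coded HTML pages; the newcomer is a blog.
baseline_2000 = [{"kind": "static-html"}, {"kind": "static-html"}]
newcomer = {"kind": "blog", "discloses_owner": True}
print(classify_newcomer(
    newcomer,
    baseline_2000,
    similar=lambda a, b: a["kind"] == b["kind"],
    is_transparent=lambda o: o.get("discloses_owner", False),
))  # "Transparent"
```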

Despite these chaotic properties there are finite limits to the size of Morphing Sets – at least, so far as I have seen. That raises an interesting question: is it possible to have an unbounded Morphing Set in a closed system? I suspect that would run afoul of some well-established principle or axiom (similar in form to Entropy or the Conservation of Mass); in any case, a bounded (morphing-capable) set cannot exceed the available resources of a closed system. The number of possible compositions of a Morphing Set – that is, which objects it contains – at any available Time Slice is finite.
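The counting behind that last statement is elementary, assuming (as the closed-system view requires) that there is a finite universe U of measurable objects at any moment:

\[
\#\{\text{possible compositions at any one Time Slice}\} \;\le\; 2^{|U|},
\qquad
\#\{\text{possible trajectories across Time Slices } 0 \ldots S\} \;\le\; \bigl(2^{|U|}\bigr)^{S+1}.
\]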

It is necessary to use Morphing Sets to reflect the limitations of measurement for a closed system like the Internet because no matter how hard we try we cannot determine its real boundaries.  However, I do not use Morphing Sets to introduce new things into a system – I am simply measuring a subset of the entire system at a given time point.  To put it another way, my work is not concerned with how the stuff got onto the Internet; I’m only dealing with what I find there, even if it’s relatively new.  Once it’s measurable it is included in the closed system.

I suppose you could call this the Paradox of the Internet: it is both a closed system and a growing system (but the paradox is easily resolved by asking whether you are looking at a single moment or a span of time in the Internet’s history). Or maybe it’s just as simple as saying that the Internet hasn’t yet filled up its entire potential space, which is itself ever-changing as we alter the capacity and complexity of the infrastructure that supports the Internet.

Returning to the matter of measurement, I may assign null values to vector points for new objects to normalize data for large transformations. In other words, I pretend these objects were always in the set from Time Slice 0 – I just treat them as though they were inconsequential parts of the landscape, so to speak.  That’s really only for the convenience of dealing with large sets of rows of data in spreadsheets.
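In spreadsheet terms the padding looks something like this sketch, using pandas only because it supplies the empty cells for me; the object names and slices are invented:

```python
import pandas as pd

# Per-object state vectors; "site-c" only enters the set at the 2011 slice.
observations = {
    "site-a": {"2010": "N", "2011": "N", "2012": "T"},
    "site-b": {"2010": "N", "2011": "O", "2012": "O"},
    "site-c": {"2011": "N", "2012": "N"},
}

# Rows = objects, columns = time slices; slices an object never occupied become NaN,
# i.e. the newcomer is treated as an inconsequential part of the landscape at Time Slice 0.
table = pd.DataFrame(observations).T.reindex(columns=["2010", "2011", "2012"])
print(table)
#          2010 2011 2012
# site-a      N    N    T
# site-b      N    O    O
# site-c    NaN    N    N
```

The library is only a convenience here; any spreadsheet does the same thing when you leave the early cells blank.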

An object in the set cannot be another set; it must be atomic. To deal with the complexity of things I have developed a theory of data superstructures, which essentially says that an object becomes subsumed into any superstructure of which it is a part. The object exists discretely only at its own level of complexity. Hence, any collection of complex objects becomes a superstructure if and only if the collection assumes properties and behaviors that are distinct from the properties and behaviors of any of the objects in the collection. An atomic object exists only at one level of analysis: it is either a collection of atomic objects from a prior, less complex level of analysis, or it is an intrinsic part of an atomic object at a subsequent, more complex level of analysis. This implies that complexity may be infinite in either direction, though perhaps there are limits to the complexity of data superstructures.

To illustrate this principle, take a lot of shoes and pile them up: the pile does not behave like shoes. You cannot wear the pile. If you add some glue to the mix you create something new (Art, according to some people). The glued-together pile of shoes is a superstructure with its own properties and characteristics, distinct from the properties and characteristics of shoes.

In my August 10, 2011 Science 2.0 article “Reflections of the Realized Imagination” I wrote: “Imagine that every day whole galaxies vanish and whole galaxies appear in different parts of the universe.  Some of these galaxies allow us to look into them; other galaxies obscure their stars from outside observation….In this chaotic real-time universe where things pop into and out of existence other things are morphing.  What was a spiral galaxy yesterday may morph into a protogalaxy – a mass of unshaped gas – today….”  Does that sound familiar?

My work has led me to develop a simple theory of Morphing Sets because that is the only way I can make sense of the chaos in the ever-changing data I observe. And yet merely looking at the sets at any given time limits my ability to analyze what is happening, so I divided the sets into Natural and Optimized (both Transparently and Opaquely so) things, and then I measured the changes in Naturality across timelines to determine rates of change and predict where the trend lines correlate with each other.
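That comparison step, the Deep Web Interferometry mentioned earlier, reduces in its simplest form to correlating per-slice rates of change. Here is a sketch with two invented Naturality trend lines; it uses plain Pearson correlation, nothing fancier, and the function name and numbers are my own:

```python
from statistics import correlation  # Pearson's r; requires Python 3.10+

def rates_of_change(series):
    """Per-slice change in Naturality (simple first differences)."""
    return [b - a for a, b in zip(series, series[1:])]

# Hypothetical Naturality trend lines from two disparate data sources,
# e.g. a query space and a set of backlink profiles, sampled on the same slices.
source_1 = [1.00, 0.85, 0.70, 0.52, 0.40, 0.31]
source_2 = [1.00, 0.86, 0.70, 0.53, 0.40, 0.30]

r = correlation(rates_of_change(source_1), rates_of_change(source_2))
print(round(r, 2))  # about 0.96 here: the two sources are declining nearly in step
```

A sustained, high correlation between the rates of change of genuinely disparate sources is the kind of signal I look for.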

All of which led me to realize that Naturality is not a static thing. It changes over time. It also grows more complex as you pull back your point of view and look at ever larger sets of data, and given enough data a closed system seems to reorganize itself into more complex, higher-order systems or data superstructures.