In a network of reputation management systems, one assumes that authority is the dominant metric for choosing a service. Yet over time, there is a tendency for all services to reach a rough equilibrium in terms of information goodness (validatable content and high correctness scores). It is akin to the problem of picking a newscast in any medium: if all reputations for correctness are roughly equal, likability plays a strong role in reputation management.
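A minimal sketch of that selection rule, with all names and thresholds hypothetical: rank services by correctness, but once the leaders sit within a small epsilon of one another, let likability break the tie.

```python
from dataclasses import dataclass

@dataclass
class Service:
    name: str
    correctness: float  # verified-correctness reputation, 0..1
    likability: float   # social/affective reputation, 0..1

def pick(services, epsilon=0.05):
    """Choose by correctness; fall back to likability near equilibrium."""
    best = max(s.correctness for s in services)
    # Every service within epsilon of the leader counts as "roughly equal".
    contenders = [s for s in services if best - s.correctness <= epsilon]
    return max(contenders, key=lambda s: s.likability)

if __name__ == "__main__":
    pool = [Service("A", 0.92, 0.40),
            Service("B", 0.91, 0.85),   # near-equal correctness, more likable
            Service("C", 0.70, 0.99)]
    print(pick(pool).name)  # -> B: likability decides once correctness converges
```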
But if mashups are the brave new application world, then the dominant metric is sustainability of the service over some lifecycle of use by similar and different applications. The system's tendency to evolve orbits around deep attractors (as in deep pockets) has not changed, so the naïve but roughly accurate long-tail economics continues unabated.
Note that while XML is agnostic to these dimensions of the hyperspaces of web information domains, it becomes doubly difficult for any domain-specific language to emerge unless it scores high for sustainability. I suspect this is a contributor to the failures of domain-specific browsers that bundle all semantics into a strongly-coupled package of components. Their niche is self-limiting and, in the evolutionarily stable strategy (ESS) sense, not immune to the infectious, loosely-coupled confabulations that attack from the edges of the information ecology. This explains why many weak semantics and apparently flawed strategies for creating applications can initially thrive in the face of better-constructed and well-funded projects. If the locals have a rapid replacement rate, they cannot be unhosted.
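A toy replicator-dynamics sketch of that ESS argument, with the payoffs and rates invented for illustration: a loosely-coupled strategy with a lower per-use payoff but a much faster replacement rate can still displace a better-engineered, tightly-coupled incumbent.

```python
def step(x, payoff_loose, payoff_tight, rate_loose, rate_tight):
    """x = population share of the loose strategy; fitness = payoff * replacement rate."""
    f_loose = payoff_loose * rate_loose
    f_tight = payoff_tight * rate_tight
    mean = x * f_loose + (1 - x) * f_tight
    return x * f_loose / mean  # discrete replicator update

x = 0.01  # the loose strategy starts as a tiny population at the edges
for gen in range(60):
    x = step(x, payoff_loose=0.4, payoff_tight=0.9, rate_loose=5.0, rate_tight=1.0)
print(round(x, 3))  # -> ~1.0: rapid replacement wins despite weaker semantics
```

Under these (assumed) numbers the incumbent offers more than twice the payoff per use, yet the faster replicator takes over the whole population within a few dozen generations.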
Intelligence doesn't scale. Yet as a service, if it is tightly focused and sustainable, it cannot be unhosted until it is commoditized. At that point, its orbit crosses Lagrange points where small perturbations can drive it into connection with other systems.
Savvy systems defend resources with resources until those resources become loss leaders.