The “tyranny” of content and the processes of concentration
By way of premise, these first lines should acknowledge the extraordinary meaning that the Net has for a large part of the world's population: a treasure of great cultural and intellectual vivacity, of incommensurable usefulness, expressiveness and intelligence. After this acknowledgement, however, we need to reflect on how this body of content is physically supported, and on how network structures are reshaping themselves to respond in a sustainable way to its evolution.
Given the numerous factors involved in the process and their “environmental” dynamics (on the internet, actions acquire exponential speed), it is never easy to identify the right trends, above all from a quantitative point of view. We can nevertheless see many signals of certain processes of consolidation and of a passage to a “2.0” phase more focused on Software as a Service, with utilities covering an extended range from entertainment for leisure time to business tools for the labour market: products delivered immediately (from the cloud) to a mobile or fixed hyper-connected user.
But what kind of reshaping are physical infrastructures and the companies that manage them undergoing? What is the new state of the art of the internet “core”? We try to draw a description by exploring, among other sources, a recent study by the ATLAS Internet Observatory: analysing the last two years of internet traffic, its 2009 Annual Report describes the acceleration of some “core” logics.
Our attempt, then, is simply to outline, at a high level of abstraction, what lies behind the sockets and wireless channels we plug into, in order to understand in a systemic way the meaning of these latest changes.
A systemic interpretation of the internet
We certainly live in an epoch in which human and social destinies and desires are intermingled with systemic configurations made up of a great number of technical-organizational elements. This is a central issue for social systems theories, which have studied the logics of their evolution (to be clear, “evolution” here has no positive connotation: it simply indicates the sum of successive changes). So it is sometimes useful to look to these theories for indications, or at least for cues, about this kind of phenomenon.
One of the most interesting systems theories was elaborated by the sociologist Niklas Luhmann during the 1970s and 1980s with the aim of approaching complexity.
The complexity of modern societies is a consequence of the restless processes of functional differentiation by which systems dedicated to resolving certain needs are institutionalised. This dynamic starts when a structure has the possibility to differentiate itself from a more complex environment. In brief: the new structure internalises (resolves) a part of complexity, finding in this work of reduction its own functional legitimation, a pre-requisite for establishing its own autonomy through self-referential codes and procedures. The division and relationships between environment and systems reach only a temporary balance, because the established solution is one actualisation among many alternative possibilities. In effect, the whole flows in an ocean of contingency and faces inevitable phases of re-adaptation, driven by restless change and the creation of new opportunities to which it is inevitably exposed: it lives in constant noise (contradictory and undecodable information), which is also in part a product of the refraction of systems and subsystems that do not respond to a unitary principle.
To be clear, the distinction between system and environment is relative, because each system works as environment for the others, and vice versa. It is a way of bearing their reciprocal complexity, even if functional re-orientation happens indirectly (every “institution” is autonomous) and demands a great sense-making capacity to feel and elaborate signals from internal and, more importantly, external “noise”, on the basis of existing structural couplings.
Luhmann’s systems theory has great heuristic value for explaining and exploring the logics of evolution of structures responding to social needs, even if, we think, it should be updated in light of the advent of the network society. For example, structural re-orientation now has more possibilities, given the increasing processes of inter-communication.
Here, more pragmatically, we confine our description to what has happened to the internet core in recent years, with systemic theories as a help in contextualising the incoming changes. In fact, there is much evidence of how functional policy drives structural modifications in the attempt to resolve complexity, changing and articulating the previous, simpler configurations.
The long phase of the search for connectivity
Our analysis is based on the interests that various agents have in maintaining a complex and expensive network infrastructure, which can exist primarily thanks to the economic contributions of the institutions and people who pay for interconnection and network capacity.
Figure 1 shows the original organization, in which various kinds of linked Internet Service Providers exchange their data traffic through inter-exchange hubs.
ISPs are the hubs to which all network users are connected. Users can be divided into two big categories: End Users and Content Providers. Generally, there is a great disparity in the quantity of data traffic they generate and manage: Content Providers originate most of the data, while End Users handle only a marginal, fractional part of that amount. From an economic point of view, the interconnections among these actors are driven by network externality, namely the reciprocal benefit of being linked.
ISPs typically pay for transit links to reach other ISPs with larger groups of users, while at the same time extending free links towards similar providers through so-called peering agreements. Peering links, however, tend to saturate quickly, and every subsequent capacity upgrade has to meet the interests of both partners. At the same time, the policy on transit circuits is restrictive because of their direct costs. The network can therefore experience traffic congestion in a situation of growing data traffic driven by consumption patterns, with sudden traffic spikes and a preference for sophisticated content.
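To make the trade-off concrete, the toy calculation below compares a usage-based transit bill with the flat cost of a peering port. All prices, capacities and traffic volumes are hypothetical assumptions, chosen only to illustrate why transit circuits are kept restrictive and why a saturated peering link forces an upgrade that both partners must agree to; it is a minimal sketch, not a model of any real ISP’s pricing.

```python
# Minimal sketch of the transit-vs-peering trade-off described above.
# All prices, capacities and traffic volumes are hypothetical.

def transit_cost(peak_mbps: float, price_per_mbps: float) -> float:
    """Monthly transit bill, charged on peak (95th-percentile style) utilisation."""
    return peak_mbps * price_per_mbps

def peering_cost(port_fee: float, peak_mbps: float, port_capacity_mbps: float) -> float:
    """Peering carries no per-bit charge, but the port is a fixed cost and saturates:
    once peak traffic exceeds capacity, another port is needed, and that upgrade
    has to suit both partners."""
    ports_needed = -(-int(peak_mbps) // int(port_capacity_mbps))  # ceiling division
    return ports_needed * port_fee

if __name__ == "__main__":
    peak = 4_000  # Mbps of peak traffic towards a given neighbour (hypothetical)
    print("transit :", transit_cost(peak, price_per_mbps=3.0))           # usage-based
    print("peering :", peering_cost(port_fee=1_500, peak_mbps=peak,
                                    port_capacity_mbps=10_000))          # flat per port
```

Under these invented numbers peering is the cheaper option until the port fills up, which is exactly the point at which the bilateral upgrade problem described above appears.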
In this ecosystem, shaped by economic and technical factors, content providers, increasingly focused on the quality of distribution of audio-visual materials, are very attentive to performance. One solution elaborated to limit network bottlenecks is to diversify links, splitting capacity across several ISPs (multi-homing): a solution that increases their contractual strength (there are many alternative ISPs) and their control over performance, in a scheme that also improves availability by reducing single points of failure. Moreover, content providers replicate and decentralize their content using Content Delivery Network services. CDN providers build network structures closer to end users, offering ICT platforms (located at the edges of the network) to host web content. Once content is replicated, it is ready to be distributed on the basis of geographical location: the user’s request is dynamically routed towards the nearest server, avoiding the bottlenecks that could be experienced if the content arrived from the origin web server. Using CDN services, ISPs can also avoid extending multi-homed circuits to all other ISPs.
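The routing idea can be sketched in a few lines: given a set of replica locations, the request is served from the closest one rather than from the origin server. The edge cities, their coordinates and the purely geographic distance criterion below are illustrative assumptions; real CDNs combine DNS redirection, anycast and load measurements rather than geography alone.

```python
# Illustrative sketch of the "route the request to the nearest replica" idea behind CDNs.
# Edge locations and coordinates are invented for the example.
import math

EDGES = {
    "frankfurt": (50.11, 8.68),
    "london":    (51.51, -0.13),
    "milan":     (45.46, 9.19),
}

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(h))

def nearest_edge(user_latlon):
    """Pick the replica closest to the user instead of the origin server."""
    return min(EDGES, key=lambda name: haversine_km(user_latlon, EDGES[name]))

print(nearest_edge((41.90, 12.50)))  # a user in Rome is served from "milan"
```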
To summarize, there was a long “1.0” phase driven (principally) by a connectivity principle: being connected was the most important thing. In this phase connectivity and traffic prices tended to become a commodity, and the configuration of actors became quite stratified; meanwhile, the profit margins on standard-quality core traffic have been falling dramatically.
The sophistication of transport
The “2.0” phase is now driven by the pursuit of better service quality, which is changing the “nature” of transport. With the growing sophistication of services and materials, the functions of connection and traffic conveyance undergo a de-commodification based on a series of combined Information Technology services. These value-added services increase the quality of data delivery, changing transport techniques by involving processes and resources of computing and storage. The focus on quality is a consequence of the interest in content, since connectivity is now taken for granted: a context that modifies commercial and engineering models and redefines the roles, architectures and actors of this ecosystem. (Added services and higher-quality links should compensate for the lower profit margins on data traffic; at the same time, in a regime of content aggregation, the desire to enter the advertising business grows.)
On the other side, these trends are raising the level of disintermediation between network providers and content providers, which can now directly manage the connections with end users, even by building their own resources.
But another, less surprising tendency is the shrinking number of providers controlling the Net. Two years ago, 50% of global internet traffic originated from about 5,000 companies; now the group involves only 150 of them. Moreover, only 30 of these, including Facebook, Google, and Microsoft, originate 30% of the traffic (6% by Google alone).
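The concentration measure behind these figures can be illustrated with a toy computation: rank traffic origins by volume and count how many are needed to cover half of the total. The traffic shares below are invented for the example, not ATLAS data.

```python
# Toy illustration of the concentration measure quoted above: how many origins
# are needed to account for half of the total traffic? Shares are invented.
traffic_by_origin = {"A": 40, "B": 25, "C": 15, "D": 10, "E": 6, "F": 4}  # arbitrary units

total = sum(traffic_by_origin.values())
covered, count = 0, 0
for origin, volume in sorted(traffic_by_origin.items(), key=lambda kv: kv[1], reverse=True):
    covered += volume
    count += 1
    if covered >= total / 2:
        break

print(f"{count} origins carry {covered / total:.0%} of traffic")  # -> 2 origins carry 65% of traffic
```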
Finally, these are new findings that confirm the cues we found when analysing the “cloud computing” phenomenon, reminding us to pay sufficient attention to the new configuration of ICT means. In recent years our lives, increasingly mixed with and embedded in systems, have been enriched by ICTs, thanks to massive social adoption favoured by the positive feedback of various enablers such as the availability, flexibility and modularity of resources, low costs, user-friendliness, externalities, etc. Nevertheless, this incredible and delicate socio-technical alchemy, built and managed by a large, heterogeneous group of actors through ideal and practical policies that have guaranteed a certain balance, is now facing deep changes in terms of structures, values and relationships.
Forty years after…
On the occasion of its fortieth anniversary we add a last consideration. The internet is, and will remain, a work in progress, even if the issue now concerns the extension and depth of its freedom of purpose. The problem regards preserving the ideological “heart” within the engineering potential (the internet “core”): the guarantee of openness, which is also the product of a steady and wise grasp on the technological medium by a plurality of radically “independent” people.
P.S.
As well documented by many historians, the internet is the result of a gestation and of developments that prospered, over four decades, on a dialectic involving military needs and counter-cultural ideologies, command-and-control issues and anti-hierarchical, decentralized impulses. Designed in the post-Sputnik period to support military-oriented research and implementations, in practice the project engaged young engineers, scientists and undergraduate students raised in the “utopian” culture of 1960s and 1970s California, where large groups of people believed in “open world” ideas, hailing anti-war movements and new forms of community and philosophies of sharing.
Bibliography
ATLAS Internet Observatory, 2009, 2009 Annual Report.
Hau, T., Wulf, J., Zarnekow, R., Brenner, B., 2008, “Economic Effects of Multi-Homing and Content Delivery Networks on the Internet”, in Proceedings of the 19th ITS European Regional Conference, Rome.
Luhmann, N., 1995, Social Systems, Stanford University Press, Stanford, CA (orig. 1984).