Questions about clouds
Since the twentieth century, ICTs have shown their ability to spark change in almost every sector of society. At the same time, we have had rather imprecise ideas about the resources and tools underpinning their power, a fact that has helped a mythological aura grow around them. On the other hand, we have become increasingly involved with these machines, even if only as simple users. Indeed, ICTs reached a mass audience only with the internet of the 2000s, both infiltrating their tools and languages into everyday life and demanding a degree of technical literacy.
More precisely, the democratisation of computers, in terms of real availability and of the ability to manage and program them easily, began for a growing group of people in the 1980s. They started a process that spread ICT knowledge and resources and withdrew them from public and private monopolies. The diffusion of personal computers proved an indispensable prerequisite for what we now call “software culture”. Initially conceived as mere data-processing platforms, networked personal computers became part of a communicative territory of proposal, elaboration and aggregation of individual and group projects, where hardware and software procedures and materials continuously combine to feed new forms of language and culture.
As the lawyer and economist Yochai Benkler has well explained, technological and legal constraints on the freedom and ability to manage and program ICT resources are decisive in shaping the possibility of a new economy and of new forms of communication and expression. What was once possible only by centralizing competence and resources through huge capital can now be obtained by re-aggregating the same fragmented, distributed components in a cooperative way. This process redefines the terms of production and consumption in the many fields where data processing, storage and transmission are essential. These changes have deep effects on all the centres of power whose influence rests on the ability to accumulate huge resources in order to build and manage such services. Among them are the media, a vital sector for its function of creating and maintaining the circulation of ideas and news.
Re-storation
At the beginning of the new millennium, a series of continuous technological advances is changing the very way ICT products are conceived, designed and delivered. The extraordinary growth of data capacity on wireless and wired systems, at both the core and the edge of networks, is driving an enormous reorganization of ICT resources and services.
The increasing power of network links brings not only instantaneous data exchange but also a new organization of basic ICT components, in which computers and storage are placed in large, remote silos while remaining easily reachable by users worldwide (even mobile ones). As in the case of the electricity supply, where centralized plants replaced autarchic user generators once the grid was available (a historical analogy frequently recalled), computers and memories are now able to dispatch “information energy” through our broadband sockets. In short, these new architectures achieve a complete dematerialization of computer services, which become not only negotiable but fully tradable through the network: the apotheosis of a pure economy of access (Rifkin, 2000).
Paradoxically, even the computer, the virtualizer par excellence, has been subjected to the destiny Marx intuited for many devices of the modern realm, in which, under the incessant work of the processes of abstraction typical of complex societies, “all that is solid melts into air” (Marx, 1848; cf. Berman, 1982).
In effect, the new philosophy of computing takes the cloud as its main metaphor. The image evokes the lightness and ubiquity of services delivered by a distributed architecture that requires no particular individual investment in knowledge. It promises to free us from ICT chores, at a fraction of the total cost of ownership, so that we can pursue our real interests. The new paradigm has many impacts: on the user side, in the way services and resources are delivered and used; on the infrastructure side, in the ownership, organization and management of hardware and software assets; on the service side, in the creation, development and operation of applications.
However, this development raises many questions about a return to an old, centralized architecture, a path very different from the novelties (first of all, openness and the freedom to build services) brought by the Net. To be clear, we are not victims of a pre-organized plan to restore old dominances. Rather, we are living in a condition in which many ICT actors try to consolidate their own positions by exploiting some global trends: the economic crisis, the huge success of the internet in terms of users, groups and business expectations, the natural dynamics of monopolies based on network externalities, and the scarce success of business models other than advertising.
On the other hand, critical questions come with the growing acknowledgment of the intrinsic value that digital technologies hold for every human sphere. As the very inventor of the web, Tim Berners-Lee, noted in a recent and alarmed interview about the logic of walled gardens and the asymmetry between centre and edges, the success of web applications risks undermining the internet itself, even though the web is only one of its possible applications.
In a way, it is reasonable to see current developments as the logical next step of the web’s typical client-server architecture, finally extended beyond the classic linking and publishing of static content to cover basic computing and storage needs. We can now see the major ICT players pursuing two kinds of strategy: centralizing computing and storage, and implementing every technique capable of streamlining the rendering and processing of complex real-time formats (text, image, audio-video) flowing to user terminals.
There is a huge rush to build user-friendly, neutral graphical interfaces that work on different personal devices and are ready to receive the services the Net’s super-computers deliver: news, search, editing, computing, maps, remote storage folders, and so on. Preferably, they should consume few resources on personal devices, since the main workload and complexity have been transferred to the centre, which manages the hardware and software, updates and maintains data and applications, and even bears the delicate task of creating and developing new features.
Yet it would be reductive not to mention the role played by the demand for, and supply of, services that are increasingly articulated, ubiquitous and global.
Dis-semination
Nowadays almost every kind of information has already ended up, or is about to end up, in some sort of cloud. This is certainly true for consumers, while companies are more wary of outsourcing their data and applications for privacy reasons. Yet being on a social network means using cloud technologies, as does having a mail account such as Gmail, Yahoo or Hotmail, micro-blogging on Twitter, blogging on WordPress, sharing video on YouTube or DailyMotion, posting or viewing photos on Flickr, rating on Yelp or TripAdvisor, using documents and applications such as Google Docs, social bookmarking on Delicious, or trading on e-commerce platforms such as eBay.
The American research institute PEW, which specializes in surveying the Net activities of American citizens, recently issued a report on cloud technology involving many ICT experts (Pew, 2010). While underlining the effectiveness and sophistication of new services in combining remote and local resources, PEW notes how the core of future services will inevitably shift toward an external centre that works as an “internet operating system”. The fact that applications hosted on our personal devices use local resources (computing, data, sensors) is only a small part of the process. «It’s easy to think that it’s the sensors in your device – the touch screen, the microphone, the GPS, the magnetometer, the accelerometer – that are enabling their cool new functionality.
But really, these sensors are just inputs to massive data subsystems living in the cloud. When, for example, as an iPhone developer, you use the iPhone’s Core Location Framework to establish the phone’s location, you aren’t just querying the sensor, you’re doing a cloud data lookup against the results, transforming GPS coordinates into street addresses, or perhaps transforming WiFi signal strength into GPS coordinates, and then into street addresses.
When the Amazon app or Google Goggles scans a barcode, or the cover of a book, it isn’t just using the camera with onboard image processing, it’s passing the image to much more powerful image processing in the cloud, and then doing a database lookup on the results. Increasingly, application developers don’t do low-level image recognition, speech recognition, location lookup, social network management and friend connect. They place high level function calls to data-rich platforms that provide these services» (The State of the Internet Operating System, 2010).
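To make the division of labour described in this passage concrete, here is a minimal sketch in Python, assuming a purely hypothetical geocoding endpoint and response format (not Apple’s Core Location nor any real vendor API): the device contributes only raw coordinates read from its sensor, while turning them into a street address is a remote lookup against a data-rich platform.

```python
# Minimal sketch of a "high level function call to a data-rich platform".
# The endpoint URL and the "address" response field are hypothetical placeholders.
import json
import urllib.parse
import urllib.request


def reverse_geocode(lat: float, lon: float) -> str:
    """Ask a (hypothetical) cloud geocoding service to turn raw GPS
    coordinates, read locally from the device's sensor, into an address."""
    query = urllib.parse.urlencode({"lat": lat, "lon": lon})
    url = f"https://geocoder.example.com/v1/reverse?{query}"  # placeholder host
    with urllib.request.urlopen(url) as response:
        payload = json.load(response)
    # The device never holds the mapping data itself, only the final answer.
    return payload["address"]


if __name__ == "__main__":
    # Raw sensor input: this is all the handset actually knows...
    lat, lon = 45.4642, 9.1900
    # ...while the meaningful answer comes back from the cloud lookup.
    print(reverse_geocode(lat, lon))
```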
The implementation of new services thus resembles a patchwork linking different and independent domains, once these decide not to close their doors (the walled-garden policy) and to share the activities and data generated by their users. Services of this kind are possible only through both federation and powerful data centres. Yet the lack of individual control over the whole architecture remains the main open question. The risk has been flagged many times, but nobody thought it could materialize around very popular services backed by big players. In the high-tech sector, crises can arrive suddenly, whether for competitive reasons or because of the failure of business models alternative to the classic “free” service supported by advertising.
The probability of failure is very high for small companies not integrated into larger, synergetic organizations. But there is great surprise when failure involves big commercial entities, even though nothing can prevent a private decision to decommission a social network platform in hard times.
«The new fate of Delicious and its sister Yahoo services yields a lesson about cloud computing that is likely familiar to anyone who tracked the rise of SaaS (software as a service) a few years back: If you decide to turn to a third party to host a service for your business, you run the risk of your provider pulling the plug on that service at any moment… In this case, said lesson is all the more jarring in that we aren’t talking about a fly-by-night provider no one has ever heard of. We’re talking about Yahoo, a well-established Internet denizen» (Yahoo’s offloading of Delicious a reminder of cloud risks, 2010).
Another problem concerns monopolies, which in a classic network system, where network externalities and economies of scale matter (the benefit of being connected to one another through services priced at marginal cost), have a good chance of success. Despite the variety and multitude of actors, monopolies seem to affect even the internet, at least at two specific points in its value chain: at the level of connectivity, gathering the users who want to access the internet, and at the level of content, offering value-added services capable of attracting huge audiences without worrying about network resources, given the internet’s ability to support end-to-end services (the over-the-top logic).
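As a rough illustration of why such externalities favour concentration (a standard textbook approximation, not a figure from the sources cited here), one can compare how value and cost scale with the number of users n:

\[
V(n)\;\propto\;\frac{n(n-1)}{2}\;\sim\;n^{2},
\qquad
C(n)\;\approx\;C_{0}+c\,n,
\]

so the margin between value and cost grows fastest for the largest platform, which therefore tends to attract the next user as well.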
A recent survey finds that 30 internet companies generate 30% of global internet traffic. Only two years ago, around 5,000 web companies were needed to generate 50% of Net traffic; the same share of traffic is now handled by 150 companies (Atlas Internet Observatory 2009 Annual Report, 2010).
The current situation can be linked to the success of cloud architecture, because behind these high levels of data traffic lie big web farms on the server side. Maintaining and expanding new services requires huge investments: the latest data centres built by Microsoft or Apple, for example, have budgets of billions of dollars. Moreover, the current economic pressure is so high that analysts forecast many mergers among indebted providers that risk falling short of this mass-market scale (The danger of the coming ‘big cloud’ monopolies, 2010).
Google story
We end these brief reflections by trying to better understand the genesis and force of the new forms of concentration which, however suddenly they have become evident, have had plenty of time to grow. Google’s story will be our guide. In effect, Google represents the ideal type of the network computing philosophy, having developed since the early internet both ICT knowledge and infrastructures conceived and implemented according to a distributed and, at the same time, unitary logic. It began by collecting and indexing the ever-increasing amount of internet content in order to provide a fast search service.
By building its search functions Google gained a vast audience, capturing large shares of the online advertising market, by now the established business model of the internet as well as of many cultural industries. Moreover, the ability to organize, coordinate and manage a huge amount of hardware and software resources in an interrelated and modular way also served to develop and support other new services, such as web mail, maps, photo sharing, news, video, docs and storage, improving the economies of scale of its infrastructure. With such a panoply of resources Google continues to feed an unstoppable cycle of innovative projects, advancing even by the simple method of trial and error; in the words of the technologist Clay Shirky, who speaks explicitly of quality growing out of quantity, by the method of «publish, then filter» (2008).
While we were growing accustomed to the simplicity of its portal interface, Google was building a powerful and sophisticated network of infrastructures around the world. Using the same basic technologies as ordinary users (though its PCs run only open source software, to avoid royalties), it has assembled over a million servers packed into many networked silos distributed across the globe, strictly protected by a non-disclosure policy for marketing and security reasons. The IT expert and critic Nicholas Carr draws the parallel with the production of electric power: like modern nuclear power stations, these super-computers pump data and applications to millions of houses and offices (2008a). Companies and users find it convenient to have cloud services without bearing the total cost of owning hardware and software, paying for actual use at around a tenth of the cost.
For Carr, the centralization of ICT resources recalls the end of autarchic production that came a hundred years ago with the distribution of electricity through the grid. The big plants of a few electrical companies, operating on a large scale, succeeded in supplying energy paid for by consumption (2008b). Ready to extend distribution and consumption to households as well, and setting off a growing spiral of social utility, the big plants offered energy at very low incremental cost. Indeed, the parallel between electrical energy and information as the propellant of their respective ages is an effective metaphor for our post-industrial era.
Of course, as with every epochal passage, there are many worried thoughts about how the changes will unfold, because the reshaping involves not only industry and trade but also «entertainment, journalism, education, even politics and national defence. The shock waves produced by a shift in computing technology will thus be intense and far-reaching. We can already see the early effects all around us – in the shift of control over media from institutions to individuals, in people’s growing sense of affiliation with “virtual communities” rather than physical ones, in debates over the value of privacy, even in the growing concentration of wealth in a small slice of the population» (Carr, 2008a).
As these last considerations show, the analogy works quite well but seems too limited. In effect, information, with its symbolic nature, is never neutral. It carries a power that is expressed at every phase of its life, marking and shaping ever more deeply the life of things, people, groups and communities in a social system where relationships at a distance multiply and goods are designed, produced and consumed with and by knowledge-based networks and devices.
Bibliography
Atlas, 2010, Atlas Internet Observatory 2009 Annual Report.
Benkler, Y., 2006, La ricchezza della Rete. La produzione sociale trasforma il mercato e aumenta le libertà, Milano, Egea, 2007.
Berman, M., 1982, L’esperienza della modernità, Bologna, Il Mulino, 1985.
Carr, N., 2008a, “A revolution is taking shape”, in The Financial Times, 29 January.
Carr, N., 2008b, Il lato oscuro della rete. Libertà, sicurezza, privacy, Milano, Etas.
Marx, K., Engels, F., 1848, Manifesto del partito comunista, Milano, Mondadori, 1978.
PEW, 2010, The future of cloud computing, 11 June.
Rifkin, J., 2000, L’era dell’accesso. La rivoluzione della new economy, Milano, Mondadori.
Shirky, C., 2008, Uno per uno, tutti per tutti. Il potere di organizzare senza organizzazione, Torino, Codice, 2009.
“The State of the Internet Operating System”, 2010, O’Reilly Radar, 29 March.
“The danger of the coming ‘big cloud’ monopolies”, 2010, InfoWorld, 20 October.
“Yahoo’s offloading of Delicious a reminder of cloud risks”, 2010, InfoWorld, 17 December.