Limits, enterprises, and accountability of cultural software
A recent ad campaign, promoted by UN Women and created to warn about the worldwide spread of sexist and discriminatory opinions about women, was based on Google’s search functions.
More precisely, it focused on the suggestions offered by the autocomplete function once words such as “women should …” have been typed in. Based (supposedly) on frequent search operations and recognized textual combinations in internet content, the algorithm surfaced all the stereotyped ideas circulating in the world. In the ads, women are shown with a sort of gag made of Google’s suggestions which, covering their mouths, have the effect of silencing their true voices and personalities. The success of the campaign (credits: Memac Ogilvy & Mather Dubai) in revealing such prejudices in such an incisive and oddly timely way, if we think of how sophisticatedly our post-modernity is supported by technologies of thought, can be measured by the huge debate on Twitter (hashtag: #womenshould) but also by its imitations: see the Australian initiative Racism. It Stops With Me.
In the past this online space hosted a reflection on the importance of language in structuring the human being through a continuous process of interaction with the environment, and not only in symbolic, cultural and ethnic terms. Among other arguments, it was said that “the speaking being, unique among the diverse species, receives an effectuality made of speech, generated as one who is ‘spoken’ even before being speaking”. The concern about what we are left with in terms of constituted/spoken materials is well justified in the face of these new prostheses of mediation, which intervene before, or even in the very moment, we try to formulate a thought. Anna Jobin is a Swiss scholar particularly interested in the evolution of social phenomena around the adoption of online technologies (just-in-time sociology). She notes how search suggestions can immediately push us, beyond our intentions, along the proposed paths, feeding at least a double risk: 1) the attribution of veracity; 2) the reinforcement of stereotypes.
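Jobin’s double risk follows directly from how frequency-driven completion works: a completer that ranks past queries by popularity can only echo what has been asked most often. A minimal sketch of that mechanism (the query log and the ranking logic here are invented for illustration; Google’s real system is vastly more complex):

```python
from collections import Counter

# Toy query log: a completer trained on it will simply echo
# the most frequent past completions for a given prefix.
query_log = [
    "women should be equal",
    "women should vote",
    "women should vote",
    "women should vote",
]

def suggest(prefix, log, k=2):
    """Return the k most frequent logged queries starting with prefix."""
    matches = Counter(q for q in log if q.startswith(prefix))
    return [q for q, _ in matches.most_common(k)]

print(suggest("women should", query_log))
# → ['women should vote', 'women should be equal']
```

Whatever continuation happens to be most frequent is what every subsequent user sees first, which is exactly the reinforcement loop Jobin describes.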
On the other side, Google does not explain its algorithms well enough and, moreover, it would have trouble denying that in some countries its code works together with that of local authorities intent on controlling the free expression of their populations. Google hides behind the objectivity of algorithms, but can this position exempt us from responsibly finding solutions?
“If we, as a society, do not want negative stereotypes (be they sexist, racist, ablist or otherwise discriminatory) to prevail in Google’s autocompletion, where can we locate accountability? With the people who first asked stereotyping questions? With the people who asked next? Or with the people who accepted Google’s suggestion to search for the stereotyping questions instead of searching what they originally intended? What about Google itself? … Of course, algorithms imply automation. And digital literacy helps understanding the process of automation … but algorithms are more than a technological issue: they involve not only automated data analysis, but also decision-making.” (Jobin, 2013).
The conclusion is to go beyond the binary of intentionality versus complete innocence and to approach the topic as a complex issue, also because, as Jobin asks, who is in charge when algorithms are in charge? After the endless spying affair organized by the NSA, in which the internet was the target of an indecent raid to seize every sort of data from every kind of application thanks to automatable (and therefore automated) technologies, the topic of ethical responsibility is nervously emerging in the engineering arena. The pervasiveness of software applications in the daily lives of billions of people is starting to require, as in matters of health and justice, not only the skills to build effective and reliable programs but also a preventive ethical examination. Nowadays it is not enough to take for granted that a technological device can be used for good or, unfortunately, for bad.
“Engineers have, in many ways, built the modern world and helped improve the lives of many. Of this, we are rightfully proud. What’s more, only a very small minority of engineers is in the business of making weapons or privacy-invading algorithms. However, we are part and parcel of industrial modernity with all its might, advantages and flaws, and we therefore contribute to human suffering as well as flourishing. … It will be a bright day for our profession when we start producing more engineers who … have the will and the intellectual capacity to engage with bigger questions about the ethics, politics and social ramifications of their inventions.” (El-Zein, 2013).
To get a more precise picture, we should not consider only one angle of the issue just because the media currently spotlight it. A recent study tells us that over 60% of internet traffic is non-human: the Net is full of algorithms working to generate, or to confront, every sort of problem, even to restore some principle of order or utility. For example, removing comments or reviews that were posted to skew genuine attempts to measure services through personal feedback, or to tendentiously steer debates.
“Algorithms are becoming ever more important in society, for everything from search engine personalization, discrimination, defamation, and censorship online, to how teachers are evaluated, how markets work, how political campaigns are run, and even how something like immigration is policed. Algorithms, driven by vast troves of data, are the new power brokers in society, both in the corporate world as well as in government. They have biases like the rest of us. And they make mistakes. But they’re opaque, hiding their secrets behind layers of complexity.” (Diakopoulos, 2013).
We could settle for the transparency reports published by some internet providers, which list the requests from private individuals and public authorities to “regulate”, in some way, their services. But it is better to demand accountability of the algorithms themselves, trying to understand their behavior through reverse engineering.
“Algorithms are essentially black boxes, exposing an input and output without betraying any of their inner organs. You can’t see what’s going on inside directly, but if you vary the inputs in enough different ways and pay close attention to the outputs, you can start piecing together some likeness for how the algorithm transforms each input into an output. The black box starts to divulge some secrets… given the growing power that algorithms wield in society it’s vital to continue to develop, codify, and teach more formalized methods of algorithmic accountability.” (Diakopoulos, 2013).
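The probing strategy Diakopoulos describes can be sketched in a few lines. Here the “black box” is a stand-in function chosen purely for illustration, and the inference step is deliberately naive: vary the inputs, record the outputs, fit a hypothesis, and check it against further observations.

```python
def black_box(x):
    # Stand-in for an opaque algorithm we cannot inspect directly.
    return 3 * x + 7

# Step 1: vary the inputs and record the outputs.
observations = [(x, black_box(x)) for x in range(10)]

# Step 2: piece together a likeness. Hypothesize a linear rule
# from two observations, then test it against all the others.
(x0, y0), (x1, y1) = observations[0], observations[1]
slope = (y1 - y0) / (x1 - x0)
intercept = y0 - slope * x0
consistent = all(abs(slope * x + intercept - y) < 1e-9
                 for x, y in observations)

print(slope, intercept, consistent)  # → 3.0 7.0 True
```

The box never opened, yet its behavior is now described; real audits follow the same loop with far messier inputs and statistical, rather than exact, hypotheses.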
A good example of how we mingle with cultural software concerns our video entertainment. Tim Wu, a well-known analyst, media historian and professor at Columbia Law School, recently wrote about video consumption in the US; indeed, the trend involves people worldwide wherever broadband access has arrived. More specifically, video streaming delivered over the internet through a monthly subscription reinforces the habit of building a highly personalized schedule that can contain an enormous quantity of every kind of video content, now including original, tailored productions made for these services’ own audiences. Although television still commands large audiences, the way these providers, first of all Netflix, organize and offer entertainment marks an epochal change in American cultural consumption. Among the most controversial aspects is the loss of the broad sense of community through which rites of mass consumption involved and united people.
“But it’s not all cause for dismay. Community lost can be community gained, and as mass culture weakens, it creates openings for the cohorts that can otherwise get crowded out. When you meet someone with the same particular passions and sensibility, the sense of connection can be profound. Smaller communities of fans, forged from shared perspectives, offer a more genuine sense of belonging than a national identity born of geographical happenstance.” (Wu, 2013).
Netflix, the American company that has become synonymous with video on demand with its 40 million subscribers worldwide (31 million in the US), impresses Wu with its ability to catch the trend by masterfully using the medium “internet”. Its latest moves into original content production (it is investing hundreds of millions of dollars, against a profit of only 17 million in 2012) are simply the realization of its founding aims. The founders, passionate and expert about movies, wanted to build a high-tech company whose products are the video contents desired by and/or made known to its subscribers, whose most singular idiosyncrasies it also follows and prompts.
“If modern American popular culture was built on a central pillar of mainstream entertainment flanked by smaller subcultures, what stands to replace it is a very different infrastructure, one comprising islands of fandom. ” (Wu, 2013).
For Netflix, implementing a challenging, flexible, global streaming infrastructure was only a first, enabling move, one since imitated by almost all the big media companies. Meanwhile, it worked hard to read, as somebody put it, “the American soul”, putting in place a sort of reverse engineering of Hollywood.
“If you use Netflix, you’ve probably wondered about the specific genres that it suggests to you. Some of them just seem so specific that it’s absurd. Emotional Fight-the-System Documentaries? Period Pieces About Royalty Based on Real Life? Foreign Satanic Stories from the 1980s? If Netflix can show such tiny slices of cinema to any given user, and they have 40 million users, how vast did their set of “personalized genres” need to be to describe the entire Hollywood universe? This idle wonder turned to rabid fascination when I realized that I could capture each and every microgenre that Netflix’s algorithm has ever created. Through a combination of elbow grease and spam-level repetition, we discovered that Netflix possesses not several hundred genres, or even several thousand, but 76,897 unique ways to describe types of movies.” (Madrigal, 2014).
The conclusion is that Netflix has cataloged and described every movie and TV program, filling a unique and huge database that helps it generate, in combination with data on subscriber viewing behavior, highly personalized categories. Interviewed by the author of this deconstructive work, Todd Yellin explained how many people have worked on the project: a team specially trained to describe and evaluate content, adding every kind of metadata, including narrative elements and plot conclusiveness.
“They capture dozens of different movie attributes. They even rate the moral status of characters. When these tags are combined with millions of users viewing habits, they become Netflix’s competitive advantage. The company’s main goal as a business is to gain and retain subscribers. And the genres that it displays to people are a key part of that strategy. “Members connect with these [genre] rows so well that we measure an increase in member retention by placing the most tailored rows higher on the page instead of lower … The better Netflix shows that it knows you, the likelier you are to stick around. And now, they have a terrific advantage in their efforts to produce their own content: Netflix has created a database of American cinematic predilections. The data can’t tell them how to make a TV show, but it can tell them what they should be making. When they create a show like House of Cards, they aren’t guessing at what people want.” (Madrigal, 2014).
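The sheer size of that catalog, 76,897 genres, is less mysterious once you notice that the genre phrases are combinations of tags, so vocabulary sizes multiply. A sketch with small, invented vocabularies (Netflix’s real tag sets and phrase grammar are far richer):

```python
from itertools import product

# Illustrative, invented vocabularies; each slot of the genre
# phrase is drawn from one of them (empty string = slot unused).
regions = ["", "Foreign ", "British "]
adjectives = ["", "Emotional ", "Gritty ", "Cerebral "]
genres = ["Dramas", "Documentaries", "Thrillers"]
suffixes = ["", " Based on Real Life", " from the 1980s"]

# Every combination of slot values yields a distinct microgenre,
# e.g. "Foreign Gritty Thrillers from the 1980s".
microgenres = {
    f"{r}{a}{g}{s}".strip()
    for r, a, g, s in product(regions, adjectives, genres, suffixes)
}

# Even these tiny vocabularies multiply out: 3 * 4 * 3 * 3 = 108.
print(len(microgenres))  # → 108
```

With a few dozen entries per slot instead of three or four, the same multiplication easily reaches the tens of thousands Madrigal counted.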
Having programmatic access to such data, as it is regenerated through user interaction, means being able to easily extract useful information for any sort of purpose; Madrigal shows us some tables built from the data obtained in his reverse-engineering work. Meanwhile, a powerful process of interpenetration and adaptation between human and machinic systems is already in daily progress. Humans and machines try to resolve or manage these complex issues with means that add further complexity and, consequently, possibilities and results that are often not immediately understandable. Answering a question about an algorithmic result that seems to contradict common sense, Yellin admits that adding complexity also means navigating an indeterminateness which, however, could bear fruit in the form of new solutions; in his words, “sometimes we call it a bug and sometimes we call it a feature”. However, as we observed above about the accountability of algorithms, and thinking of all the mechanisms continuously entering the new boundaries between the real and the virtual world, it is important that all the co-participants be involved in judging the merit of these solutions, weighing costs and benefits in a broad and aware way.
Diakopoulos, N., 2013, “Rage Against the Algorithms”, The Atlantic, 3/10.
El-Zein, A., 2013, “As engineers, we must consider the ethical implications of our work”, The Guardian, 3/12.
Jobin, A., 2013, “Google’s autocompletion: algorithms, stereotypes and accountability”, sociostrategy.com.
Madrigal, A. C., 2014, “How Netflix Reverse Engineered Hollywood”, TheAtlantic.com, 2/1.
“Report: Bot traffic is up to 61.5% of all website traffic”, incapsula.com, 9/12/2013.
Wu, T., 2013, “Netflix’s war on mass culture”, New Republic.com, 3/12.