A spectre haunts the world’s intellectual elites: information overload. Ordinary people have hijacked strategic resources and are clogging up once carefully policed media channels. Before the Internet, the mandarin classes rested on the idea that they could separate “idle talk” from “knowledge”. With the rise of Internet search engines it is no longer possible to distinguish between patrician insights and plebeian gossip. The distinction between high and low, and their co-mingling on occasions of carnival, belongs to a bygone era and should no longer concern us. Nowadays an altogether new phenomenon is causing alarm: search engines rank according to popularity, not truth. Search is the way we now live. With the dramatic increase in accessible information, we have become hooked on retrieval tools. We look for telephone numbers, addresses, opening times, a person’s name, flight details, best deals, and in a frantic mood declare the ever-growing pile of grey matter “data trash”. Soon we will search and only get lost. Old hierarchies of communication have not only imploded; communication itself has assumed the status of cerebral assault. Not only has popular noise risen to unbearable levels, we can no longer stand yet another request from colleagues, and even a benign greeting from friends and family has acquired the status of a chore that carries the expectation of a reply. The educated class deplores the fact that chatter has entered the hitherto protected domain of science and philosophy, when instead it should be worrying about who is going to control the increasingly centralized computing grid.
What today’s administrators of noble simplicity and quiet grandeur cannot express, we should say for them: there is a growing discontent with Google and the way the Internet organizes information retrieval. The scientific establishment has lost control over one of its key research projects – the design and ownership of computer networks, now used by billions of people. How did so many people end up being that dependent on a single search engine? Why are we repeating the Microsoft saga once again? It seems boring to complain about a monopoly in the making when average Internet users have such a multitude of tools at their disposal to distribute power. One possible way to overcome this predicament would be to positively redefine Heidegger’s Gerede. Instead of a culture of complaint that dreams of an undisturbed offline life and radical measures to filter out the noise, it is time to openly confront the trivial forms of Dasein today found in blogs, text messages and computer games. Intellectuals should no longer portray Internet users as secondary amateurs, cut off from a primary and primordial relationship with the world. There is a greater issue at stake and it requires venturing into the politics of informatic life. It is time to address the emergence of a new type of corporation that is rapidly transcending the Internet: Google.
The World Wide Web, which should have realized the infinite library Borges described in his short story The Library of Babel (1941), is seen by many of its critics as nothing but a variation of Orwell’s Big Brother (1948). The ruler, in this case, has turned from an evil monster into a collection of cool youngsters whose corporate responsibility slogan is “Don’t be evil”. Guided by an older and more experienced generation of IT gurus (Eric Schmidt), Internet pioneers (Vint Cerf) and economists (Hal Varian), Google has expanded so fast, and in such a wide variety of fields, that there is virtually no critic, academic or business journalist who has been able to keep up with the scope and speed of Google’s development in recent years. New applications and services pile up like unwanted Christmas presents. Just add Google’s free email service Gmail, the video sharing platform YouTube, the social networking site Orkut, Google Maps and Google Earth, its main revenue service AdWords with its pay-per-click advertisements, and office applications such as Calendar, Talk and Docs. Google not only competes with Microsoft and Yahoo, but also with entertainment firms, public libraries (through its massive book scanning program) and even telecom firms. Believe it or not, the Google Phone is coming soon. A less geeky family member recently told me she had heard that Google was much better and easier to use than the Internet. It sounded cute, but she was right. Not only has Google become the better Internet, it is also taking over software tasks from your own computer so that you can access your data from any terminal or handheld device. Apple’s MacBook Air is a further indication of the migration of data to privately controlled storage bunkers. Security and privacy of information are rapidly becoming the new economy and technology of control. And the majority of users, and indeed companies, are happily abandoning the power to self-govern their informational resources.
The art of asking the right question
My interest in the concepts behind search engines was rekindled while reading a book of interviews with MIT professor and computer critic Joseph Weizenbaum, known for his 1966 automatic therapy program ELIZA and his 1976 book Computer Power and Human Reason. Weizenbaum died on 5 March 2008 at the age of 84. A few years ago Weizenbaum had moved from Boston back to Berlin, the city where he grew up before escaping with his parents from the Nazis in 1935. Especially interesting are Weizenbaum’s stories about his youth in Berlin, his exile to the USA and the way he became involved in computing during the 1950s. The book reads like a summary of Weizenbaum’s critique of computer science, namely that computers impose a mechanistic point of view on their users. What especially interested me is the way in which the “heretic” Weizenbaum shapes his arguments as an informed and respected insider – representing a position similar to the “net criticism” that Pit Schultz and I have been developing since we started the “nettime” project in 1995.
The title and subtitle of the book sound intriguing: “Where are they, the islands of reason in the cyber sea? Ways out of the programmed society”. Weizenbaum’s system of belief can be summarized as something like: “Not all aspects of reality are predictable”. Weizenbaum’s Internet critique is a general one. He avoids being specific, and we have to appreciate this. His Internet remarks are nothing new for those familiar with Weizenbaum’s oeuvre: the Internet is a great pile of junk, a mass medium that consists of up to 95 per cent nonsense, much like television, the direction in which the Web is inevitably developing. The so-called information revolution has flipped into a flood of disinformation. The reason for this is the absence of an editor or editorial principle. The book fails to address why this crucial media principle was not built in by the first generations of computer programmers, of which Weizenbaum was a prominent member. The answer probably lies in the computer’s initial employment as a calculator. Techno-determinists in Berlin’s Sophienstrasse and elsewhere insist that mathematical calculation remains the very essence of computing. The (mis)use of computers for media purposes was not foreseen by mathematicians, and today’s clumsy interfaces and information management should not be blamed on those who designed the first computers. The computer began as a war machine, and it will be a long and winding road to repurpose the digital calculator into a universal human device that serves our endlessly rich and diverse information and communication purposes.
On a number of occasions I have formulated a critique of the “media ecology” that intends to filter “useful” information for individual consumption. Hubert Dreyfus’s On the Internet (2001) is one of the key culprits. I do not believe that it is up to any professor, editor or coder to decide for us what is and what is not nonsense. This should be a distributed effort, embedded in a culture that facilitates and respects difference of opinion. We should praise this richness and make new search techniques part of our general culture. One way to go would be to further revolutionize search tools and increase the general level of media literacy. When we walk into a bookstore or library, our culture has taught us how to browse through the thousands of titles. Instead of complaining to the librarian or informing the owners that they carry too many books, we ask for assistance, or work it out ourselves. Weizenbaum would like us to distrust what we see on our screens, be it television or the Internet. But Weizenbaum fails to mention who is going to advise us what to trust, whether something is truthful or not, or how to prioritize the information we retrieve. In short, the role of the mediator is jettisoned in favour of cultivating general suspicion.
Let’s forget Weizenbaum’s info-anxiety. What makes the interview such an interesting read is its insistence on the art of asking the right question. Weizenbaum warns against an uncritical use of the word “information”. “The signals inside the computer are not information. They are not more than signals. There is only one way to turn signals into information, through interpretation”. For this we depend on the labour of the human brain. The problem of the Internet, according to Weizenbaum, is that it invites us to see it as a Delphic oracle. The Internet will provide the answer to all our questions and problems. But the Internet is not a vending machine in which you throw a coin and then get what you want. The key here is the acquisition of a proper education in order to formulate the right query; it is all about learning how to pose the right question, and for this one needs expertise. Higher standards of education are not attained by making it easier to publish. Weizenbaum: “The fact that anyone can put anything online does not mean a great deal. Randomly throwing something in achieves just as little as randomly fishing something out.” Communication alone will not lead to useful and sustainable knowledge.
Weizenbaum relates the uncontested belief in (search engine) queries to the rise of the “problem” discourse. Computers were introduced as “general problem solvers” and their purpose was to provide a solution for everything. People were invited to delegate their lives to the computer. “We have a problem”, argues Weizenbaum, “and the problem requires an answer”. But personal and social tensions cannot be resolved by declaring them a problem. What we need instead of Google and Wikipedia is the “capacity to scrutinize and think critically”. Weizenbaum explains this with reference to the difference between hearing and listening. A critical understanding requires that we first sit down and listen. Then we need to read, not just decipher, and learn to interpret and understand.
As you might expect, the so-called Web 3.0 is heralded as the technocratic answer to Weizenbaum’s criticism. Instead of Google’s algorithms based on keywords and an output based on ranking, soon we will be able to ask questions to the next generation of “natural language” search engines such as Powerset. However, we may assume that computational linguists will be cautious about acting as a “content police force” that decides what is and what is not crap on the Internet. The same goes for Semantic Web initiatives and similar artificial intelligence technologies. We are stuck in the age of web information retrieval. Whereas Google’s paradigm was one of link analysis and page rank, next generation search engines will become visual and start indexing the world’s images, this time not based on the tags that users have added but on the “quality” of the imagery itself. Welcome to the hierarchisation of the real. The next volumes of computer user manuals will introduce programmer geeks to aesthetic culture. Camera club enthusiasts turned coders will be the new agents of bad taste.
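For readers who want to see concretely how link analysis ranks by popularity rather than truth, here is a minimal sketch of a PageRank-style computation. This is an illustration of the published PageRank idea only, not Google’s production system; the function name and the toy web are my own inventions for the example.

```python
# Illustrative PageRank-style link analysis (not Google's actual code).
# A page's score approximates the probability that a "random surfer"
# lands on it: with probability `damping` they follow a link, otherwise
# they jump to a random page. Content never enters the calculation --
# only the link structure, i.e. popularity, not truth.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with a uniform score
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}  # teleport share
        for page, outlinks in links.items():
            if not outlinks:                    # dangling page: spread evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:                               # pass rank along each outlink
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# A toy web: "hub" is linked to by everyone, so it ranks highest
# regardless of what it actually says.
web = {
    "hub": ["a"],
    "a":   ["hub"],
    "b":   ["hub"],
    "c":   ["hub", "a"],
}
scores = pagerank(web)
```

Note that the heavily linked “hub” page wins purely on incoming links; a truthful but unlinked page can never outrank it, which is precisely the complaint the essay opens with.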
Ever since the rise of search engines in the 1990s we have been living in the “society of the query”, which, as Weizenbaum indicates, is not far removed from the “society of the spectacle”. Written in the late 1960s, Guy Debord’s situationist analysis was based on the rise of the film, television and advertisement industries. The main difference today is that we are explicitly requested to interact. We are no longer addressed as an anonymous mass of passive consumers but instead are “distributed actors” who are present on a multitude of channels. Debord’s critique of commodification is no longer revolutionary. The pleasure of consumerism is so widespread that it has reached the status of a universal human right. We all love the commodity fetish, the brands, and indulge in the glamour that the global celebrity class performs on our behalf. There is no social movement or cultural practice, however radical, that can escape the commodity logic. No strategy has been devised to live in the age of the post-spectacle. Concerns have instead focused on privacy, or what’s left of it. The capacity of capitalism to absorb its adversaries is such that, unless all private telephone conversations and Internet traffic were to become publicly available, it is next to impossible to argue why we still need criticism – in this case of the Internet. Even then, critique would resemble “shareholder democracy” in action. The sensitive issue of privacy would indeed become the catalyst for a wider consciousness about corporate interests, but its participants would be carefully segregated: entry to the shareholding masses is restricted to the middle classes and above. This only amplifies the need for a lively and diverse public domain in which neither state surveillance nor market interests have a vital say.
Stop searching, start questioning
In 2005 the president of the French Bibliothèque nationale, Jean-Noël Jeanneney, published a booklet in which he warned against Google’s claim to “organize the world’s information”. Google and the Myth of Universal Knowledge remains one of the few documents that openly challenge Google’s uncontested hegemony. Jeanneney targets only one specific project, Book Search, in which millions of books from American university libraries are being scanned. His argument is a very French-European one. Because of the unsystematic and unedited manner in which Google selects the books, the archive will not properly represent the giants of national literature such as Hugo, Cervantes and Goethe. Google, with its bias towards English sources, will therefore not be the appropriate partner to build a public archive of the world’s cultural heritage. “The choice of the books to be digitized will be impregnated by the Anglo-Saxon atmosphere”, writes Jeanneney.
While in itself a legitimate argument, the problem is that Google is not interested in creating and administering an online archive in the first place. Google suffers from data obesity and is indifferent to calls for careful preservation. It would be naive to demand cultural awareness. The prime objective of this cynical enterprise is to monitor user behaviour in order to sell traffic data and profiles to interested third parties. Google is not after the ownership of Émile Zola; its intention is to lure the Proust lover away from the archive. Whereas for the French, Balzac’s collected works are the epitome of French language and culture, for Google they are abstract data junk, a raw resource whose sole purpose is to make profit. It remains an open question whether the proposed European answer to Google, the multimedia search engine Quaero, will ever become operational, let alone embody Jeanneney’s values. By the time of Quaero’s launch, the search engine market will be a generation ahead in media and device capabilities; some argue that Jacques Chirac was more interested in maintaining French pride than in the global advancement of the Internet.
It is no great surprise that Google’s fiercest critics are North Americans. So far, Europe has invested surprisingly little of its resources into the conceptual understanding and mapping of new media culture. At best, the EU is the first adopter of technical standards and products from elsewhere. But what counts in new media research is conceptual supremacy. Technology research alone will not do the job, no matter how much money the EU invests in future Internet research. As long as the gap between new media culture and major governing, private and cultural institutions is reproduced, a thriving technological culture will not be established. In short, we should stop seeing opera and the other beaux arts as compensation for the unbearable lightness of cyberspace. Besides imagination, collective will and a good dose of creativity, Europeans could mobilize their unique capacity to grumble into a productive form of negativity. The collective passion for reflection and critique could be used to overcome the outsider syndrome many feel in their assigned role as mere users and consumers.
Jaron Lanier wrote in his Weizenbaum obituary: “We wouldn’t let a student become a professional medical researcher without learning about double blind experiments, control groups, placebos and the replication of results. Why is computer science given a unique pass that allows us to be soft on ourselves? Every computer science student should be trained in Weizenbaumian scepticism, and should try to pass that precious discipline along to the users of our inventions”. We have to ask ourselves: why are the best and most radical Internet critics US Americans? We can no longer use the argument that they are better informed. My two examples, working in Weizenbaum’s footsteps, are Nicholas Carr and Siva Vaidhyanathan. Carr comes from the industry (Harvard Business Review) and is the perfect insider critic. His recent book, The Big Switch, describes Google’s strategy to centralize, and thus control, the Internet infrastructure through its data centres. Computers are becoming smaller, cheaper and faster. This economy of scale makes it possible to outsource storage and applications at little or no cost. Businesses are switching from in-house IT departments to network services. There is an ironic twist here. Generations of hip IT gurus cracked jokes about the IBM head Thomas Watson’s prediction that the world only needed five computers – yet this is exactly the trend. Instead of further decentralizing, Internet use is concentrating in a few, extremely energy-demanding data centres. Carr ignores the greed of the dotcom-turned-Web 2.0 class and instead specializes in amoral observations of technology. Siva Vaidhyanathan’s project, The Googlization of Everything, aims to synthesize critical Google research into a book due to come out in 2009. In the meantime, he collects the raw material on one of his blogs.
For the time being we will remain obsessed with the diminishing quality of the answers to our queries – and not with the underlying problem, namely the poor quality of our education and the diminishing ability to think in a critical way. I am curious whether future generations will embody – or shall we say design – Weizenbaum’s “islands of reason”. What is necessary is a reappropriation of time. At the moment there is simply not enough of it to stroll around like a flâneur. All information, any object or experience has to be instantaneously at hand. Our techno-cultural default is one of temporal intolerance. Our machines register software redundancy with increasing impatience, demanding that we install the update. And we are all too willing to oblige, mobilized by the fear of slower performance. Usability experts measure the fractions of a second in which we decide whether the information on the screen is what we are looking for. If we’re dissatisfied, we click further. Serendipity requires a lot of time. We might praise randomness, but we hardly practice this virtue ourselves. If we can no longer stumble onto islands of reason through our inquiries, we may as well build them ourselves. With Lev Manovich and other colleagues I argue that we need to invent new ways to interact with information, new ways to represent it, and new ways to make sense of it. How are artists, designers and architects responding to these challenges? Stop searching. Start questioning. Rather than trying to defend ourselves against “information glut”, we can approach this situation creatively, as the opportunity to invent new forms appropriate for our information-rich world.
Thanks to Ned Rossiter for his editorial assistance and ideas.