Meditations on the Blip: a review

Lisette Gonzales
Behind the Blip: Essays on the Culture of Software
Matthew Fuller
Autonomedia, 2004. 176pp.

Lisette Gonzales reviews a book of essays by Matthew Fuller that examines the way we are programmed by software.

Matthew Fuller’s Behind the Blip: Essays on the Culture of Software begins with the proposition that seemingly neutral software applications carry cultural values that make them inherently political tools. What appears neutral is in fact inflected by software companies that want users to remain passive and to interact in a completely prescribed manner. Fuller finds such a system significantly flawed: it forecloses the possibility of association based on art or communality and instead encourages one in which workers associate on the basis of the forces of production. He is therefore drawn to what exists on the periphery of the Internet and to those who operate at the edges of corporate viability and artistic integrity. His book relentlessly interrogates corporate power over software in an effort to undermine it, to examine the unreflective beliefs we hold about computers, and to anchor our political action in awareness. Industries configure traditional software toward the ends of commerce, and its very form therefore manipulates the user, transforming him into an unknowing consumer - the consequences of which are not inconsiderable. Throughout the book, Fuller voices concern over, and contempt for, the corporatization of Internet development: he sees a shift from open, collaborative software development (“to build a system that works”) to closed industry consortia whose corporate members design the global information infrastructure to serve their own competitive advantage rather than the public good.

Fuller begins with two questions: how do we unleash “the unexpected,” and what emerging currents incorporate new ways of thinking about software? By “the unexpected,” Fuller means that which does not conform to the constraints of software companies. These two questions run through his seven essays, which deal both with new software interventions and with new ways of thinking about established software programs. Fuller argues that technology is political, and that we as users should not think of it as neutral; instead we should look at how things get done and how software gets written, what it sets up and what it allows. The problem, of course, is getting people to think in those terms, particularly when they are well paid by a corporation that wants them to do what is in its own interest. Such institutions demand efficiency according to a business model that disciplines users to that goal. So Fuller seeks ways of voicing dissent while using corporate-sponsored tools, tools whose very makeup increasingly embodies those power structures. Has the Internet become a portal that holds hardly any thoughts of its own? Does control of the medium determine whether the Internet helps or harms its users? How does art bear on this issue? Fuller essentially asks us, as both his readers and as software users, whether we can resist submission and thereby redistribute power on the Web.

Fuller classifies three different ways of studying software: the human-computer interface, programmers’ self-accounts, and critical theory. The first involves the interaction between the human and the machine. Borrowing Brenda Laurel’s definition, Fuller describes the interface as “‘a contact surface. It reflects the physical properties of the interactors, the functions to be performed, and the balance of power and control’” (99). The prevailing way of understanding this interaction is to interpret the user as an autonomous individual who interacts with the computer. However, as Fuller rightly points out, the computer has more power than one might anticipate: it produces the persona of the user through interaction. The implications of this conception are easy to grasp - those who produce software control the users we become.

Fuller continues by arguing that true interface is impossible. Rather than allowing for the freedom of perception and language and the non-determinate processes of chance, the interface is largely a one-way relationship of power and restraint. For the human-computer interface to make sense, programmers employ metaphorical devices that help the user cognitively map the functional capacity of the device (100). In other words, metaphors take a known set of properties from one domain and transfer them to the computer as a structuring device. Fuller is interested in this mechanism because the metaphors employed again restrain the potential of the Web, making possible only a limited number of ways of interacting with its informational patterns.

To examine the use of interface as a disciplinary tool, Fuller uses an example of a prison computer system that involves a group of prison guards interacting with inmates through a digital interface. Fuller writes,

Whilst the system they instantiate is fundamentally hierarchical, they also operate by means of networks of mutually reinforcing patterns, ideology, structure, and material. What the notion of interface allows us to do here is analyse how they link, how one process passes from the domain of one axiom into another, how processes are reconfigured, stripped down, simplified or made amorphous from their passage from one medial, architectural, racial, juridical regime or another. (104)

The interface acts as a separation that allows the guards to practice a level of brutality that would be unlikely if they had actual physical contact with these prisoners. It illustrates the way a computer interface can function not only as a representational mechanism but also as a modulator of behavior.

The second area of inquiry for Fuller is the accounts produced by programmers - in particular those written by open-source programmers, accounts that chronicle the subversion of proprietary software companies. Proprietary software keeps its code secret and its operating protocols neither public nor standardized; open-source software emerged largely as a reaction against this proprietary standard. By writing narratives, programmers assemble a community and invest their products with cultural, social, and political significance. Open-source narratives portray programmers as members of an alliance opposed to the manipulation of consumer preferences by the large computer industries.

Along with the issue of corporate control, Fuller finds the format of the Internet itself problematic. He sees the potential of the network as locked down to imitations of paper-based media - the use of the ‘page’ to denote a website, for example. Search engines are similarly constrained, organized around predictable and finite situations in which search terms cohere. They are not attentive to marginal uses of information but rather recall information for proprietary purposes in a way that is conceptually normative. Directories classify information with a statistically idealized user in mind, and those who produce them seem unconcerned with what this method leaves out. Fuller points out that one way to problematize search engines is to organize matches by absolutely determined irrelevance (83). Like search engines, forms also limit the representation of the user and rely upon stability. Forms cancel the public/private distinction by having the user enter private information into a public domain, producing a portrait of the user as his answers become a representation of him (74). Fuller rightly indicates that these questions provide an incomplete portrait, one that reproduces the user as a database of answers; the person you are in terms of information becomes more real than anything in the world of physical matter.

Fuller examines three types of software that represent useful interventions into software studies: critical software, social software, and speculative software. The first, critical software, is software that investigates software. Fuller describes two modes by which it operates. The first examines the evidence of normalized software in order to disclose how the process of normalization becomes manifest: critical software works “by using the evidence presented by normalized software to construct an arrangement of the objects, protocols, statements, dynamics, and sequences of interaction that allow its conditions of truth to become manifest” (23). Critical software, then, engages the user in analysis of proprietary systems. The second mode runs existing software in altered form in order to reveal its mechanisms, most often through games. By “defacing” proprietary software, critical software makes the processes of normalization apparent.

Social software - the software whose production is written about by programmers - reveals how different software could be if controlled by the general public. By producing a free alternative to expensive software, it allows for a radical disaffiliation from proprietary corporations. Although each programmer works individually, the venture becomes collaborative as the isolation and fragmentation of individual programmers is overcome by the virtual community of the Web. Social software refers both to software built by those locked out of mainstream development and to software born of social interaction - as opposed to proprietary software, which means, fundamentally, that the public does not control what it does. Social software allows for the creation of an internal culture, one whose participants make public the labor and code that proprietary software represses.

Fuller describes the third type, speculative software, as software that explores the possibilities of programming. It takes blips - events in software - and reinterprets them, allowing users to investigate the potential inherent in programming. Fuller understands blips as residue that accumulates through the restraints of proprietary software. With such restraints in place, proprietary software defines a limited range of motion that the user comes to see as the only possible relation to a computer. Speculative software, for Fuller, provides a space in which the user can reevaluate the possibilities of software and bring critical analysis to traditional programs by challenging the institutions that produce them.

Fuller continues with a discussion of the processual work of American artist Gordon Matta-Clark. Rather than offering whole products, Matta-Clark offers residues, fragments, and partial constructions - architectural scraps that become places of defiance in their very refusal of utility - and in doing so questions the very idea of work. Fuller sees a parallel between this practice and the potential of software: instead of offering possibilities for speed in solving problems, Matta-Clark’s conception of work embraces the leftover scraps that reside outside the program. Fuller grounds the comparison in a parallel between computers and architecture, both instantiations of abstract logic. Matta-Clark’s work relies on understanding art in terms of interconnectedness rather than autonomy; it investigates the material qualities of structures, privileging the malfunctions, voids, and shadows of urban space. Like Matta-Clark, Fuller investigates and identifies faults that are otherwise ignored, faults that lack any perceivable order, rule, reason, or design.

Programmers design user interfaces with the intent of enabling easy operation of a system. Fuller next investigates a system that rejects the traditional purposes of web design in order to explore the technical and aesthetic potential of the computer interface. The Web Stalker takes streams of data and reinterprets them to produce an artistic and political project. It is intended to put the user in control of the Web - although the network it presents does not look like the WWW we know. The Web Stalker is based on the belief that users should be able to define the various functions they want to apply to a Web document, rather than being launched into a finished website. As Fuller writes,

The Web Stalker performs an inextricably technical, aesthetic, and ethical operation on the HTML stream that at once refines it, produces new methods of use, ignores much of data linked to or embedded within it, and provides a mechanism through which the deeper structure of the Web can be explored and used. (59)

The user opens up the Web Stalker as a blank screen and then builds windows to perform different functions: a crawler parses Web documents, and a map function creates a local dynamic map that uses circles and lines to represent URLs and links. The extractor grabs the text out of the particular document selected, and the dismantle window lists the components of a page. It attains its artistic goals by rereading the HTML stream of data, invoking new ways of registering the data and consequently threatening the determinable relationship on which commercial enterprises rely.

Fuller then turns his attention to Natural Selection, a search engine that chooses new criteria by which to retrieve information. Typing a search string into Natural Selection either gives you access to the Net as usual or, should you enter one of several thousand keywords, drops you - via an apparently transparent search result - into its double. A traditional search engine retrieves information through procedures inflected by proprietary demands; such demands make the information it gathers predictable and refinable, producing a system that only returns decontextualized information. Fuller explains, “The search engine is absolutely unable to treat a word or any collection of symbols entered into it in a contextualized manner…the core of what it has to act upon is the string of characters that it has just been requested to find matches or correspondences for” (71). Rather than developing a conflictual sensorium, search engines classify sites through a hierarchy of terms disengaged from direct social or linguistic involvement. By adding one bit of diversity, Natural Selection makes apparent the uncontextualized and grounding incompleteness of regular search engines.

What the Natural Selection database allows the user to do is sense the implicit politics of the way information is presented on the Web. Natural Selection is produced by Mongrel, a group of artists interested in race, technology, and intelligence, and Fuller draws a parallel between Mongrel and the Situationists. The presumption behind the Situationist art movement is that under capitalism the creativity of most people is stifled, as society divides into strict, impermeable units of agents and spectators, producers and consumers. Influenced by Dada, Surrealism, and Lettrism, the Situationists tried to comprehend the new forms of state control and social disintegration, and opposed such control through the mobilization of passion in acts of dissent. They therefore wanted a different kind of revolution: they wanted the imagination, not a group of men, to seize power, and poetry and art to be made by all. Mongrel, like the Situationists, is interested in the construction of situations as the concrete construction of momentary ambiences of life and their transformation into a superior passional quality. Such an art movement helps us think about software outside the totalitarian constraints of the software oligopolies.

Fuller then addresses the problem - which he calls the ‘contemporary scourge’ - of information overload, a problem that produces an excess of useless information rather than a qualitative improvement in the acquisition of useful information. Faced with this surplus, corporations try to control and influence what people see, and thereby what they purchase. Along with an overload of useless information, users are given a databased version of themselves that corresponds to no reality but rather conforms to the needs of commerce and controllable systems of category. As Fuller explains, users are “acted upon by the labeling, classification, positioning, and fixing routines of databases” (125). Humans are thus reduced to whatever the databases decide is relevant. The ‘data body’ produced through information networks inevitably becomes more important than the flesh-and-blood user, reducing the user to information whose very organization produces them. Humanity becomes something to be overcome, an inconsequential vessel from which information is extracted and reorganized. Although many see the computer as having in a sense conquered the disorder of the natural world, what it actually does is dissolve the user into bits of information whose construction stands not as an examination of the person but as an illustration of power.

Another interface that demonstrates disciplinary tactics is Microsoft Word. Word operates on the metaphor of the office - hence every piece of writing, from literature to data entry, takes place within the confines of a workspace, and documents produced in Word are positioned by the culture of doing business. Fuller begins by examining the way the ‘environment’ of Word affects and effects the piece of writing being produced. Word also provides a seemingly infinite number of tools for the writer, since Microsoft must project what tools will be needed in new working environments. In essence, then, the user becomes absorbed into the apparatus of a program that has seemingly thought of every possible use in order to cater to every possible user. Fuller also interrogates other features of the program, including the help feature, the dictionary, the grammar check, and the templates. Instead of representing real options for the user, these features represent the way we have become mindless consumers of extraneous options. The help feature - which would be all but unnecessary were it not for the excessive number of tools and icons - only functions if the user can already identify the function with which he needs help, while the templates acknowledge that forgery is the basis of documents produced in the modern office. Fuller sees these tools as imposing a standard that organizes writing into a utilitarian and governed domicile.

So how does Fuller understand our ability to eliminate the core of a whole system of authority and domination? Fuller concludes his book by examining the possibility of producing software that is not overwhelmed by the ‘material-semiotic infrastructure of business’ but instead aids and encourages experimental and autonomous work, work that allows for the monstrousness of the Web without ordering it into utility. Allowing for this type of software would then transform the Internet into a real tool for communication, where people can exchange information and ideas, where non-commercial speech is still respected and valued. Fuller asks that the Internet not operate as a portal through which the user disappears; instead he wants the Internet to ramble in a vigilant cacophony, one fostered by software that does not solely serve the purposes of corporate power. Throughout his book, Fuller is addressing an indoctrination system based on private control over resources.

In order to rebel against our total absorption into an overarching private sphere, we must support some sort of public sphere on the Internet. We face this fact: there is no structure of popular institutions around which we can work. We therefore must appropriate private structures and make them our own. Behind the Blip illustrates the possibility of creating alternatives, while also exposing the implications of using software with strict commitments to power. Corporate control of software production is part of a larger effort to remove the public from making decisions over their own fate, to limit the public arena, to control opinion, to make sure that the fundamental decisions that determine how the world is going to be run are not in the hands of the public, but rather in the hands of highly concentrated private power. Instead of being a force for diverting the population from engaging politically in the world, the Internet should be a force for democracy. If the initiative is largely public, it will reflect the public interest - an interest that can be achieved when institutions are continually renegotiated and reshaped through free individual interaction and choice.