HYPER-LEX: A Technographical Dictionary

1996-12-30

Paul Harris hybridizes the terms of hypertextual discourse and takes them to a higher power.

The spirit or at least pervasive desire of our age revolves around a sort of transparency: a desire to project ourselves as a surface of permeable traces, to exfoliate, let the inside become the outside, to become fully visible like the meat and bones of a Cronenberg character, while remaining invisible like the little hacker ghost (Turing’s Demon?) that tracks text in the Random Access Memory banks of the machine onto whose screen we splash words. In large part, the attractive force that transparency exerts is an effect of media culture; simultaneously, however, transparency marks a limit of im-mediacy - an unmediated, collapsed sensation where we can see the neurophysiology of our brains or the shapes of and linkages among our words. This is an immediacy of the sensory that never shades into the tactile - it is rather the immediacy of sensing the medium itself, of clicking tracks around the computer screen or dredging up hidden treasures on the Netscape of our lives.

This transparency is embodied - enacted in a disembodying way - most clearly in the VR-user. Jacked in to an interface that both joins and separates mind and body, the VR-user gets to enter a proprioceptive universe both contingent on their movements and exhilaratingly alien. The body takes on a prosthetic virtual life of its own, a life then seen by a disembodied spectator’s viewpoint; and from this situated spot, the projected body image reveals the visual inside of the embodiedness of living biological infrastructure. From the outside, the VR-user lives an uncanny relation to the body-image; it becomes a double, but with the further twist that this doubling relation itself finds objectified form. The VR-user doesn’t so much experience the body as other as the very process of othering. Wim Wenders gives an imaginative twist to this mode of transparency in Until the End of the World, where a technology that records the neurophysiological impression of seeing, used by one character to let his blind mother see her children and the world, ends up being utilized as a way to record and then watch one’s own dreams. Characters quickly become walking zombies, narcissistically ensnared in the mirror of their dreams, addicted to the content of the unconscious encoded as visual information.

In order to develop a critical ecology of the culture of transparency, we may turn to Gregory Bateson’s writings. At the end of his life, Bateson posed a question whose answer is now beginning to take its amorphous shape: “Onto what surface shall a theory of aesthetics and consciousness be mapped?” The answer on the cutting edge, the leading surface at this juncture, would appear to be the network. While the key terms of Bateson’s question may seem nearly nostalgic (aesthetics as nostalgia for high art, consciousness as nostalgia for “presence”), there is a distinctly aesthetic pleasure apparent in the joy that network users find in their work. The aesthetics of the network are crystallized in the features of hypertext - the network as a set of links among lexia, or textual units. The network ultimately provides an image of consciousness, because consciousness is now perceived as essentially digital in nature, as a flow of discontinuous signals that result in daunting numbers. But “network” is a promiscuous and ubiquitous term, serving many functions in describing our modes of conduct and perception of the world: network serves as a structural design principle, modus operandi, technological environment and constraint, as a textual space and psychological model all in one. We think of social relations as networks, as well as television corporations and business ties; other more mundane, literal coinages also persist in daily parlance.

But once transparency appears on the screen or the net, once transparency is projected onto a network, it becomes curiously opaque. If we become transparent to ourselves in some warped Baudrillardian sense, then it quickly becomes apparent that we are nodes in the network, that the network as such will remain an unknowable system - an invisible territory, maps of which we continually redraw by surfing the net, but a territory which will remain several dimensions beyond our ken. The opacity of the network persists in more visceral ways than its merely implicit invisibility - it comes home to us when we experience the alternating ecstasy and frustration of reading hypertexts, writing with new softwares, or exploring the web. The limitations of speed, the crashing of lines or programs, or just the cutting edge that a skipping CD sears through the eardrums all point to an irreducible bluntness, a resistance of the medium, that we like to overlook when our jack-in glasses fit snugly or when we theorize transparency in sweeping terms. Technophiles will be quick to point out, though, that these sorts of ups and downs are not in the network or tool at hand, but are a function of how we experience networks. This mode of experience is often described with metaphors that attempt to identify network-surfing with writing or graphic practice, but it might better be called the scene of clicking. And users could be thought of as bit-players, in that they are playing at manipulating bits of information, feeling free, but are in fact tiny nodes whose choices are prefabricated. The bit-player, in essence a signal skimming along over the surface of the net or the screen-text, is impatient with impediments. The bit-player prefers transcription to translation, wants transactions to be engineered by software with good genes, and likes well-engineered round, red tomatoes devoid of pocks and orange spots.

The topography of the network remains rather flat though, as if it were a set of switches and junctures along a homogenized, discrete track - the information superhighway seen by clicking train. Missing in this map of the digital are the peaks and valleys, the external image of our ups and downs, the sort of graphics found in representations of digital soundscapes or three-dimensional maps of distributions of things. If we think of the imaginative space of the network as a map onto which aesthetics and consciousness are projected, it would appear that language fans out onto flat surfaces; it is deployed in chains of metonymous units, and becomes a sequence of transcriptions that the user juxtaposes and skims, if not simply skips.

We witness the text receding into the medium that houses it, a new rendition of the well-worn medium=message riff. In a hypertext, for instance, the links are the structural logic of the virtual whole text; but the “links” do not link together in any way - they simply mark a transparent passage, an edge of difference that one passes through frictionlessly. Ostensibly, the “link” could point to a relation between two lexia, but the relation usually is either a literal, referential one (i.e., for more info on x, link to y), or a jump from one character-situation to a different scene, and the relation must be reconstructed after the fact. This operation, while it supposedly ensures the text’s flexibility in the hands of a commanding reader, actually lays out all the textual loops for the bit-player to click through in advance.

To fabricate a critical ecology in the context of hypertext writing, one seeks to maintain a certain duplicitous relation to the medium. On the one hand, one manufactures the critical ecology according to some of the rules of the hypertext game. (It is especially necessary to simulate the medium when the text is appearing on the Net, of course.) On the other hand, the critical ecology comes designed with an infrastructure that enables it to play itself out of some of the hypertext game’s constrictions. The basic idea is to create a “technographical dictionary,” to begin generating a working vocabulary for the culture of transparency, in a format that would allow for flexible usage and continual additions. Rather than unfolding in a literal hypertext format, the dictionary lays out an initial set of lexia. Then, rather than create links that transport from one term to the other, the lexia become a combinatoric device that creates hybrid terms. The hybrids must then be given definitions, so that the dictionary becomes a means to make new terms and concepts. As a result, the hybrids need longer entries than the definitions of initial terms - the entries on hybrids occasion short takes on aspects of transparency culture.

The design principle behind moving from a lexicon of discrete words to hybrid terms is quite basic: hypertext and most modes of experience in the scene of clicking are structured according to metonymic chains, as sequences of screens whose order can be changed. The combining of lexical entries into new terms is meant to mimic in a crude way the different sort of transport accomplished by metaphor - the carrying across between terms naming each other that constitutes the life of the metaphor, the between-terms that generates an emergent semantic richness. I try to inject a hypertext screenscape of discrete lexia with the permutational dynamics of combinatoric gaming - something the creative writers of hypertext seek out, of course, here done in a more literal manner.
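The combinatoric device described above can be taken at its most literal in a short sketch. This is my own illustration, not part of the dictionary's design: the pairing rule (simple ordered juxtaposition of lexicon terms) is an assumption, and the essay's actual Level II hybrids are a curated handful that also inflect terms (“technography” becomes “technographical”), which the sketch does not model.

```python
from itertools import permutations

# Level I lexicon, as listed in the dictionary below
lexicon = ["book", "brain", "constraint", "narrative", "technography", "virtual"]

def hybridize(terms):
    """Generate every ordered two-term hybrid from the lexicon.

    The dictionary itself keeps only a hand-selected few ("virtual brain",
    "technographical brain", "technographical book"); this shows the full
    combinatoric space those selections are drawn from.
    """
    return [f"{a} {b}" for a, b in permutations(terms, 2)]

hybrids = hybridize(lexicon)
print(len(hybrids))              # 6 terms in ordered pairs: 6 * 5 = 30 candidates
print("virtual brain" in hybrids)  # True
```

Each candidate hybrid would then demand a definition, which is where the combinatoric game hands off to the essayist.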

What’s the point of such an exercise? The point is that on the one hand the sheer speed and immediate consumption with which we confront words now must be integrated into how we write. Academic writing should adapt to the fact that we all are glutted with stuff to read all the time, and have become consumers of articles and ideas - we skim and borrow, seeking to profit with the least work, more like the students we complain about than we care to admit. But on the other hand, a sleeker writing should also try to slow up or congeal; the words should merge into one another to take on texture, to be dipped in analog feeling for a moment as they pass through digital scenes. I don’t mean here that words should take on texture in the way that hypertext critics think this occurs, which amounts to stimulating the “look at rather than through” response. Rather than utilizing opacity as a source for significance, the basic gesture of combining lexical units into hybrid terms seeks to simulate bottom-up emergence, simple components combining to generate a different level of significance. The lexicon does not organize into a network, but into a hierarchical distribution along which meanings slide as potential relations. The network trope blots out the sense of a hierarchical architecture that is so crucial to all living systems. The Hyper-Lex design simulates a critical ecology where emergent terms operate at a “higher” level of complexity, being more than the sum of their parts:

****************************

LEVEL I: LEXICON
book
brain
constraint
narrative
technography
virtual

****************************

LEXICON

BOOK

1. A technology for the storage and dissemination of words ostensibly to be read in consecutive sequence left to right, front to back; comes encased in cloth covers with a title and a proper name and some indication of its place of issue.

2. An object fashioned of paper and printed ink that generates a group of variable habitual behaviors by its users, including lying down in bed, going to a particular chair or couch, consuming a chosen food or drink, smoking or non. In addition, different species of the object gather around it constellations of certain speech patterns, such as the contested variations of terms from opaque jargons or enthusiastic exchanges of opinion about homologous objects.

3. An object with a peculiar life and circulation, often sought with zeal or owned with pride, that traverses and occupies spaces unique to its milieu, such as dusty garage corners, close-fit wooden shelves, car floors, cardboard boxes. Has difficulty finding its way to a final end.

BRAIN

Once accepted to be a physiological organ composed of 100 billion nerve cells, “wrinkled and grooved like an oversized walnut” (Tormont Webster 216), now a coveted imaginary object manufactured in contemporary myths emanating from several leading producers, including social and natural sciences and the humanities. A verbal attractor, a discursive site where we find projected the organizing principles of the natural world, the domain of information technologies, and even fictional texts. Definitive status uncertain - perhaps a viral growth that is entraining and entwining the discursive softwares of a culture, perhaps only a three-pound grey area where the world is being injected inside itself.

CONSTRAINT

A form of invisible architecture, providing the undergirding for the functioning of a system or ecology. This architecture works as a semi-permeable membrane, a virtual boundary; it provides the parameters that define a “context.” Constraint marks the passage from a system (e.g., language as a system of differential signs) to an environment, organism or instantiation (e.g., an act of writing with language circumscribed by constraints). Easily conflated with rules, constraints do not dictate what must occur or what components are or are not allowed at a given time. They rather quantify the “degrees of semiotic freedom” (Wilden) in a system; they could be thought of as the inherent limitations of any Umwelt, any evolutionary level of organization - the world of a rock is more constrained than that of a cat, and presumably we have more semiotic freedom than cats. In the contexts of communication systems, constraints generally reassert the materiality of any given medium - an important stipulation at a time when the “dematerialization” of media is often understood in a literal fashion.

Constraints are paradoxical-seeming in that on the one hand they draw a set of boundaries, while on the other they only impose stipulations, they mark out a potential domain within which any number of permutations or results can occur. Literature written under certain formal constraints remains a “potential literature” because every text that satisfies the constraints is only one possible solution or configuration. Calvino’s formal algorithm for city-types in Invisible Cities may be diagrammed in such a way that its invisible architecture implicitly enjoins a continuation of the text out into other invisible cities, presumably the ones we begin to imagine and inhabit from reading his book. Each chapter of Georges Perec’s Life A User’s Manual configures a puzzle whose pieces include no less than 42 daunting constraints - but even as we delight in his solutions, we can go back and write different stories that would fulfill the chapter’s place in the constraint structure, that would design different rooms in the building of the book, that would fit into its invisible architecture. As is most clear from Raymond Queneau’s Cent Mille Milliards de Poèmes, the products of constraint-guided potential literature are themselves machines for making stories.
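Queneau’s machine invites a quick arithmetic check: ten base sonnets of fourteen interchangeable lines each yield ten to the fourteenth power distinct poems - the hundred thousand billion of the title. A minimal sketch of the constraint structure follows; the line texts are placeholders of my own, not Queneau’s verses.

```python
import random

SONNETS = 10   # ten base sonnets
LINES = 14     # fourteen lines per sonnet, each interchangeable across sonnets

# Every line position can be filled from any of the ten sonnets,
# so the space of possible poems is 10 ** 14.
total_poems = SONNETS ** LINES
print(total_poems)  # 100000000000000

# Placeholder corpus: line j of sonnet i (Queneau's actual verses not included)
corpus = [[f"sonnet {i}, line {j}" for j in range(LINES)] for i in range(SONNETS)]

def compose(rng=random):
    """Assemble one poem by drawing each line position from a random sonnet."""
    return [corpus[rng.randrange(SONNETS)][j] for j in range(LINES)]

poem = compose()
print(len(poem))  # 14
```

The constraint (fourteen positions, rhyme-compatible lines) does not dictate any single poem; it marks out the potential domain within which each reading actualizes one.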

[Harris rediscovers a senior American member of Oulipo in Harry Mathews’s Al Gore Rhythms.]

NARRATIVE

A word pertaining to the disposition of words, difficult to pin to a definition because of the subtle changes in the inflection of its usage. Beginning as a noun denoting a story or description of events, narrative has unfolded along a trajectory toward epistemology - narrative connotes a privileged cognitive capability or act, the capacity to organize and shape information and ideas into a coherent arrangement; it may be the prized mode of knowledge in our time.

As both a form of fiction and mode of knowledge, narrative has also become the technology for fashioning the self. In a philosophical mood beyond the subject, narrative may be taken as the locus of the play of the self - in post-colonial contexts, narrative enables the self to act out its ex-propriation and find an ex-centric place in the world. In a political and moral world bereft of fixed value systems, narrative assumes an association with rhetoric and therefore power; in a universe redrawing its distinctions between natural and artificial life forms, narrative remains a distinguishing feature of the human in a post-human setting.

TECHNOGRAPHY

1. The analysis of writing in terms of the technology of inscription. The study of writing as a material phenomenon, how its relations to different tools or media bear on its form, content, and history; moving toward greater emphasis on problems of graphic design. Represents part of a larger attempt to write an account of writing, which means to separate the act of writing (to write) from the material forms it takes (account of writing), even as it proclaims the indissolubility of the two.

2. A tool or means to write with, a technology for graphic inscription.

[Harris explores IN.S.OMNIA’s technographies in Sleepless in Seattle.]

VIRTUAL

1. The geography of the invisible, a spatial landscape composed in non-dimensional electronic environments that takes shape both physically and imaginatively as the realm of cyberspace. A topographical projection on screen, generated in a space within the computer where terms like distance and velocity no longer retain their sense, where the “space between” is a matter of processing time. This topographical projection becomes a digital landscape dominated by the visualization of information. The visual forms range from crude mimetic representations (interface metaphors designed to transform the desk top into the desktop) to abstract geometrical patterns that express qualities or characteristics of systems rather than graphing their path in space (the geometry of non-linear dynamics).

2. A philosophical term developed in the work of Gilles Deleuze. For Deleuze, metaphysics is meta-physics, a play of thought at the boundaries of the physical, a form of speculative ontology - less a philosophizing about knowledge than speculations about the world, with the world itself perceived as a dynamic process always in the midst of articulating itself.

Deleuze posits that we may think of reality not in terms of an opposition between the possible and the real, but rather between the virtual and the actual. The distinction is that possible things do not have any sort of existence, they are empty signs, leaving the real as a realm of solid objects, akin to that of classical physics. The virtual, by contrast, persists as a sort of enfolded order of potentiality, from which things actualize by differentiating themselves.

The virtual is like a second-order metaphysical concept: a concept of conceptual space, a space of relations (and relations among relationships) not things, where pure structure inheres with a reality of its own, without becoming an object. The virtual marks the conditions or orientations of a task or project, but does not designate their solution (DR, 208). In this way, it has a status analogous to constraint; the virtual is the “space” “in” which constraints persist.

3. The two meanings of the virtual have the tendency to cross over (in the genetic sense). For instance, the desires of users, especially in the technographical field, often project a metaphysical quality of the virtual onto its immaterial computer screen environments. For a manifesto of this tendency, see the proceedings of the Artificial Life conferences; for a seething thought-experiment about this tendency, see Stanislaw Lem, “Non Serviam,” in A Perfect Vacuum.

****************************

LEVEL II: HYBRIDS
virtual brain
technographical brain
technographical book

****************************

HYBRIDS

VIRTUAL BRAIN

The virtual brain may be the central icon of our time, the image that best represents our sense of ourselves and the leading edge of imaginary collective definitions of the human. The virtual brain is the cumulative effect garnered from several sources, the by-product or offspring of multiple convergent factors.

On the one hand, brain research at the physiological level is making sufficient inroads to give the sense that we can generate a working model of the brain. On the other hand, there are always quantitative limitations on that knowledge, simply because of the sheer orders of magnitude involved in simulating brain operations, and so the models remain only potential explanatory metaphors. The slack left by this limitation in precision is picked up by the immense signifying power of computer graphics: from the microbiological to the cosmological, we tend to imagine physical events and processes in terms of the bright colors and seductive patterns of digitally generated diagrams.

The overlaying environment in which the virtual brain persists is both conceptually and technologically inflected. The conceptual dimension is provided by a general shift to a “bottom-up” model of brain processes, in which self-organization gives rise to thought structures and capabilities as emergent properties (Rotman). The brain is seen as an immensely parallel, distributed hierarchy of sub-systems that are interconnected in exponentially complex ways. Conscious thought, emotion, and other cognitive behaviors are something like macro-scale outcomes of multiply layered series of microscopic components.

Our image of the brain gets its pervasive technological inflection in subtle ways. Within the rich play of the many metaphors and models that compare the brain to a computer, there emerges a sort of nexus where our image of the brain can only be made visible or accessible as a virtual image, something whose texture and form are indelibly imprinted by the computer’s graphic powers. Put differently, the virtual realm of the computer provides both the conceptual space and graphic environment where the bottom-up approaches to natural, physical, and artificial phenomena become pictures worth gigabytes of text: it is as if the arguments about whether or not the brain can be simulated by a computer are subsumed by the fact that we already project the brain as an object in virtual space - we transpose the wetware of neurophysiology into the software of connectionist networks.

And, as a final turn on the virtual brain, we discover that even when the brain as such is not the explicit target of inquiry, it provides the unstated Ur-model, it marks the standard by which all other things may be judged and to which they might aspire. The popular accounts of complexity theory as it is explored at the Santa Fe Institute, for instance, constantly cross the lines between biological organism and algorithm, ecosystem and simulation, material bodies and network configurations. Complex systems, the narrative runs, persist at “the edge of chaos” - balancing global stability with local perturbance. Stuart Kauffman argues that evolutionary selection targets complexity (and is partially inflected by it) because complex systems are more robust and continue to evolve in flexible ways. Interesting here though is how the defining characteristics or advantages of complex systems sound like an apology for the brain, a proclamation of its unique features. Kauffman argues that complex systems draw evolutionary advantage from their ability “to perform extremely complex computations” that then allow for “more complicated dynamics involving the complex coordination of activities throughout a network” (82). Almost all accounts of the brain begin with the point that it is precisely the entangled, parallel complexity of operations going on in the brain all the time that must be explained, and then voice wonder at how these operations are interconnected and synchronized. We reach a juncture where information-processing capacity, embodied (!) by the dynamics of a network, signifies evolutionary stability - or, as Artificial Life proponent Chris Langton puts it, “the edge of chaos is where information gets its foot in the door in the physical world, where it gets the upper hand over energy” (Lewin 51). Once again, this claim underwrites several kinds of brain study - that the brain is the central control office for the body, that the grey cells transmitting signals convey information that then shapes the physical world.

TECHNOGRAPHICAL BRAIN

The technographical brain persists as a nexus of relations between our mind and the different tools we use to write and the different physical scenes created by writing machines. This notion of the brain follows Bateson’s thinking, for he thought of mind as both external and internal - mind is a collection of relations between differences that produce information, and this information lies in and is transmitted through pathways within the body and brain, but also between body, brain, and world (see “Substance, Form and Difference,” in Steps to an Ecology of Mind).

The technographical brain narrows the parameters of this concept to examine the ways in which the changing tools we use to write with effect physical changes in our brains. The technographical brain takes as its premise Merlin Donald’s idea that “We act in cognitive collectivities, in symbiosis with external memory systems. As we develop new external symbolic configurations and modalities, we reconfigure our own mental architecture in nontrivial ways” (382). This premise informs Vannevar Bush’s work, particularly the 1945 article often invoked as an origin for hypertext. Bush envisioned that information retrieval networks could combat the explosion of information available, and imagined a device he called the “memex” that would be a person’s own technology for storing relevant textual information. Bush called the memex “an enlarged intimate supplement to [a person’s] memory” (cited in Landow 15). For both Donald and Bush, then, technographical machines represent external devices that in effect expand the boundaries of the brain, for they serve as extensions or prostheses of the brain that in a recursive turn then induce subtle changes in the brain’s very organization.

To write an account of the technographical brain in a contemporary context, we would need to yoke together analyses of how the computer as scene of writing shifts our relationship to language with different discourses about the brain and the evolution of its internal structures, or the history of changing ideas about that physiology. Our disposition toward the technographical brain will be shaped largely by our sense of how writing machines bear on the writer’s relation to text. For instance, the effect of the typewriter has been a subject of debate: McLuhan’s optimistic proclamation that technology acts as “extensions of man” and that the typewriter integrated the functions of writing, speech, and publication is contrasted by recent accounts such as that of Friedrich Kittler, who sees the instrument as producing an effect of displacement, because it sets text off in a separate, windowed space from the hand, giving it a disembodied dynamic of its own. The same battle lines are being drawn with respect to technography now, from Richard Lanham’s championing of the “electronic word” as a democratizing instrument of empowerment and creativity that can unify the “arts,” to a series of critiques of “virtual realities and their discontents” (Markley) that explore the reinscription of Cartesian metaphysics in cyberspatial theorizings.

The technographical brain may be seen as a mutation of what we could term the narrative brain. The narrative brain alludes to the central place accorded the capacity to narrate in studies of the brain. We see this capacity implicated in Daniel Dennett’s notion of the “multiple drafts model” for how consciousness results from an intricate filtering and sifting activity across several hierarchically distributed levels of neural processing. In this account, the stream of consciousness William James gave us as a sense of our thought processes is replaced by a dynamic of authorial selection, done by a de-centered author - that author not as central intelligence agent but as a fiction, an illusion of unity imposed retrospectively on the selected result of these entwined, parallel operations.

The technographical brain is also being generated by the propulsion in some discourse on hypertext to argue that hypertext is more congruent with contemporary conceptions of mind. Landow sees Bush as a pioneer in imagining “machines that work according to analogy and association, machines that capture the anarchic brilliance of human imagination” (18). In essence, we find increasingly that hypertextual writing in electronic environments is seen as somehow more “natural” because it recapitulates the structures and patterns of thought, that it is more congruent or in synch with them. And so, as our textual activities and products become like brains, once more we find the desire for transparency: for a transparent relation or at least formal homology between the structuring principles of hypertext and those of the brain. Such formalizing abstractions frequently figure the virtual as an immaterial realm, and collapse the idea of “text” into that of its medium (Grusin).

The technographical brain, then, should remain a potential configuration only, irreducibly distributed among the relations between brains, bodies, technographic tools, and a medium of expression - one also in turn demarcated by the constraints of its physical means (a charcoal pencil) or its software (the design of Storyspace).

[Harris moves from “network to membrane” in Constrained Thinking.]

TECHNOGRAPHICAL BOOK

For some, this hybrid will sound like an oxymoron, as it evokes the persistence of the book in the technographic age. But this oxymoronic quality is intrinsic to the nature of certain novels that inscribe in their narratives the end of the book. The essential conflict between the two terms stems from the way in which the technographic medium changes the way that a reader inhabits and navigates a textual space. In hypertext environments, for example, we enter into a “non-linear” setting, meaning that we can alter the sequence in which we read different lexia, choosing to pursue different possible links among them. The entire mode of both clicking on words and reading them on screen entails a phenomenology that has yet to be described in adequate terms, but will surely change our sense of the way that fictions induce us to create imaginary worlds from our embodied setting. The non-linear clicking through lexia is contrasted sharply with the ostensibly fixed sequence of printed words. A crucial corollary issue surrounds the way that hypertexts cannot be said to exist as a single whole, because we refashion them each time we read; whereas the book as an object already confers a larger sense of “unity” on its pages. This unity also exists at the abstract level of “form” - the formalist values and tenets about books instilled by New Critics persist still in the educational processes by which we are socialized into the world of literature.

The technographical book, then, signals a sharp shift in the novel’s sense of its own identity, in a direction consistent with several formal features of hypertext or other forms of electronic writing. To take a specific example, consider the way that Dominic Di Bernardi begins his Afterword to Jacques Roubaud’s novel, which he entitles “The Great Fire of London and the Destruction of the Book”:

It would be no exaggeration to state at the outset that Jacques Roubaud’s The Great Fire of London will be one of the last books of its kind. Despite his self-proclaimed status as Homo lisens, a man who reads, and a lover and reader of books above all, the author elaborates a strategy both for reading and for text presentation that for all practical purposes makes paper-print obsolete. Yet this obsolescence is informed by a passionately pursued dedication to the most time-honored literary traditions, both in written narrative form and their earliest manifestations as transcription of orally composed verse. (323)

Di Bernardi goes on to remark several salient points about the technographical book. He notes that there are several novels that absolutely insist on readings that refuse to follow linear sequence or any traditional trajectory of plot, including Cortázar’s Hopscotch, Pavić’s Dictionary of the Khazars, Julián Ríos’s Larva: A Midsummer Night’s Babel, Butor’s Boomerang, and Calvino’s If on a winter’s night a traveler. Such formal innovations then engender “a new way of actually inhabiting the space of the book, of using the book as an object” (326). But what is interesting, Di Bernardi points out, is the immense bookishness that characterizes these post-print novels - the interlaced textual histories of Pavić’s Khazar nation, the translation and publication world in Calvino, the commentaries supplied by Ríos’s Herr Doctor character. And so this move toward a mode of text writing and reading, seemingly so bent on leaving the world of the book behind, reinscribes a sort of graphophilia.

[see Todd Napolitano for a take on the “graphomania” inspired by the World Wide Web]

These technographic books seem the nearly inevitable extrapolation of the metafictional trajectory we can trace in the novel throughout the second half of the 20th century. Technographic books emerge as the history of writing itself comes to be rewritten with a new urgency, within the annals of literary theory, in the novel, and in historical accounts of print and computer textuality alike. And so as the novel became conscious of itself not only as a medium, but as a sort of ontological play where textual worlds came into existence, it began to reflect retrospectively on its own history.

One way that this theoretical swerve manifests itself is in the consistent sounding of the theme of lost or destroyed texts that we find in technographic books. From the hilarity of the reader’s search for one novel after another in Calvino to the spurious versions of the Khazar dictionary, we discover the theme of the mutilated text replicating the technographic book’s own sense of the end of the novel. At a more complex philosophical level, we find a new inscription of the link between writing and forgetting handed down from the Egyptians and Plato. Roubaud writes his book in memory of his beloved dead wife, but nonetheless envisions in his novel a great fire that will burn a city of books, signalling what Di Bernardi calls “the destruction of all memory,” in a novel “whose interweaving electrographic flames and ‘luminous script’ erect a monument to its obliteration” (330).

This dynamic of writing and forgetting that plays itself out in the technographic book that leaves the world of the print novel behind (even as it remains in that world, of course) - this dynamic is a crucial complement to the whole ideal of technography as an extension of memory. The technographic brain finds an origin myth in Vannevar Bush’s memex, a personalized electronic memory bank/text. However, the storage of exponentially increasing amounts of information, and the increasing access we supposedly have to it through computer searches, also entails an increasing entropy of cultural memory: we will have more and more to forget, because the percentage of what we can retrieve will be so small.

The technographic book, then, fulfils the cultural function of expressing several dimensions of writing in the information age that have yet to be fully recognized. It may thus finally carry on the tradition of the novel - the novel construed as, in Salman Rushdie’s words, “the most freakish, hybrid and metamorphic of forms” (425). If we want a simple term to build around, perhaps we could use Calvino’s brief apology for the hypernovel in Six Memos for the Next Millennium - the novel that sets out a series of potential novels, that can be configured in any number of ways, that we can traverse in different directions, and that encourages us to continue writing its stories outside the covers of its book-skin.

Works Cited

Bateson, Gregory. Steps to an Ecology of Mind. New York: Ballantine, 1972.

Calvino, Italo. Invisible Cities. Trans. William Weaver. New York: Harcourt Brace Jovanovich, 1974.

__________. Six Memos for the Next Millennium. Cambridge: Harvard University Press, 1988.

Di Bernardi, Dominic. “Afterword.” In Roubaud, 323-30.

Donald, Merlin. Origins of the Modern Mind. Cambridge: Harvard University Press, 1991.

Grusin, Richard. “What is an Electronic Author? Theory and the Technological Fallacy.” In Markley, ed., 39-54.

Kauffman, Stuart. “Antichaos and Adaptation.” Scientific American (August 1991): 78-84.

Landow, George. Hypertext: The Convergence of Contemporary Critical Theory and Technology. Baltimore: Johns Hopkins University Press, 1992.

Lanham, Richard. The Electronic Word: Democracy, Technology, and the Arts. Chicago: University of Chicago Press, 1993.

Lewin, Roger. Complexity: Life at the Edge of Chaos. New York: Macmillan, 1992.

Markley, Robert, ed. Virtual Realities and Their Discontents. Baltimore: Johns Hopkins University Press, 1996.

Perec, Georges. Life: A User’s Manual. Trans. David Bellos. Boston: David Godine, 1982.

Rotman, Brian. “Exuberant Mortality - De-Minding the Store.” Configurations 2.2 (1994): 257-74.

Roubaud, Jacques. The Great Fire of London: A Story with Interpolations and Bifurcations. Trans. Dominic Di Bernardi. Normal, IL: Dalkey Archive Press, 1991.

Rushdie, Salman. Imaginary Homelands. London: Granta Books, 1991.

Wilden, Anthony. System and Structure: Essays in Communication and Exchange. London: Tavistock, 1972.