My presentation for the Digital textuality meeting in Lyon, September 2005
(in progress)
Background
At our last meeting, I presented the concept of feral hypertexts, hypertexts that have gone wild on the net and that defy the kind of pre-planned structures that we have traditionally seen as necessary to steward our collective knowledge. Examples of feral hypertexts include weblogs, wikis and other “bottom up” or self-organising systems of texts. There are in fact several ways in which such texts organise, but rather than being hierarchical and centralised, where themes are predefined by a central editor or group of editors, they are bottom-up, providing flexible structures which can be filled by a vast and changing group of contributors. Themes emerge, and are visualised by the infrastructure of the system, through devices such as collaborative editing (Wikipedia), tagging or a folksonomy (Flickr, Del.icio.us, CiteULike) and trackbacks (weblogs).
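The way themes emerge from a folksonomy can be illustrated with a minimal sketch. The data below is hypothetical, but the mechanism is the one described above: no editor predefines categories; a tag becomes a "theme" simply because many contributors converge on it.

```python
from collections import Counter

# Hypothetical tagged bookmarks, in the spirit of Del.icio.us:
# each contributor freely attaches whatever tags they like to an item.
bookmarks = [
    ("http://example.org/a", ["hypertext", "blogs", "research"]),
    ("http://example.org/b", ["blogs", "writing"]),
    ("http://example.org/c", ["hypertext", "blogs"]),
    ("http://example.org/d", ["wiki", "research"]),
]

# Count how often each tag is used across all contributors.
tag_counts = Counter(tag for _, tags in bookmarks for tag in tags)

# "Emergent themes" are simply the tags more than one bookmark shares,
# ordered by how many contributors converged on them.
emergent_themes = [tag for tag, n in tag_counts.most_common() if n > 1]
print(emergent_themes)
```

The point of the sketch is that the structure is entirely bottom-up: delete the `bookmarks` list and the categories vanish with it, because they were never defined anywhere else.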
At this meeting I would like to explore in more detail what this might mean for our praxis of developing and editing critical editions of texts, inspired by the discussions we had in Bergen a few months ago, and the examples of critical editions I saw in the research group.
Weblogs
Weblogs are clearly a form of writing that is growing in importance in our culture. Numbers are still fairly low, but increasing steadily: 11% of surveyed internet users read blogs monthly or more frequently. Weblogs might be preserved and critical editions developed in the same way as the diaries and letters of notable people have been published and edited after their deaths. Some weblogs might be compared to journals; archive.org already archives such material, and national libraries are beginning to do the same.
Ethics
We publish diaries and letters, but not until the author is dead or famous. Creating critical editions of current weblogs would be problematic – while they are published, and many have clear literary qualities, writing reviews of weblogs causes some authors distress, and even linking to them out of context can be seen as exploitative or predatory. “Community is not a drive-by phenomenon”, Profgrrrrl wrote.
Webloggers themselves have developed tools for seeing structures and connections, such as Technorati.
The Wikipedia
The Wikipedia is a prime example of an apparently feral text that works extremely well. It has an interesting history that is useful to understand if we wish to think about how we can best take care of knowledge in the future.
The Wikipedia grew out of Nupedia, which was intended as a free content encyclopedia that would be written by volunteer experts and subjected to rigorous peer review. The bar to become a Nupedia contributor was relatively high, with the policy stating, “We wish editors to be true experts in their fields and (with few exceptions) possess Ph.D.s.” Only a handful of articles were ever completed.
In many ways, the Wikipedia is its own critical edition. Its evolution is made explicit in the discussions that are accessible from each individual page. There are many discussions about strategy, and many different philosophies of how to create the best possible Wikipedia. All this is available on the site.
Feral texts as performance
Perhaps it is more useful to think about new kinds of textuality as more akin to performances than to the texts produced in the 19th century. Walter Ong suggested that our electronic media might be viewed as a secondary orality, and the living web has much in common with oral traditions.
Another matter is how to preserve the whole text. The image shows the roll of paper tape that stores the first BASIC interpreter, programmed by Bill Gates and Paul Allen in 1975. Is this the “text”? Or is one of the many versions later implemented “the text”? Another example: Eliza/Doctor, which was first made for teletype and now exists in IM bots and on the web. Which versions do we want to preserve for scholars 50 years from now?
Perhaps instead of creating digital editions of such texts, we should create samplers, as Charles Perrault and Asbjørnsen og Moe did with French and Norwegian folk tales. We need to think about collecting these cultural riches while they are here. Or we might take a look at performance art, and consider that the performance may be documented by the artist in a manner that will explain something of the original, though it can never completely replicate it.
Print allowed us the luxury of thinking we have access to the full text. That luxury may be gone.
Andreas Haugstrup
The link to your article is broken (the “f” and “h” needs to be capitalised). The links in your sidebar are fine. I’m looking forward to reading. :o)
Jill
Thanks! Fixed that now. I love instant error-checking 🙂
Jill
Why would we WANT critical editions of digital texts?
In the discussion afterwards, three reasons for making critical editions in general were mentioned:
To establish the authoritative version of the text (bloggers, for instance, would probably not want that)
To establish the genesis of a text (built into the Wikipedia, for instance)
To piece together fragments of texts and try to reconstruct what you believe the text to be (which is similar to the first, or maybe the same)
Perhaps we’ll want to do this for weblogs and other online texts in the future, but we don’t need to yet.
Possible kinds of “critical editions” of weblogs:
The Complete Works of William Gibson would need to include his weblog (with annotations)
You could make an edition of online writing about an event (the tsunami, Sept 11) like the archives that exist for such events, but annotated, cross-referenced etc.
Maybe in the future there’d be interest in a properly cross-referenced edition of discussions in a particular weblog cluster (political blogs, new media research blogs, Scandinavian blogs, whatever) – such a project might want to make a selection of included weblogs, annotate etc.
There is something dangerous in trying to provide a clean, authoritative edition of something as messy as weblogs, but perhaps someone will try to do this in the future.
Bill Tozier
A project I’m heavily involved in — Distributed Proofreaders, which is a collaborative content provision system for Project Gutenberg — holds an interesting role in relation to your premise. As you know, Project Gutenberg aims to “republish” in electronic format any and all works in the public domain. Since its start in the 1970s (!), it’s focused mainly on purely textual works, and has been thoroughly biased towards the canonical classics. Nowadays though, the massive input of Distributed Proofreaders (and I suppose Google Whatever, eventually), has accelerated the growth of the “archive” to the point that very obscure texts are now being provided, including an increasing number of periodicals, and multiple editions are also being uploaded.
A few points that seem salient:
– We’re reaching a stage where images are much more easily included, and HTML versions produced. Is the original Gutenberg version of an illustrated text “whole”, or is the HTML version? Or are only the scans of the original book’s pages, if they cannot be searched? In a number of texts I’ve personally scanned, typography is used in intricate ways to signal not merely emphasis and layout, but subtle in-jokes and elegant solutions to increase the information density of figures. Where is that captured in our new editions?
– We (my wife and I personally) have made a specialty of obtaining and scanning the periodicals of the 19th century. Often as not, there has been no significant academic critical commentary about these works in 150 or more years, even though the authors and editors in many cases have gone on to become rather well-known (some fellow named “Wirving” or “W. Irving” or something, Sidney Morse [Samuel’s brother], some unknown Henry Holt guy, folks like that nobody ever heard of). Indeed, at and since the time of their publication, magazines were more or less neglected, even though many had circulations in the tens of thousands.
I often find myself thinking about what’s happening to them, now that we’re republishing them and letting people Google them directly. If modern dynamic texts are feral, what sort of creature are we making from The Knickerbocker Monthly Magazine? If we allow passers-by to annotate them in a wiki?
– Living systems offer a rich source of analogy, beyond the borrowed phrase “living text”. So here’s a salient question from the depths of theoretical biology: what individuates a text? An organism can in theory (and often in practice does) have every molecule replaced over time, and remain the same individual; does a text?