The final decade of the last century witnessed the dramatic rise of hypertext as a literary, technical, social, and intellectual phenomenon. Today, despite the fact that hypertext provides the conceptual underpinnings for the World Wide Web (among other things), 'hypertext' remains a relatively peripheral term. In this talk, I'll track some of the ways that 'hypertext' has been articulated during the last five decades, describing how the social construction of hypertext inscribed the technology(ies) in limiting and ultimately self-defeating ways. I'll then attempt to track (and construct) some possible futures for a dramatically redefined hypertext, one constructed as an 'ethic of reference' within and among social communities rather than a technical practice.
The nature of hypertext challenges many underlying assumptions of traditional literary criticism. Literary critics frequently like to think that they have objectively looked at the lexias of the work, thoughtfully considered them, and constructed a solid interpretation or analysis of the work based on those lexias. Hypertext, however, presents the possibility that two critics reading the same work may have differing sets of lexias from which to work. Thus, even if critics objectively consider the lexias before them, they cannot free themselves from the subjectivity of the reading performance that made those lexias (and not others) appear. This raises the concern that, if hypertext critics can only present subjective views of the text, there may be little or no benefit to reading or writing those critiques.
Since this is going to be a wild ride across some disciplines that don't normally talk to each other, let me start with a short, structural overview to get everyone situated. I'm going to begin by defining some terms. They're all relatively simple, common terms, but I'm going to attempt to bring them together in a particular configuration; in order for that configuration to make sense, I need to settle on some loose definitions and, at the same time, make the terms relevant to our discussion. Next (and this is probably the bulk of the talk) I'll be outlining a genealogy of work, particularly as it relates to interface design. In this history, I'm interested in understanding, from a critical perspective, what happens to work as it increasingly takes place within the computer interface. I'll say here that the end of this history is where the terms "postmodernism," "work," and "interface" come together. Finally, I'll offer some suggestions (and examples) of ways that we, as teachers, researchers, designers, and communicators, can begin to deal productively with some of the problems I see in how interfaces are currently being designed and used.
This presentation traces the locations and roles of computer documentation over the latter half of the 20th century in order to construct a model of information/knowledge space as it relates to different forms of work. The paper then provides suggestions about future forms of documentation and interface based on ethnographic research of workers in recently emerging forms of work, including nonlinear audio/video production and videogame playing. The final section of the paper provides concrete suggestions about forms of documentation and interface that will be required to support these new forms of work.
The personal computer has had a significant impact on the delivery of educational material. Hypermedia systems give students the ability to explore concepts in innovative ways. Unfortunately, it appears that many hypermedia designers have ignored the critical early planning stages. This paper provides an overview of three of those planning stages: audience analysis, system goals analysis, and control analysis.
There are many models of hypertext, distinguished by a number of factors such as the underlying semantic data model (link typing and node typing), the degree of dynamic linking in the hypertext, and how dynamism and other behaviours are implemented. This essay examines a particular approach to dynamism in hypertext, based on the degree of similarity between a text passage in a source node and the text of a target node. It reviews work carried out over the past decade in creating systems for markup-based querying and dynamic hypertext, with particular emphasis on a model of dynamic hypertext that computes hypertext links on the fly using queries.
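The model described above, in which links are not stored but computed on the fly from the similarity between a source passage and candidate target nodes, can be illustrated with a minimal sketch. The node names, the term-frequency representation, and the similarity threshold below are illustrative assumptions, not details of any particular system discussed in the essay.

```python
# Query-based dynamic hypertext: rank candidate target nodes by lexical
# similarity to a source passage and return those above a threshold.
import math
import re
from collections import Counter

def tf_vector(text):
    """Term-frequency vector over lowercase word tokens."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two term-frequency vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def dynamic_links(source_passage, nodes, threshold=0.2):
    """Compute link targets on the fly: node names ranked by similarity."""
    src = tf_vector(source_passage)
    scored = [(name, cosine(src, tf_vector(body)))
              for name, body in nodes.items()]
    return [name for name, score in sorted(scored, key=lambda p: -p[1])
            if score >= threshold]

# Hypothetical node base for illustration.
nodes = {
    "markup-querying": "querying structured markup in documents",
    "dynamic-links": "computing hypertext links dynamically with queries",
    "card-model": "presenting nodes as fixed-size cards on screen",
}
print(dynamic_links("links computed dynamically by queries", nodes))
# → ['dynamic-links']
```

Because the links are derived from a query at read time, editing a node's text automatically changes which links appear, which is the dynamism the model is after.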
Fifty years before the web, thirty years before the personal computer, Vannevar Bush envisioned a new machine to make sense of the growing mountains of information, creating the notions of 'hypertext' and the modern link.
World Wide Web authors must cope in a hypermedia environment analogous to second-generation computing languages, building and managing most hypermedia links using simple anchors and single-step navigation. Following this analogy, sophisticated application environments on the World Wide Web will require third- and fourth-generation hypermedia features. Implementing third- and fourth-generation hypermedia involves designing both high-level hypermedia features and the high-level authoring environments system developers build for authors to specify them. We present a set of high-level hypermedia features including typed nodes and links, link attributes, structure-based query, transclusions, warm and hot links, private and public links, hypermedia access permissions, computed personalized links, external link databases, link update mechanisms, overviews, trails, guided tours, backtracking, and history-based navigation. We ground our discussion in the hypermedia research literature, and illustrate each feature both from existing implementations and a running scenario. We also give some direction for implementing these on the World Wide Web and in other information systems.
In the short term of three to five years, I don't really expect significant changes in the way hypertext is done compared to the currently known systems. Of course new stuff will be invented all the time, but just getting the things we already have in the laboratory out into the world will be more than enough. I expect to see three major changes: the consolidation of the mass market for hypertext; commercial information services on the Internet; the integration of hypertext and other computer facilities.
To understand how much content effluvia we're subjected to, I wanted to see how many links are on the homepages of popular websites. For example, if I go to the homepage of the Huffington Post, I see 720 links, in one shot. Then click inside to a story and you've nearly doubled that number; it adds up pretty quickly. What about the tech blogs? BoingBoing Gadgets, 514. Gizmodo, 468. Engadget, 432, all on one page. And on average, fewer than 1% of the links on news sites and blogs actually point to rich content; 99% are navigation and other article headlines. Aggregation site Techmeme has a whopping 1081 links.
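The link-counting exercise above is easy to reproduce. The sketch below, using only Python's standard-library HTML parser, counts anchors carrying an `href` attribute in a page's markup; the sample markup is an illustrative placeholder, and you would feed it the fetched HTML of whatever homepage you want to audit.

```python
# Count the <a href=...> anchors in a chunk of HTML.
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a> start tags that carry an href attribute."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

def count_links(html):
    parser = LinkCounter()
    parser.feed(html)
    return parser.count

# Hypothetical sample: two real links plus a named anchor, which is skipped.
sample = ('<p><a href="/story">story</a> <a href="/nav">nav</a> '
          '<a name="x">anchor</a></p>')
print(count_links(sample))  # → 2
```

Running `count_links` over a saved copy of a news homepage makes the navigation-to-content ratio described above directly measurable.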
Hypertext is a novel approach to computer-based information management based on associative indexing. The concept in general and the characteristics of typical systems are briefly reviewed. Strategies for applying hypertext techniques to the process of writing a technical document are examined. The way in which hypertext documents are used is discussed, focusing on a commonly encountered problem -- user disorientation within the document. Hypertext-based technical documents are compared and contrasted against their paper-based antecedents.
The Hypertext Functionality field studies techniques for, and the impact of, supplementing everyday computer applications with hypertext (or hypermedia) functionality (HTF). The HTF approach encourages system developers to think actively about an application's interrelationships, and whether users should access and navigate along these relationships directly. It views hypertext as value-added support functionality. The HTF approach fosters three major areas of research: using HTF to improve personal and organizational effectiveness, HTF and application design, and integrating HTF into applications.
A discussion of some of the most compelling elements of current hypertext theory. By practicing the theory it preaches, it hopes to explicitly model the theoretical interrogations of the issue.
Because of the nature and complexity of collaborative work, there is currently much interest in examining computer support for team endeavors. Hypertext technology is particularly suited to providing such support, and many current hypertext applications support collaborative endeavors in diverse fields. Rensselaer's Design Conference Room (DCR) is an Electronic Meeting System facility intended to support mechanical and software engineering design teams. Teams meeting or working in the DCR have access to sophisticated networking and hypertext technologies. Careful study of the processes and products of DCR teams will contribute to an understanding of how hypertext (and other computer technologies) can best support team endeavors.
This proposal concerns the management of general information about accelerators and experiments at CERN. It discusses the problems of loss of information about complex evolving systems and derives a solution based on a distributed hypertext system.
Nobody is offering courses in how to prepare hypermedia, nor are there a large number of jobs available for hypermedia authors. As we begin to come up against the limits imposed by the volume of existing knowledge, we will eventually be forced to place more importance on managing our information explosion.
The 'article' approach is better than the 'card' (or 'topic') approach. Concatenate your hypertext nodes and format the headings relatively, for increased comprehensibility of large amounts of conceptual material. Placing node bodies contiguously enhances visibility of information structure.
I have twisted the language to contrive the title of this essay because I want to interrogate the future of literacy, both its electronic formations (if indeed these differ from its pre-electronic ones) and its social origins and effects. Hence: I am using the unpronounceable locution e-literacies in two different ways: first, to mean those reading and writing processes specific to electronic texts (by texts, I mean a whole range of digitally encoded materials -- words, sounds, pictures, video clips, simulations, etc.); second, to signify elite-racies as in those socio-economic elites whose interests might be served by electronic literacies of one sort or another, or who might come to be elites by virtue of their ability to shape electronic literacies.
The production of a web page has become a common assignment in a number of university classrooms, but there has yet to be established a pedagogy for the generation of large group-generated web sites that replicate the methods found in industry. In Studies in Hypertext, a course offered to technical communication students at the University of Central Florida, such a pedagogy is being shaped. In this course, students with little or no experience in web site generation work their way through a series of written and small web site construction tasks to eventually produce one complex and competently-integrated web site.
The hypertext world has classically distinguished between two fundamentally different ways of presenting hypertext nodes on the screen: scrolling and cards. Throughout the history of hypertext, designers of hypertext systems have argued about the relative merits of these two contrasting approaches. The proponents of the scrolling model are sometimes called the holy scrollers and the proponents of the card model are called the card sharks. Here are examples of documents I have authored myself in these two models, using pre-WWW hypertext systems.
When you place content, Adobe® InDesign® 2.0 doesn't just add the graphics and text to your document—it keeps track of the original files as well. You can use the links to update the data if the original file changes, to track down missing graphic information, or to replace a graphic with another, without losing the transformations you've applied. And when you work with text files, it's usually best to remove the link altogether.
The enemy of hypertext is hypertext itself... Overuse hypertext and you will quickly disorient your visitor. So avoid the 'labyrinth' effect as much as possible! A user should never have to explore forests of buttons to obtain simple information.