Binary XML has been a controversial and hotly debated topic in the XML community for many years. The XML 1.x syntax is very flexible and provides a common information representation for a vast array of systems. The XML marketplace has generated a seemingly endless collection of low-cost, high-quality, rapidly evolving technologies that make creating, sharing, manipulating, securing, and accessing information easier. Systems that have adopted XML are cashing in on the economic and interoperability benefits of the XML marketplace. Some believe the introduction of a second, more efficient encoding for XML information would drastically reduce or destroy the flexibility and interoperability benefits of XML.
This will be the story of my life from the time my boss came to me and said, 'Hey, maybe we could do that Knowledge Base in XML. I hear good things about that XML,' to the time that I figured out everything I needed to know and deployed a fully functional XML knowledge base to the world.
To write a chapter about a topic which is so new and developing so rapidly that changes take place just about every day is an interesting challenge. What I hope to accomplish in these few pages is to explain what electronic publishing is and explore a number of issues associated with this new area of information dissemination. Yes, this is a new area of dissemination! And perhaps this is the place to start - by defining electronic publishing. Electronic publishing is a new form of communication. Electronic publishing, for the purposes of scholarly scientific presentation of results, is the creation of a scholarly work which is in a totally electronic (non-paper) form from its creation to its publication or dissemination. An electronic journal is a product that was specifically developed and designed for the Internet, a product which is not re-worked printed material that is delivered electronically. As I hope to show in this chapter, electronic journals and electronic publishing are much more than an alternative to print.
The Web was originally conceived as a hypertextual information space; but the development of increasingly sophisticated front- and back-end technologies has fostered its use as a remote software interface.
Institutions familiar to the public are defined by master narratives that describe their activities and imply who is invited to take part. For art museums in this country, a master narrative of elitism was established in the last century, when museums organized and began building their collections. Because art museums were designed by the rich and subsequently forced to depend on the rich for financial support, the stories of elitism and exclusion have been perpetuated over the years. Whereas little narratives, or local stories, defining the daily operations of museums do not receive attention, stories of exclusive social events and obscure art exhibitions take prominence and discourage the participation of the general public. With diminished funding for museums and fewer courses devoted to art appreciation in public schools, museums will likely be unable to attract wider audiences to support them, and the master narrative will continue to define museums' image.
The usefulness of concept mapping as a job performance aid for writers of technical documents was examined. Thirty-four writers were randomly assigned to one of two groups. The experimental group received 2 hours of training in the use of concept mapping. Both groups revised the same chapter from a computer manual, and an experienced technical editor blindly evaluated each revision. In part two of the study, revised texts were given to two groups of users. One group received a concept-mapped revision, while the other group received a text revised by a writer who had used conventional revision techniques. Readers' comprehension was tested and compared. Revision time was not significantly different between groups, and the editor's ratings of quality were not different. However, readers' comprehension was significantly higher with the concept-mapped versions. These results suggest that concept mapping is a useful revision tool for writers.
Today technical communication departments are facing the challenge of producing a continuously increasing volume of technical documentation. Indeed, as companies accelerate the pace of new product launches in response to changing markets and competitive forces, technical authors must produce the accompanying documentation for these new products in greater volume and at greater speed. We also recognize that information users are not a uniform group; they have different product knowledge, different backgrounds, and may have different reasons for using a product. As such, they need specific, personalized documentation rather than a standard one-size-fits-all document.
In this article we introduce the concepts of WS-Management and the Common Information Model (CIM). By exploring SOAP messages through multiple examples, we show how CIM operations are transferred in WS-Management SOAP messages.
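To make the idea concrete, the sketch below builds a minimal WS-Management SOAP envelope that carries a CIM "Get" operation (WS-Transfer Get). The namespace URIs and the `CIM_OperatingSystem` resource URI follow DMTF/W3C conventions, but the endpoint address and message ID are purely illustrative; a real request would also carry ReplyTo and selector headers.

```python
# Minimal sketch: a WS-Management SOAP envelope carrying a CIM Get operation.
# Endpoint URL and MessageID below are illustrative placeholders.
import xml.etree.ElementTree as ET

SOAP = "http://www.w3.org/2003/05/soap-envelope"
WSA = "http://schemas.xmlsoap.org/ws/2004/08/addressing"
WSMAN = "http://schemas.dmtf.org/wbem/wsman/1/wsman.xsd"

def build_cim_get(resource_uri: str, to: str, message_id: str) -> str:
    """Return a serialized SOAP envelope for a WS-Transfer Get of a CIM resource."""
    env = ET.Element(f"{{{SOAP}}}Envelope")
    header = ET.SubElement(env, f"{{{SOAP}}}Header")
    # WS-Addressing headers: where the message goes and what operation it requests.
    ET.SubElement(header, f"{{{WSA}}}To").text = to
    ET.SubElement(header, f"{{{WSA}}}Action").text = (
        "http://schemas.xmlsoap.org/ws/2004/09/transfer/Get")
    ET.SubElement(header, f"{{{WSA}}}MessageID").text = message_id
    # WS-Management header: which CIM class the Get targets.
    ET.SubElement(header, f"{{{WSMAN}}}ResourceURI").text = resource_uri
    # WS-Transfer Get carries an empty body; the resource URI says it all.
    ET.SubElement(env, f"{{{SOAP}}}Body")
    return ET.tostring(env, encoding="unicode")

msg = build_cim_get(
    "http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/CIM_OperatingSystem",
    "http://server:5985/wsman",          # hypothetical endpoint
    "uuid:00000000-0000-0000-0000-000000000001",
)
print(msg)
```

The response envelope would mirror this structure, with the requested CIM instance serialized as XML inside the SOAP body.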
Why XML documents aren’t a good fit for relational databases, how university professors are creating custom textbooks for students, and links to several innovative projects that are demonstrating the power of XML and its cousin XQuery.
The intent of Sarbanes-Oxley (SOX) can be characterized as risk reduction: reduce errors, inhibit fraud, and provide shareholders with transparent, equal access to material knowledge. But implementation consists principally of procedural controls and documentation, under threat of penalty. The vague parts of SOX are where the real leverage lies: principles of intent, and corporate transparency.
People disagree on what happens when IAs grow up, but Tom Reamy offers a foundation for information architecture as it advances, grappling with problems across the enterprise.
This paper describes a platform for the XML definition of secure, intelligent web-based applications. XForms provides a powerful model-view-controller (MVC) pattern that may best be described as a cause-and-effect XML processing model originated by XFDL. This paper describes a new version of XFDL that consumes, or skins, XForms. Hence, this paper presents the first integration of the standardized XML markup for expressing the core processing of web-based form applications (XForms) with a host language (XFDL) that offers security, precision presentation, a document-centric capability, and other features that contribute to a richer user experience.
As a human society, we're quite possibly looking at the largest surge of recorded information that has ever taken place, and at this point, we have only the most rudimentary tools for managing all this information--in part because we cannot predict what standards will be in place in 10, 50, or 100 years.
These checklists pull together best practice in the disciplines of information design, usability, and accessibility, in an easy-to-apply format. If you are already familiar with those topics, the checklists serve as a handy reminder that is easy to refer to and apply when planning navigation. If you are unfamiliar with them, the checklists are also a fast-track lesson, providing you with a head start in getting it right and enabling you to make better-informed choices and compromises.
Many steps are involved in the process of turning an initial concept for a database into a finished product that meets the needs of its user community. In this paper, we describe those steps in the context of a four-phase process with particular emphasis on the quality-related issues that need to be addressed in each phase to ensure that the final product is a high quality database. The basic requirements for a successful database quality process are presented with specific examples drawn from experience gained in the Standard Reference Data Program at the National Institute of Standards and Technology.
Compared to ethics in technical writing, ethics in design has received less attention. This lack of attention grows more apparent as document design becomes "information design." Since Katz discerned an "ethic of expediency" in Nazi technical writing, scholars have often framed technical communication ethics in categorical terms. Yet analyses of information design must consider why arrangements of text and graphics have symbolic potency for given cultures. An "ethic of exigence" can be seen in an example of Nazi information design, a 1935 racial-education poster that illustrates how designers and users co-constructed a communally validated meaning. This example supports the postmodern view that ethics must account for naturalized authority as well as individual actions.
This article gives a detailed encyclopedic overview of the many areas and concepts that fall within the domain of information ethics. Thus, it offers brief synoptic remarks on, for example, privacy and peer review, rather than in-depth discussions of these topics, many of which have generated thousands of studies, articles, and monographic treatments.
Are you aware that the practice of information architecture is riddled with powerful moral dilemmas? Do you realize that decisions about labeling and granularity can save or destroy lives? Have you been designing ethical information architectures?
This white paper explores the whys, whats, and hows of evaluating a web site's information architecture. It aims to raise consciousness about the evaluation of IA and to provide: 1) Web site owners and other decision-makers with an understanding of evaluation issues; and 2) Information architects with a synthesis of evaluation techniques.
In the city of Konstanz on the shores of Lake Constance, Siemens AG manufactures equipment for sorting post. At the same location, a team of 16 experts creates the corresponding technical documentation. But their work is not restricted to handbooks and CDs. For ten years, this department, called 'Technical Media', has also been taking care of multimedia and training.
Semantic technology can be as heavy and stifling for any audience as stem-cell research can be to high-school students. But Carla Thompson of Guidewire did a terrific job of coming up with discussion topics and moderating the panel. Everyone survived the ordeal without any sign of dozing.
The Dublin Core is currently the best-developed candidate for a simple resource description model for electronic resources on the Web. It represents the results of a three-year process of consensus-building through a series of focused, invitational workshops involving librarians, digital library researchers, and various content specialists from many countries.
Buzz about the value and implications of XML has reached an all-time high, with lofty claims of its potential to transform business and society, doing everything from simple document formatting to curing the common cold. I don't recommend you empty your medicine cabinet just yet. However, do take seriously the developments surrounding XML and its associated technologies. While XML might not merit all the hyperbole, it remains useful. Knowing how to apply this simple meta-language can help you create solutions that will give you a strong competitive advantage.
The value of full text for expanding information retrieval was examined. Two full-text databases were used: Textpresso for neuroscience and ScienceDirect. Queries representing different categories were used to search different text fields (titles, abstracts, full text and, where possible, keywords). Searching the full-text field relative to the commonly used abstracts field increases retrievals by one or more orders of magnitude, depending on the categories selected. For phenomena-type categories (e.g. blood flow, thermodynamic equilibrium, etc.), retrievals are enhanced by about an order of magnitude. For infrastructure-type categories (e.g. equipment types, sponsors, suppliers, databases, etc.), retrievals are enhanced by well over an order of magnitude, and sometimes multiple orders of magnitude. Use of combination terms along with proximity specification capability is a very powerful feature for retrieving relevant records from full-text searching, and can be useful for applications like literature-related discovery.