A directory of resources in the field of technical communication.

Heuristic Evaluation

15 found.


1.
#28931

Content Analysis Heuristics

Many Web professionals consider content inventories critical parts of most projects. Are there certain specific things to look for during a content inventory? Fred Leise definitely thinks so. He proposes a set of content analysis heuristics and discusses how to utilize each one.

Leise, Fred. Boxes and Arrows (2007). Articles>Content Management>Taxonomy>Heuristic Evaluation

2.
#10406

Developing Heuristics for Web Communication: An Introduction to This Special Issue   (peer-reviewed)   (members only)

This article describes the role of heuristics in the Web design process. The five sets of heuristics that appear in this issue are also described, as well as the research methods used in their development. The heuristics were designed to help designers and developers of Web pages or sites to consider crucial communicative aspects of Web site design. Also previewed is a sixth article that presents a framework for characterizing and analyzing the broad variety of heuristics that are available for Web designers.

van der Geest, Thea and Jan H. Spyridakis. Technical Communication Online (2000). Design>Web Design>Assessment>Heuristic Evaluation

3.
#15117

Documentation Metrics: What Do You Really Want to Measure?  (link broken)   (PDF)

Examines several metrics--systems for measuring production and production standards--to determine their value to technical communicators, and argues that qualitative metrics are more meaningful than quantitative ones.

Le Vie, Donald S., Jr. Intercom (2000). Articles>Documentation>Assessment>Heuristic Evaluation

4.
#18726

Guía de Evaluación Heurística de Sitios Web   (peer-reviewed)

This document is intended as a general guide for evaluating the usability of websites. It is an abridged version of the guide the authors use in their professional practice, yet extensive and specific enough to be useful to practitioners who need a baseline document (one they can extend to suit their own needs) for getting started with heuristic evaluation. The guide is structured as a checklist to make the evaluation easier to carry out. All of its items are phrased as questions, where an affirmative answer means there is no usability problem and a negative answer means there is one.

Hassan Montero, Yusef and Francisco Jesus Martin Fernandez. Nosolousabilidad.com (2003). (Spanish) Articles>Usability>Methods>Heuristic Evaluation

5.
#26654

Heuristic Evaluation

A usability evaluation method in which one or more reviewers, preferably experts, compare a software, documentation, or hardware product to a list of design principles and note where the product does not follow those principles.

Usability Body of Knowledge. Resources>Usability>Methods>Heuristic Evaluation

6.
#26839

Heuristic Evaluation

Heuristic evaluation is a form of usability inspection where usability specialists judge whether each element of a user interface follows a list of established usability heuristics. Expert evaluation is similar, but does not use specific heuristics. Usually two to three analysts evaluate the system with reference to established guidelines or principles, noting down their observations and often ranking them in order of severity. The analysts are usually experts in human factors or HCI, but less experienced evaluators have also been shown to report valid problems. A heuristic or expert evaluation can be conducted at various stages of the development lifecycle, although it is preferable to have already performed some form of context analysis to help the experts focus on the circumstances of actual or intended product usage.

UsabilityNet (2005). Resources>Usability>Methods>Heuristic Evaluation
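
The kind of record described above (an analyst's observation tied to a guideline and ranked by severity) is easy to capture in a small data structure. The Python sketch below is a generic illustration, not UsabilityNet tooling; the guideline names, severity scale, and field names are assumptions.

    from dataclasses import dataclass

    @dataclass
    class Observation:
        analyst: str      # who recorded the finding
        element: str      # the interface element being judged
        guideline: str    # the established guideline or principle at issue
        severity: int     # higher means more severe (the scale is an assumption)
        note: str         # the analyst's observation

    # Two analysts recording findings against hypothetical guidelines.
    findings = [
        Observation("analyst-1", "checkout form", "error prevention", 3,
                    "No confirmation before a half-completed form is discarded."),
        Observation("analyst-2", "search results", "visibility of system status", 2,
                    "No feedback is shown while results are loading."),
        Observation("analyst-1", "help page", "consistency", 1,
                    "Terminology differs from the main navigation."),
    ]

    # Collate the observations and rank them in order of severity.
    for obs in sorted(findings, key=lambda o: o.severity, reverse=True):
        print(f"[severity {obs.severity}] {obs.element}: {obs.note} ({obs.analyst})")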

7.
#30041

Heuristic Evaluation Quality Score (HEQS): A Measure of Heuristic Evaluation Skills   (peer-reviewed)

Heuristic evaluation is a discount usability engineering method in which three or more evaluators assess an interface's compliance with a set of heuristics. Because the quality of the evaluation depends heavily on the evaluators' skills, it is critical to measure those skills to ensure evaluations meet a certain standard. This study provides a framework for quantifying heuristic evaluation skills. Quantification is based on the number of unique issues identified by the evaluators as well as the severity of each issue. Unique issues are categorized into eight user interface parameters, and severity into three levels. A benchmark computed from the collated evaluations is used to compare skills across applications as well as within applications. The result of this skill measurement divides the evaluators into levels of expertise. Two case studies illustrate the process, as well as its applications. Further studies will help define an expert's profile.

Kirmani, Shazeeye and Shamugam Rajasekaran. Journal of Usability Studies (2007). Articles>Usability>Assessment>Heuristic Evaluation
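
The general scoring idea in this entry (tally the unique issues each evaluator found, weight them by severity, and compare against a benchmark collated from all evaluators) can be illustrated in a few lines of Python. The sketch below is a minimal illustration, not the HEQS formula from the paper; the severity weights, issue identifiers, and evaluator names are placeholders.

    # Hypothetical severity weights; the real HEQS weighting is defined in the paper.
    SEVERITY_WEIGHT = {"low": 1, "medium": 2, "high": 3}

    def evaluator_score(found_issue_ids, benchmark):
        """Sum the severity weights of the benchmark issues this evaluator found."""
        return sum(SEVERITY_WEIGHT[severity]
                   for issue_id, severity in benchmark.items()
                   if issue_id in found_issue_ids)

    # Benchmark: the unique issues collated from all evaluators, with severities.
    benchmark = {"nav-01": "high", "form-02": "medium", "copy-03": "low"}

    # Issues reported by each (hypothetical) evaluator.
    evaluators = {
        "evaluator-A": {"nav-01", "copy-03"},
        "evaluator-B": {"form-02"},
    }

    max_score = sum(SEVERITY_WEIGHT[s] for s in benchmark.values())
    for name, found in evaluators.items():
        score = evaluator_score(found, benchmark)
        print(f"{name}: {score}/{max_score} ({score / max_score:.0%} of the benchmark)")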

8.
#23862

Heuristics to Evaluate Online Help

Presents a set of questions for each usability category, which the person performing the heuristic evaluation rates on a scale from very satisfied to very unsatisfied, or marks as not applicable. Each question can also carry a severity level that brings significant opportunities for improvement to the foreground.

DeBoard, Donn. Usability Interface (2004). Articles>Usability>Methods>Heuristic Evaluation
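
A checklist of the kind this entry describes can be modeled as a list of question records, each with a satisfaction rating (or "not applicable") and an optional severity level. The Python sketch below is a hypothetical illustration; the categories, question wording, rating labels, and severity scale are assumptions, not DeBoard's actual checklist.

    # Rating scale and structure for a hypothetical online-help checklist.
    RATING_SCALE = ["very satisfied", "satisfied", "neutral",
                    "unsatisfied", "very unsatisfied", "not applicable"]

    checklist = [
        {"category": "Navigation",
         "question": "Can users move between help topics without losing context?",
         "rating": "unsatisfied", "severity": 3},
        {"category": "Search",
         "question": "Do searches of the help system return relevant topics?",
         "rating": "satisfied", "severity": None},
    ]

    assert all(item["rating"] in RATING_SCALE for item in checklist)

    # Bring the unsatisfactory, high-severity answers to the foreground.
    flagged = [item for item in checklist
               if item["rating"] in ("unsatisfied", "very unsatisfied")]
    for item in sorted(flagged, key=lambda i: i["severity"] or 0, reverse=True):
        print(f"severity {item['severity']}: {item['category']}: {item['question']}")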

9.
#20824

How to Conduct a Heuristic Evaluation

Heuristic evaluation is a usability engineering method for finding the usability problems in a user interface design so that they can be attended to as part of an iterative design process. Heuristic evaluation involves having a small set of evaluators examine the interface and judge its compliance with recognized usability principles (the 'heuristics').

Nielsen, Jakob. Alertbox (1994). Articles>Usability>Methods>Heuristic Evaluation
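
In the method Nielsen describes, each evaluator inspects the interface independently and the individual findings are then merged. The Python sketch below illustrates only that aggregation step, with hypothetical problem identifiers; it is not code from the article.

    from collections import Counter

    # Problems reported by each evaluator working alone (identifiers are placeholders).
    evaluations = {
        "evaluator-1": {"P1", "P3"},
        "evaluator-2": {"P1", "P2"},
        "evaluator-3": {"P2", "P3", "P4"},
    }

    # Merge the individual lists and count how many evaluators reported each problem.
    counts = Counter(problem for found in evaluations.values() for problem in found)

    print(f"{len(counts)} unique problems found by {len(evaluations)} evaluators")
    for problem, n in counts.most_common():
        print(f"{problem}: reported by {n} evaluator(s)")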

10.
#28357

Metrics for Heuristics: Quantifying User Experience (Part 2 of 2)

In part one of 'Metrics for Heuristics,' Andrea Wiggins discussed how designers can use Rubinoff’s user experience audit to determine metrics for measuring brand. In part two, Wiggins examines how web analytics can quantify usability, content, and navigation.

Wiggins, Andrea. Boxes and Arrows (2006). Articles>User Centered Design>User Experience>Heuristic Evaluation

11.
#23973

Not All Web Sites Are Alike

Many people have a hard time talking about the distinctions between different kinds of Web development, which makes it difficult to decide how to proceed. This article offers a quick survey of various Web projects and of the techniques that address them.

Korman, Jonathan. Cooper Interaction Design (2003). Articles>Web Design>Usability>Heuristic Evaluation

12.
#30091

Quality Metrics    (PDF)

Technical communicators continuously battle with the problem of obtaining an objective and comparable representation of a document's quality. While everyone agrees on the importance of this issue, no definitive work exists on determining or representing the quality of a document. As Toby Frost states, 'It may take a long time to establish a baseline for quality metrics; we don't have adequate mechanisms for measuring quality today.' A quality metric provides a method for tracking a document through completion, helps ensure quality deliverables, and provides an additional criterion for personnel reviews.

Mallory, Eric. STC Proceedings (1999). Articles>Writing>Assessment>Heuristic Evaluation

13.
#25901

Take Breaks! A Simple Way to Improve Your Heuristic Evaluation Results

As primary tools in the usability field, heuristic or expert evaluations can be rich areas for methods studies and improvement. Early results of one methods study suggest that performing evaluations in limited segments, with breaks between each segment, may increase the effectiveness of the evaluator in identifying usability problems.

Faulkner, Laura. Usability Professionals Association (2005). Articles>Usability>Methods>Heuristic Evaluation

14.
#36522

Usability Expert Reviews: Beyond Heuristic Evaluation

Most people who carry out usability expert reviews use Jakob Nielsen's ten usability 'heuristics'. Many of these guidelines are common sense, but they are not based on substantive research. The international usability standard BS EN ISO 9241-110 proposes an alternative set of seven guidelines. These guidelines have the benefit of international consensus, and they can be applied to any interactive system.

Travis, David. UserFocus (2007). Articles>Usability>Methods>Heuristic Evaluation
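
For reference, the two guideline sets compared in this entry are listed below as plain Python constants. The short labels are common paraphrases, not the normative wording of either source.

    # Short labels for the two guideline sets; paraphrases, not normative wording.
    NIELSEN_HEURISTICS = [
        "Visibility of system status",
        "Match between system and the real world",
        "User control and freedom",
        "Consistency and standards",
        "Error prevention",
        "Recognition rather than recall",
        "Flexibility and efficiency of use",
        "Aesthetic and minimalist design",
        "Help users recognize, diagnose, and recover from errors",
        "Help and documentation",
    ]

    ISO_9241_110_PRINCIPLES = [
        "Suitability for the task",
        "Self-descriptiveness",
        "Conformity with user expectations",
        "Suitability for learning",
        "Controllability",
        "Error tolerance",
        "Suitability for individualization",
    ]

    print(f"Nielsen: {len(NIELSEN_HEURISTICS)} heuristics; "
          f"ISO 9241-110: {len(ISO_9241_110_PRINCIPLES)} dialogue principles")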

15.
#35651

Usability Testing Versus Expert Reviews

In this Ask UXmatters column—which is the first in a series of three columns focusing on usability—our experts discuss the use of usability testing versus expert reviews.

Six, Janet M. UXmatters (2009). Articles>Usability>Testing>Heuristic Evaluation
