A directory of resources in the field of technical communication.

Usability>Methods

261 found. Page 1 of 11.


1.
#10318

Accentuate the Negative: Obtaining Effective Reviews Through Focused Questions   (peer-reviewed)   (members only)

How you ask a question strongly determines the type of answer you will obtain. For effective documentation reviews, whether they are conducted internally or externally as part of usability testing, it's important to use precise questions that will provide concrete information on which to base revisions. This paper proposes an approach to obtaining useful feedback that emphasizes negative, 'what did we do wrong?' questions. This approach focuses limited resources on areas that need improvement rather than on areas that already work well.

Hart, Geoffrey J.S. Technical Communication Online (1997). Articles>Usability>Methods>Testing

2.
#38336

Accuracy vs. Insights in Quantitative Usability

Better to accept a wider margin of error in usability metrics than to spend the entire budget learning too few things with extreme precision.

Nielsen, Jakob. Alertbox (2011). Articles>Usability>Testing>Methods

3.
#35936

Adopting Documentation Usability Techniques to Alleviate Cognitive Friction

Cognitive friction creates a digital divide between the software development community and software users. This divide, in turn, correlates directly with the usability of the application: how well software users can learn and use the application or product to perform their tasks and accomplish their goals. Today's technical communicators can help bridge this divide and reduce cognitive friction by applying industry-acclaimed usability techniques to the documentation they produce, thereby accelerating user acceptance of the product. Less cognitive friction means better user adoption, which results in fewer calls to tech support, higher customer satisfaction and, in the long run, better brand loyalty.

Biswas, Debarshi Gupta and Suranjana Dasgupta. Indus (2009). Articles>Documentation>Usability>Methods

4.
#18220

Advanced Issues in Usability: Balancing User Preference and Performance Data Collection   (PDF)

The purpose of this paper is to provide some background on my position for the progression on usability issues. I'll present the measures I typically collect and the differences between performance and preference data. Having this as a starting place may help us have a useful progression discussion.

Rauch, Thyra L. STC Proceedings (1996). Presentations>Usability>Methods

5.
#33301

Analyse Context of Use

Who are the intended users and what are their tasks? (Why will they use the system? What is their experience and expertise?) What are the technical and environmental constraints? (What types of hardware will be used in what organisational, technical and physical environments?)

UsabilityNet (2006). Articles>Usability>Methods>Contextual Inquiry

6.
#21396

Analyzing Card Sort Results with a Spreadsheet Template  (link broken)

This article explains how to quickly derive easily-read, quantitative results from a card-sort activity by entering data into a spreadsheet template that is adaptable to any set of cards and categories.

Lamantia, Joe. Boxes and Arrows (2003). Articles>Usability>Methods>Card Sorting

7.
#31652

Analyzing the Interaction Between Facilitator and Participants in Two Variants of the Think-Aloud Method   (PDF)   (members only)

This paper focuses on the interaction between test participants and test facilitator in two variants of the think-aloud method. In a first, explorative study, we analyzed think-aloud transcripts from two usability tests: a concurrent think-aloud test and a constructive interaction test. The results of our analysis show that while the participants in both studies never explicitly addressed the facilitator, the think-aloud participants showed more signs of awareness of the facilitator than the participants in the constructive interaction test. This finding may have practical implications for the validity of the two methods.

van den Haak, Maaike J. and Menno D.T. de Jong. IEEE PCS (2005). Articles>Usability>Testing>Methods

8.
#13967

Anthropologists Go Native in the Corporate Village

Anthropologist Elizabeth Briody earned her PhD studying communities of Mexican-American farm workers and Catholic nuns. For the past 11 years, though, she's been studying a different community -- the men and women of General Motors. As GM's 'industrial anthropologist,' Briody explores the intricacies of life at the company. It's not all that different from her previous work. 'Anthropologists help elicit the cultural patterns of an organization,' she says. 'What rules do people have about appropriate and inappropriate behavior? How do they learn those rules and pass them on to others?' Briody is a pioneer in a growing and influential field -- corporate anthropology. What began as an experiment in a handful of companies such as GM has become an explosion. In recent years, some of the biggest names in business have recruited highly trained anthropologists to understand their workers and customers better, and to help design products that better reflect emerging cultural trends. These companies are convinced that the tools of ethnographic research -- minute observation, subtle interviewing, systematic documentation -- can answer questions about organizations and markets that traditional research tools can't.

Kane, Kate A. Fast Company (1996). Articles>Usability>Methods>Contextual Inquiry

9.
#26643

Archiving Usability Reports

Most usability practitioners don't derive full value from their user tests because they don't systematically archive the reports. An intranet-based usability archive offers four substantial benefits.

Nielsen, Jakob. Alertbox (2005). Articles>Usability>Methods

10.
#26919

The Art of Usability Benchmarking

One common concern raised by managers and engineers alike is this: how easy to use is enough? This question, and the absence of an easy answer, is often the first defense people offer against investing in usability and ease of use. The smart usability engineer or designer has at least one response: the usability benchmark. By capturing the current ease of use of a product or website, you create a reference point that can be measured against in the future. It doesn't answer the question of how usable is enough, but if the benchmark is done properly, it does enable you to set goals and expectations around ease of use for the future.

Berkun, Scott. ScottBerkun.com (2006). Articles>Usability>Methods

11.
#37884

Asking Questions About Internet Behavior

What are we to do if we really need, during usability testing, to get some sort of handle on Internet experience? Perhaps for comparison across usability test sessions or for measuring progress in some way?

Jarrett, Caroline. UXmatters (2011). Articles>Usability>Testing>Methods

12.
#21012

Avoiding Bias from the Survivor Effect

Only a few of the survey sites we analyzed in 2000 are still around. We can safely assume that the surviving sites are not a random sample of the original group, but rather that significant differences exist between the sites that made it and those that died. Survival might be due partly to luck, but it is mainly a result of good management and an understanding of Internet fundamentals. Thus, the surviving sites are likely to be disproportionately clued-in about what it takes to run an online business.

Nielsen, Jakob. Alertbox (2002). Articles>Usability>Methods>Web Design

13.
#29296

Balancing the 5Es: Usability   (PDF)

Just what do we mean by usability? Before we can set out to achieve it, we need to understand what it is we are trying to achieve. It's not enough to declare that, from here on, our software will be more user-friendly or that we will now be customer-focused.

Quesenbery, Whitney. Cutter IT Journal (2004). Articles>Usability>Methods

14.
#20928

Being User-Centered When Implementing a UCD Process

For those who are interested in usability, whether long-time advocates or newcomers to the field, this is a good time to introduce a user-centered design process.

Quesenbery, Whitney. WQusability (2001). Articles>User Centered Design>Methods>Usability

15.
#34460

The Benefits of Viewing User Tests

The benefits of user testing have long been established. It is still important, however, to try to maximise these benefits. One way to do this is to view the user tests yourself.

Frontend Infocentre (2009). Articles>Usability>Testing>Methods

16.
#27596

The Best of Eyetrack III: What We Saw When We Looked Through Their Eyes

In Eyetrack III, we observed 46 people for one hour as their eyes followed mock news websites and real multimedia content. In this article we'll provide an overview of what we observed.

Outing, Steve and Laura Ruel. Eyetrack III. Articles>Usability>Methods>Eye Tracking

17.
#26091

Beyond the Focus Group

Focus groups are popular amongst marketing professionals for good reason. They are relatively quick to organise and the feedback is instantaneous. Views can be gathered from people with a wide range of backgrounds. When focus groups go well, the data can be extremely useful in identifying profitable design routes. Plus, any technique that gets companies closer to their customers can't be all bad.

System Concepts (2005). Articles>Usability>Methods>Focus Groups

18.
#19290

Beyond Usability Testing

Usability testing is a powerful tool for identifying problems and issues that users may have with a website or software application. But for all its benefits, traditional testing does not necessarily give a complete picture of how effective a site or application is in terms of meeting business goals.

Farrell, Tom. Frontend Infocentre (2001). Articles>Usability>Testing>Methods

19.
#28642

Brainstorming

Brainstorming is an individual or group process for generating alternative ideas or solutions for a specific topic. Good brainstorming focuses on the quantity and creativity of ideas: the quality of ideas is much less important than the sheer quantity. After ideas are generated, they are often grouped into categories and prioritized for subsequent research or application.

Usability Body of Knowledge (2007). Articles>Usability>Methods>Collaboration

20.
#28355

Bring Your Personas to Life!

Method acting can take your personas from the page to the stage. Think beyond traditional practice to give emotional life to your personas.

Fugaz, Zef. Boxes and Arrows (2006). Articles>Usability>Methods>Personas

21.
#21274

Bringing Your Personas to Life in Real Life

The way you communicate the personas and present your deliverables is key to ensuring consistency of vision. Without that consistency, you'll spend far too much time arguing with your colleagues about who your users are rather than how to meet their needs.

Freydenson, Elan. Boxes and Arrows (2002). Articles>Usability>Methods>Personas

22.
#25931

Building Effective Customer Surveys

Well-designed customer surveys can yield valuable information for your business. Unfortunately, though, a poorly worded survey can set you marching off in exactly the wrong direction. Below are some tips on designing surveys to get reliable, useful data.

Bennaco (2005). Articles>Usability>Methods>Surveys

23.
#19916

Building Usability in from the Beginning: Analyzing Users and Their Tasks   (PDF)

In this interactive session, attendees will practice their skills in interviewing users, creating task scenarios from the users’ perspective, and turning the task scenarios into designs for information products.

Hackos, JoAnn T. and Janice C. 'Ginny' Redish. STC Proceedings (1996). Articles>Usability>Methods

24.
#36494

Card Games for Information Architects

This article reviews six simple but powerful research techniques you can use to improve the information architecture of your product or web site. None of these activities requires a computer. You simply need a bunch of cards, a participant and a desk.

Travis, David. UserFocus (2010). Articles>Usability>Methods>Card Sorting

25.
#33137

Card Sorting

This is a method for discovering the latent structure in an unsorted list of statements or ideas. The investigator writes each statement on a small index card and asks six or more informants to sort these cards into groups or clusters, working on their own. The results of the individual sorts are then combined and, if necessary, analysed statistically; a minimal sketch of this combination step appears after this entry.

UsabilityNet (2006). Articles>Usability>Methods>Card Sorting
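Several of the entries above describe combining individual card sorts into a single analysable result (see entries 6 and 25). As a rough illustration only, not taken from either cited article, the following Python sketch counts how often each pair of cards is grouped together across a few hypothetical informants' sorts; the card names, group labels, and data are invented for the example.

from itertools import combinations
from collections import defaultdict

# Hypothetical sorts from three informants: each maps a group label
# (chosen by the informant) to the cards placed in that group.
sorts = [
    {"Navigation": ["Home", "Search", "Site map"], "Account": ["Log in", "Profile"]},
    {"Finding things": ["Search", "Site map"], "Personal": ["Log in", "Profile", "Home"]},
    {"Getting around": ["Home", "Search"], "My stuff": ["Log in", "Profile"], "Misc": ["Site map"]},
]

# Count, for every pair of cards, how many informants grouped them together.
co_occurrence = defaultdict(int)
for informant_sort in sorts:
    for group in informant_sort.values():
        for a, b in combinations(sorted(group), 2):
            co_occurrence[(a, b)] += 1

# Report the most-agreed-upon pairs first.
for (a, b), count in sorted(co_occurrence.items(), key=lambda kv: -kv[1]):
    print(f"{a} + {b}: grouped together by {count} of {len(sorts)} informants")

Pairs with high counts are candidates for the same category in the final structure; with more informants, the same counts can feed a similarity matrix for cluster analysis or a spreadsheet template like the one described in entry 6.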

 
