A directory of resources in the field of technical communication.

Assessment

488 found. Page 1 of 20.


1.
#31559

Accountability and Return-On-Investment

Once viewed more as art than science, marketers are increasingly interested in measuring performance. Like it or not, there is a new wave of accountability in the world of marketing, and if you're not prepared, you could get swept under it. Companies are becoming increasingly concerned with ensuring that all activities are profitable. As a result, each dollar invested in marketing is being challenged to demonstrate bottom line performance. New forms of marketing, escalating ad costs and tools that purport to measure marketing effectiveness have all contributed to the pressure traditional media is facing to "prove its worth."

Watrall, Rick. Communication World Bulletin (2003). Articles>Business Communication>Marketing>Assessment

2.
#24673

Accountable Assessment in the Age of Digital Labor   (peer-reviewed)

Entrepreneurship is THE economic mode of the digital age, and entrepreneurship is defined by risk. Students who will become workers must be comfortable with, even engaged by, risk-taking.

Glaros, Michelle. Kairos (2001). Articles>Education>Assessment>Online

3.
#29449

The Achilles Heel of Product Design Competitions and the Fair Judging Solution

I have judged a fair number of national and international product design competitions (five in the past three years alone) and each has made the same procedural mistake: products are assembled and categorized, judging criteria are devised, reputable judges are assembled, and yet we judges never see or touch the products in person. Instead, we receive a set of written documents describing each product, its intended function, and its design process. Imagine an art contest conducted by email and you get the gist of what's going on out there.

Buttiglieri, Rich. Usability Professionals Association (2007). Design>Usability>Assessment

4.
#32438

Acid Redux

I fully acknowledge that a whole lot of very clever thinking went into the construction of Acid3 (as was true of Acid2), and that a lot of very smart people have worked very hard to pass it. Congratulations all around, really. I just can’t help feeling like some broader and more important point has been missed.

Meyer, Eric. MeyerWeb (2008). Articles>Web Design>Standards>Assessment

5.
#19129

Adding Value as a Professional Technical Communicator   (peer-reviewed)   (members only)

Value added means generating a return greater than the cost of the initial investment.

Redish, Janice C. 'Ginny'. Technical Communication Online (1995). Articles>TC>Assessment
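To make the definition above concrete, here is a minimal sketch (an illustration, not taken from Redish's article) of the standard return-on-investment arithmetic: an activity "adds value" when its return exceeds its cost, i.e. when ROI is positive. The dollar figures are hypothetical.

```python
# Hypothetical illustration of the value-added idea: ROI expressed
# as the net gain divided by the initial cost.
def roi(total_return: float, cost: float) -> float:
    """Return on investment as a fraction of the initial cost."""
    if cost <= 0:
        raise ValueError("cost must be positive")
    return (total_return - cost) / cost

# A $40,000 documentation project that saves $100,000 in support calls:
print(f"{roi(100_000, 40_000):.0%}")  # 150%
```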

6.
#37592

Aligning UX Issues’ Levels of Severity with Business Objectives

Over the past several years, I’ve grown increasingly dissatisfied with the vague and somewhat solipsistic nature of the gradations UX professionals typically use to describe the severity of usability issues. High, medium, and low don’t begin to sufficiently explain the potential brand and business impacts usability issues can have.

Sherman, Paul J. UXmatters (2010). Articles>User Experience>Assessment>Business Case

7.
#37621

Aligning UX Issues’ Levels of Severity with Business Objectives

Over the past several years, I’ve grown increasingly dissatisfied with the vague and somewhat solipsistic nature of the gradations UX professionals typically use to describe the severity of usability issues. High, medium, and low don’t begin to sufficiently explain the potential brand and business impacts usability issues can have. After incrementally iterating on several existing classifications of severity, I finally decided in late 2008 to simply create some new ones, which I’ll present in this column. For lack of a better term, I call them business-aligned usability ratings.

Sherman, Paul J. UXmatters (2010). Articles>User Experience>Usability>Assessment

8.
#31410

Alternative Ways to Measure the Effectiveness of Your Publications

If you want to go beyond the usual limits of a traditional readership survey that tells you how well received a publication is, first clarify your objectives. Then you might include additional "impact" questions on your next survey, conduct in-depth focus groups with readers, and conduct some objective, "audience-free" measurements of the publication to see how well those objectives were met.

Sinickas, Angela D. Sinickas Communications (1998). Articles>Management>Communication>Assessment

9.
#35451

Analysis of Team Design Review

Every other team meeting, three team members get 30 minutes each to talk about projects they are working on, and they get to demonstrate some of the cool things they are integrating into the project. As a team, we look at the project and both learn from what they’ve done, and make suggestions on how they might improve the project.

Pehrson, Paul. Technically Speaking (2009). Articles>Collaboration>Graphic Design>Assessment

10.
#37982

Analysis of Web Content Delivered to a Mobile Computing Environment

The purpose of my research was to analyze web content delivered to the mobile computing environment with two goals in mind: first, to determine if the content lost contextual relevance in the mobile environment and, second, to see if a set of effective design principles could be applied to the mobile environment. My research combines a literature review with a survey encompassing both quantitative and qualitative methods to analyze top-rated web sites in the mobile environment.

Perreault, Anthony. Xchanges (2009). Articles>Web Design>Mobile>Assessment

11.
#13745

Analyze Your Web Site Traffic

If you run a Web site, you're probably already thinking about tracking and analyzing the traffic it gets. Knowing how many pages are accessed, when, by whom, and for what purposes can mean the difference between simply having a Web site and building a sound Web strategy. Understanding how people use your site can help you--and your sales and marketing team--generate more traffic. If you can track your audience, learn which pages and resources are most popular, and identify technical problems and system bottlenecks, you can deliver a better experience. And that's the best way to keep people coming back to your site.

Aviram, Mariva H. Builder.com (1998). Articles>Usability>Assessment

12.
#14276

Analyzing an Organizational Web Site  (link broken)   (PDF)

The Web is still so new that there is very little consensus about what an organizational Web page should be and what purpose(s) it should serve. You will start this exercise by examining some organizational Web sites (preferably organizations in your field). You will develop criteria by which to judge organizational sites, and then use those criteria to evaluate a single Web site, with the site’s creator as your audience. Your criteria will doubtless include elements like the elegance of the design and should certainly include the navigational system and other Web page practicalities. They should also include the fundamentals that are important in all technical documents: suitability to purpose(s) and audience(s), content, organization, and tone.

Burnett, Rebecca E. Thomson (2001). Academic>Course Materials>Web Design>Assessment

13.
#37161

Anatomy 201: Type Measurements

When discussing or working with type, it’s not only important to understand the anatomy of the parts of letterforms, but it’s also important to understand how type is measured. We’re accustomed to measuring things in inches, yards, or miles, or, heaven forbid, the metric system. Type, on the other hand, has its own system of measurement of which most of us have only a vague understanding. For example, most of us understand that normal body text is set between 10 and 12 points, and 72 points is much too large for everyday use. Few of us, however, know what a point really is.

Opsteegh, Michael. Putting Your Best Font Forward (2009). Articles>Typography>Assessment
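For readers wondering what a point really is: in digital typography (the desktop-publishing point used by PostScript and most modern software) a point is defined as exactly 1/72 inch. The following sketch, not from Opsteegh's article, shows the conversions that definition implies:

```python
# The desktop-publishing (PostScript) point is exactly 1/72 inch.
POINTS_PER_INCH = 72
MM_PER_INCH = 25.4

def points_to_inches(points: float) -> float:
    """Convert a type size in points to inches."""
    return points / POINTS_PER_INCH

def points_to_mm(points: float) -> float:
    """Convert a type size in points to millimetres."""
    return points * MM_PER_INCH / POINTS_PER_INCH

# 72-point type is nominally one inch; 12-point body text is 1/6 inch.
print(points_to_inches(72))        # 1.0
print(round(points_to_mm(12), 2))  # 4.23
```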

14.
#13524

Annual Awards for Contributions to the Field of Technical Communications   (peer-reviewed)

The ACM SIGDOC Executive Board welcomes letters of nomination for the SIGDOC Rigo and Diana Awards. The Rigo Award celebrates an individual's lifetime contribution to the field of information design, technical communication, and online information design; the Diana Award celebrates the contribution of an organization, institution, or business.

ACM SIGDOC. Academic>Research>Assessment

15.
#19011

Usability as a Success Criterion

Normally you work in a familiar context on your local computer, whether that is Windows, KDE, RedHat, Mac OS X, or another platform, but when we open the door to the Internet these routines are disrupted by something that is not always easy to put a finger on. What is it that makes web solutions hard to work with and to navigate? The first time you sit down at a computer it is usually with a goal: curiosity, getting on the Internet to shop, writing a letter, and many other things. It is usually this drive that carries us through the first laborious weeks with the operating system, which we slowly come to understand and eventually grow familiar with, since it is the platform that gives us access to all those digital experiences. If you cannot work on the platform, you will certainly not be able to achieve your goals with it either.

Orgaard Larsen, Thomas. Quark, The (2002). (Danish) Design>Web Design>Assessment

16.
#24866

Appraising Technical Communicators   (PDF)

Appraisals based on objective performance criteria identify and measure the abilities and contributions of technical communicators. This workshop explores how to develop effective performance criteria, specific to technical communication, and how to use these criteria to evaluate performance and foster professional growth and development.

Gilbert, Catherine E. and Sharon A. Gambaro. STC Proceedings (1994). Careers>TC>Assessment

17.
#11774

Are Organizations Doing Enough to Improve Customer Satisfaction?

Time-to-market pressure can diminish product testing time and quality. The results are product recalls, shoddy merchandise, and apologies by CEOs about poor quality. The consequence is the loss of consumer confidence. Don’t these companies realise that there’s no compromise on quality? I’m sure that these companies are ISO 9000 certified or have a Total Quality Management (TQM) program, so what is the problem? Perhaps the problem is not with ISO 9000 or TQM but with the way it is used.

Dick, David J. and Shelby Rosiak. Usability Interface (2000). Articles>Usability>Assessment>ISO 9000

18.
#37073

Are You Designing or Inspecting?

Guidelines are statements of direction. They’re about looking to the future and what you want to incorporate in the design. Guidelines are aspirational. Heuristics challenge a design with questions. The purpose of heuristics is to provide a way to “test” a design in the absence of data and primary observation by making an inspection. Heuristics are about enforcement. Both guidelines and heuristics are typically broad and interpretable. They’re built to apply to nearly any interface. But they come into play at different points in a design project.

Chisnell, Dana E. UX Magazine (2010). Articles>User Experience>Assessment

19.
#31404

Are You Spending the "Right" Amount?   (PDF)

To back up a request for more budget or defend the existing one, you need to know exactly what you’re spending--and what you’re getting in return. But how can you tell if you’re spending too much on communication? This article suggests five approaches to weighing the cost versus value of your communication activities.

Sinickas, Angela D. Sinickas Communications (2006). Articles>Management>Financial>Assessment

20.
#32843

Assessing Assessments: The Inequality of Electronic Testing

Computer- and Internet-based tests are used for a variety of purposes. From entering education or employment, to improving basic learning, people everywhere are taking electronically formatted tests. With the advancement of testing from traditional paper-based tests to technologically advanced electronic tests, people reap the benefits of easier access to tests, faster response times, and greater reliability and validity of tests. However, persons with disabilities are being left out of the picture and out of many typically administered tests.

Lyman, Michael, Cyndi Rowland and Paul Bohman. WebAIM (2006). Articles>Web Design>Accessibility>Assessment

21.
#27720

Assessing Community Informatics: A Review of Methodological Approaches for Evaluating Community Networks and Community Technology Centers  (link broken)   (PDF)

This paper analyzes emerging community informatics evaluation literature to develop an understanding of indicators used to gauge project impacts in community networks and technology centers.

O'Neil, Dara. Georgia Institute of Technology (2002). Articles>Communication>Community Building>Assessment

22.
#36050

Assessing Excellence: Using Activity Theory to Understand Assessment Practices in Engineering Communication   (peer-reviewed)   (members only)

In the workplace, communication serves not as an end in itself, with features that are “good” or “bad,” but as a tool for mediating a range of professional activities, and effective documents and presentations are those that achieve their goals. Yet assessment methods in technical and professional communication often continue to rely on an evaluation of features apart from the intended work of the document. In this paper, we use activity theory as a lens to explore both the criteria for effective communication and the degree to which portfolio assessment methods can be applied to effectively assess student learning in this domain.

Paretti, Marie C. and Christine Bala Burgoyne. IEEE PCS (2009). Articles>Scientific Communication>Engineering>Assessment

23.
#19083

Assessing Existing Engineering Communication Programs: Lessons Learned from a Pilot Study   (peer-reviewed)

Increased support for greater accountability and assessment of engineering communication programs has led many schools of engineering and technology to initiate methods of assessing the quality of their students’ engineering communication abilities. In my institution, I have spearheaded the pilot year of such a program, and, as anticipated, have learned several valuable lessons that may be of interest to others developing assessment procedures for engineering communication programs.

Rush Hovde, Marjorie. CPTSC Proceedings (2000). Academic>Education>Engineering>Assessment

24.
#14288

Assessing Proficiency in Engineering English   (PDF)   (peer-reviewed)   (members only)

Though engineers around the world conduct their work in nearly every language on the planet, there are very few who never use English for some aspect of their job. The largest professional engineering organizations use English as their primary language; most of the world’s engineering publications are written in English; and nearly all cooperative ventures with multinational participation choose English for their common language of communication. Unfortunately, most of the world’s engineers are not native speakers of English and thus are considerably disadvantaged in professional terms.

Orr, Thomas. IEEE Transactions on Professional Communication (2002). Articles>Language>Assessment

25.
#30144

Assessing Publications Process-Maturity: The Experiences of Two Organizations at Different Levels of Process Maturity  (link broken)   (PDF)

As Information Development organizations grow and mature, their organizational structure should grow and mature as well. The optimal structure for an organization in its early stages should focus on achieving stability and repeatable quality. As an organization matures, the optimal structure may need to be significantly different to develop a more thorough understanding of customers and contribute substantially to customer satisfaction.

Hackos, JoAnn T., Lisa Blaschke, Brenda MacKay and Deborah J. Rosenquist. STC Proceedings (1997). Articles>Information Design>Assessment>Case Studies

 