Saturday, September 5, 2009

Sep 5 - Howarth, Supporting Novice Usability Practitioners with Usability Engineering Tools (part 1)




Supporting Novice Usability Practitioners with Usability Engineering Tools.
Jonathan Randall Howarth.
Dissertation submitted to the faculty of the Virginia Polytechnic Institute and State University in partial fulfillment of the requirements for the degree of
Doctor of Philosophy in Computer Science and Applications.

April 13, 2007
Blacksburg, Virginia


Abstract
The usability of an application often plays an important role in determining its success. Accordingly, organizations that develop software have realized the need to integrate usability engineering into their development lifecycles. Although usability practitioners have successfully applied usability engineering processes to increase the usability of user-interaction designs, the literature suggests that usability practitioners experience a number of difficulties that negatively impact their effectiveness. These difficulties include identifying and recording critical usability data, understanding and relating usability data, and communicating usability information. These difficulties are particularly pronounced for novice usability practitioners.
With this dissertation, I explored approaches to address these difficulties through tool support for novice usability practitioners. Through an analysis of features provided by existing tools with respect to documented difficulties, I determined a set of desirable tool features including usability problem instance records, usability problem diagnosis, and a structured process for combining and associating usability problem data. I developed a usability engineering tool, the Data Collection, Analysis, and Reporting Tool (DCART), which contains these desirable tool features, and used it as a platform for studies of how these desirable features address the documented difficulties.
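To make the idea of a usability problem instance record concrete, here is a minimal sketch of what such a record and its grouping into diagnosed problems might look like. The field names and classes are my own illustration, not DCART's actual schema:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProblemInstance:
    """One observed occurrence of a usability problem (illustrative fields)."""
    participant_id: str
    task: str
    timestamp_s: float   # seconds into the session recording
    observation: str     # what the evaluator saw the participant do
    severity: int        # e.g., 1 (cosmetic) to 4 (catastrophic)

@dataclass
class UsabilityProblem:
    """A diagnosed problem that combines and associates related instances."""
    description: str
    instances: List[ProblemInstance] = field(default_factory=list)

    def add_instance(self, inst: ProblemInstance) -> None:
        self.instances.append(inst)

# Combining instances across participants under one problem record
problem = UsabilityProblem("Search button label is unclear")
problem.add_instance(ProblemInstance("P1", "Find a product", 132.0,
                                     "Clicked 'Go' expecting a help page", 3))
problem.add_instance(ProblemInstance("P2", "Find a product", 98.5,
                                     "Hesitated over 'Go' for several seconds", 2))
print(len(problem.instances))  # 2
```

The point of the structure is the one the abstract makes: raw observations are recorded as instances first, then diagnosed and merged into problem descriptions, rather than jumping straight to problem lists.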
The results of the studies suggest that appropriate tool support can improve the effectiveness with which novice usability practitioners perform usability evaluations. More specifically, tool support for usability problem instance records helped novice usability practitioners more reliably identify and better describe instances of usability problems experienced by participants. Additionally, tool support for a structured process for combining and associating usability data helped novice usability practitioners create usability evaluation reports that were of higher quality as rated by usability practitioners and developers.

The results highlight key contributions of this dissertation, showing how tools can support usability practitioners. They demonstrate the value of a structured process for transforming raw usability data into usability information based on usability problem instances. Additionally, they show that appropriate tool support is a mechanism for further integrating usability engineering into the overall software development lifecycle; tool support addresses the documented need for more usability practitioners by helping novices perform more like experts.


INTRODUCTION

Motivation

Butler [1996] states that "usability has become a competitive necessity for the . . . success of software" (p. 59). Because of the growing awareness of its importance, organizations that produce software products have been expending resources for “doing usability” – building enviable usability laboratories, buying usability equipment, training developers in usability engineering (UE) methods [Hix & Hartson, 1993a], and conducting usability testing.
These investments have helped to make UE an important part of the overall software development lifecycle.
...usability practitioners experience a number of difficulties that negatively impact the effectiveness with which they are able to work, which in turn impacts the effectiveness of the UE process within which they work. These difficulties are particularly pronounced for novice usability practitioners.

Formative Usability Evaluations

During the usability evaluation sub-process, usability practitioners primarily conduct formative usability evaluations. As described in a Usability Professionals’ Association workshop report [Theofanos et al., 2005], formative usability evaluations are conducted to “guide the improvement in design of future iterations” (p. 3). Usability practitioners conduct formative usability evaluations to understand the strengths and weaknesses of a given interaction design. During formative evaluations, usability practitioners collect a variety of qualitative data, such as verbal protocol and subjective ratings, with the goal of producing UP descriptions.

Summative usability studies represent a different type of usability evaluation that is typically performed after a product is released. Summative usability studies provide evidence, in the form of statistical significance, that a given interaction design is better than other designs in specific ways. Usability practitioners may collect quantitative data, such as measures of time on task and error counts, that they later use in metrics.
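As a minimal illustration of how such quantitative data become metrics, the following sketch computes mean time on task and an error rate from hypothetical per-participant measurements (all numbers are invented):

```python
from statistics import mean

# Hypothetical per-participant measurements from a summative study
time_on_task_s = [45.2, 61.0, 52.3, 48.9]   # task completion time, seconds
errors = [0, 2, 1, 0]                        # error count per participant

mean_time = mean(time_on_task_s)
error_rate = sum(errors) / len(errors)

print(f"mean time on task: {mean_time:.1f} s")
print(f"error rate: {error_rate:.2f} errors/participant")
```

Statistical significance would then come from comparing such metrics across designs (e.g., with a t-test), which is beyond this short sketch.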

Summative evaluations certainly have their value, but formative evaluations are the focus of this dissertation because of the emphasis on usability practitioners’ abilities to understand and critique the usability of an interaction design. Improving these abilities will help to increase the effectiveness with which they work in the usability evaluation sub-process.

Usability Engineering Tool Support

The focus of this dissertation is the use of software tools to support usability practitioners during formative usability evaluations. Appropriate tool support can provide a number of benefits including helping usability practitioners collect and analyze usability data and report usability problems in a structured and efficient manner.

Usability Practitioner Skill

The literature suggests that skill plays an important part in usability evaluation.
For example, a study by Nielsen [1992] found that usability specialists were better than non-specialists at using heuristic evaluation to evaluate an interface.
Also, in a study comparing the iterative development of designs by human factors specialists and programmers, Bailey [1993] concludes that “the training and background of designers can have a large effect on user interface design” (p. 204).

The focus of this dissertation is the use of usability engineering tools to support novice usability practitioners. I chose novice usability practitioners as the target audience because they can benefit most from appropriate tool support.
Experts typically have developed methods and strategies that work for them; although they may benefit from tool support, they do not require it.
Novice practitioners, on the other hand, may fail to recognize important usability data or may interpret data incorrectly without the guidance and support that a usability engineering tool can provide.

Scope

...This dissertation is limited to the usability evaluation sub-process of the overall UE process. Within the usability evaluation sub-process, the focus is on formative usability evaluations that have the goal of producing usability evaluation reports. Within the context of formative usability evaluations, the focus is on tool support for UE, in particular the effectiveness of such tool support for novice usability practitioners.

Research Goals, Research Questions, and Approach

Table 1 shows the research goals, research questions, and steps of the approach. Table 2 maps steps to mechanisms and principal outputs.
My Comments: Howarth's Research Goals, Research Questions, and Approach are systematically written. The steps in his Approach are laid out clearly, and a Mechanism and Principal Output (deliverable) are stated for each Step.

Table 1: Goals, questions, steps of the approach mapped to phases

Table 2: Research mechanisms, outputs, and completion dates by phase
