Validating the User-Centered Hybrid Assessment Tool (User-CHAT): A comparative usability evaluation, by Elgin, Peter D., Ph.D., Kansas State University, 2007, 168 pages.

My Interest: (1) User-CHAT = UET. (2) Hybrid/combination of user testing, heuristic evaluation, and cognitive walkthrough. (3) Effectiveness score. (4) Thoroughness, Validity.

Action: To read specific parts of the dissertation in future.

The present study had two principal objectives: (1) to validate a hybrid usability technique that identifies important and ignores inconsequential usability problems, and (2) to provide empirical performance data for several usability protocols on a variety of contemporary comparative metrics. The User-Centered Hybrid Assessment Tool (User-CHAT) was developed to maximize efficient diagnosis of usability issues from a behaviorally-based perspective while minimizing the time and resource limitations typically associated with usability assessment environments. Several characteristics of user testing, the heuristic evaluation, and the cognitive walkthrough were combined to create the User-CHAT.

Comments: The User-CHAT is a UET. The UEM used is a hybrid of user testing, heuristic evaluation, and cognitive walkthrough. This may be a good reference/benchmark for my research.

Two techniques generated comparison lists of usability problems: user-testing data, and various inclusion criteria for usability problems identified by the User-CHAT, heuristic evaluation, and cognitive walkthrough. Overall, the results demonstrated that the User-CHAT attained higher effectiveness scores than the heuristic evaluation and cognitive walkthrough, suggesting that it helped evaluators identify many usability problems that actually impact users (i.e., higher thoroughness) while attenuating time and effort spent on unimportant issues (i.e., higher validity).

Comments: I want to read about how he scored and analysed effectiveness, thoroughness, and validity.
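Until I read the dissertation's Method chapter, a reasonable guess is that these metrics follow the standard set-based definitions from the UEM comparison literature (Hartson, Andre, and Williges, 2001): thoroughness is the fraction of real problems a method finds, validity is the fraction of the method's reported problems that are real, and effectiveness is their product. A minimal sketch, assuming those definitions and hypothetical problem IDs (Elgin's exact scoring may differ):

```python
def uem_metrics(found, real):
    """Compute thoroughness, validity, and effectiveness for a UEM.

    found: set of problem IDs the method reported.
    real:  set of problem IDs confirmed as real (e.g., via user testing).
    """
    found, real = set(found), set(real)
    hits = found & real  # real problems the method actually caught
    thoroughness = len(hits) / len(real) if real else 0.0
    validity = len(hits) / len(found) if found else 0.0
    effectiveness = thoroughness * validity
    return thoroughness, validity, effectiveness

# Hypothetical example: the method reports p1..p8, while user testing
# confirms p3..p12 as real, so 6 of the 8 reports are hits.
found = {f"p{i}" for i in range(1, 9)}
real = {f"p{i}" for i in range(3, 13)}
t, v, e = uem_metrics(found, real)
# t = 6/10 = 0.60, v = 6/8 = 0.75, e = 0.45
```

On these definitions, a method can trade thoroughness against validity (finding everything vs. reporting only sure hits), which is presumably why effectiveness, their product, is the headline comparison number.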
Furthermore, the User-CHAT had the greatest proportion of usability problems that were rated as serious, i.e., usability issues that hinder performance and compromise safety.

Comments: How many evaluators? Who were the evaluators? How were usability problems rated: "serious", and what else?

The User-CHAT's performance suggests that it is an appropriate usability technique to implement in the product development lifecycle.

Table of contents:

Chapter 1 -- Introduction
  Certification Environment Constraints
  Certification and Usability
  Usability Evaluation Methods
    * User-testing
    * Cognitive Walkthrough
    * Heuristic Evaluation
  UEM Characteristics Incorporated into the User-CHAT
  Development of the User-CHAT
  Comparative Usability Evaluation
  UEM Performance Metrics
    * Thoroughness
    * Validity
    * Effectiveness
    * Reliability
    * Time per Usability Problem
  Hypotheses

Chapter 2 -- Method
  Participants
  The System & Benchmark Tasks
  Procedure
    * User-Testing
    * User-CHAT
    * Heuristic Evaluation
    * Cognitive Walkthrough
  Severity Ratings
  Summary of Procedure and Data Collection
  Data Synthesis
    * Classifying Usability Problems into Heuristics
    * Determining Unique or Shared Usability Problems

Chapter 3 -- Results
  Time Spent per Usability Problem
  Usability Problem Partitions
  Detection Rates
  Thoroughness, Validity, and Effectiveness
  Reliability
  Severity Ratings
  Heuristics Classifications
  Generating a Comparison List without User-testing
Tuesday, September 21, 2010
20100922 - Elgin, Validating the User-CHAT (UET)