Wednesday, October 13, 2010

20101013 - Hu, ...Quantitative Usability Requirements Spec & Usability Evaluation..

A full life-cycle methodology for structured use-centered quantitative usability requirements specification and usability evaluation of websites

by Hu, Guoqiang, Ph.D., Auburn University, 2009, 201 pages; AAT 3386203



My Interest:

1) QUEST – Hu's usability evaluation method.

2) Expert usability review.

3) User usability testing.

4) SUS – System Usability Scale.

5) Usability metrics (usability criteria) of QUEST.


Action:

To search for his dissertation or journal article. Want to read more.



Background


The World Wide Web has gained a dominant status in online information and service delivery in recent years. But how to specify website usability requirements, and how to evaluate and improve website usability against that requirements specification, are still big issues for all stakeholders.


Research Goal


To help solve this problem, we propose a website usability requirements specification and usability evaluation methodology that features a structured use-centered quantitative full life-cycle method.


Methodology


A validation experiment was designed and conducted to demonstrate the validity of the proposed methodology, QUEST (Quantitative Usability Equations SeT). Its principle is to show that QUEST has stronger website usability evaluation capability than the most typical existing usability evaluation methods. Clearly, if QUEST's website usability evaluation capability is established, then its usability metrics can be used to quantitatively specify upfront user usability requirements for websites.


In the validation experiment, 7 usability experts and 20 student subjects were recruited to perform 4 tasks on 2 open source calendar websites, WebCalendar 1.0.5 and VCalendar 1.5.3.1. Four sets of usability data were collected, corresponding to the following 4 usability evaluation methods respectively: expert usability review, traditional user usability testing, SUS (System Usability Scale), and QUEST.


Comments: He should also compare against other popular usability evaluation methods such as heuristic evaluation and cognitive walkthrough, as well as against more popular usability questionnaires such as QUIS and SUMI. I note that one good method, user usability testing, was compared. Overall, though, he has not done a FAIR comparison.


Results Discussion


According to the experiment results: both the expert usability review and the traditional user usability testing were inconclusive on which of the 2 target websites had better usability; SUS rated the overall usability of WebCalendar 1.0.5 at 66.00 and VCalendar 1.5.3.1 at 61.75, but it was subjective and vague about the usability problems; in contrast, QUEST not only rated the overall usability of WebCalendar 1.0.5 at 56.59 and VCalendar 1.5.3.1 at 35.97, but also revealed quantitatively where the usability problems were and how severe each one was.
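As a refresher on where SUS scores like 66.00 come from, here is a minimal sketch of the standard SUS scoring formula (10 items on a 1–5 Likert scale; odd items are positively worded, even items negatively worded, and the summed contributions are scaled to 0–100). The response values in the example are made up purely for illustration:

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100) from 10 item
    responses, each on a 1-5 Likert scale.

    Odd-numbered items are positively worded: contribution = response - 1.
    Even-numbered items are negatively worded: contribution = 5 - response.
    The total contribution (0-40) is multiplied by 2.5 to give 0-100.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Hypothetical answers from a single respondent (illustrative only):
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 3, 2]))  # 77.5
```

A study-level SUS score such as those above is simply the mean of the per-respondent scores.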


Comments: After reading Hu's abstract, an idea came to me. Initially, I had thought of selecting the type/hybrid of UEM based on a literature review and comparison. Benchmarking on Hu's approach, I could instead run a study comparing various UEMs and select the UEM type/hybrid based on the results of that study.


Conclusion


In conclusion, it can clearly be stated that QUEST has stronger website usability evaluation capability than the 3 other most typical existing usability evaluation methods. Thus, the proposed methodology has been validated by the experiment results.



Note: Neither a preview nor the full-text dissertation is available for download.

1 comment:

  1. Guoqiang Hu and Kai H. Chang.
    A Methodology for Structured Use-centered Quantitative Full-life-cycle Usability Requirements Specification & Usability Evaluation of Web Sites.

    :)
    Managed to search for and download this journal article...which I think is based on Hu's PhD dissertation.
