Nielsen's Heuristic Evaluation Methodology: Before the Ten Heuristic Principles
Part 7 of the blog series: Usability Inspection Criteria for e-Learning Portals.
Smith and Mosier (1986), as quoted by Nielsen and Molich (1990a), stated that “…collections of usability guidelines have on the order of one thousand rules to follow, however, and are therefore seen as intimidating by developers. Most people probably perform heuristic evaluation on the basis of their own intuition and common sense instead.”
Rolf Molich and Jakob Nielsen therefore developed a smaller set of heuristic guidelines for practical heuristic evaluation.
The Nine Usability Heuristics were:
· Simple and natural dialogue
· Speak the user’s language
· Minimize user memory load
· Be consistent
· Provide feedback
· Provide clearly marked exits
· Provide shortcuts
· Good error messages
· Prevent errors. (Molich & Nielsen, 1990; Nielsen & Molich, 1990a)
This list of nine heuristic principles was developed during Nielsen’s and Molich’s many years of teaching and consulting experience in the field of usability engineering. (Nielsen & Molich, 1989; Nielsen & Molich, 1990a)
Nielsen (1994a) described a study of 249 usability problems detected in 11 earlier projects. Of the 11 projects, 7 were heuristic evaluation projects and 4 were user testing projects. He compiled 101 usability heuristics and rated how well each of them explained each of the 249 usability problems. He then performed a factor analysis and selected the 7 most significant factors.
Table 2.2: 101 Usability Heuristics. Taken from Nielsen (1994a), Enhancing the Explanatory Power of Usability Heuristics.
Based on Factor Analysis,
· 7 most significant factors were selected.
· All 7 usability heuristics (see Table 2.3) are sorted and ranked according to their significance (% value).
· Using these 7 usability heuristics, a single evaluator should be able to identify about 30.4% of all usability problems (see Figure 2.1). In my opinion, Nielsen may have felt that 30.4% was not sufficient.
Table 2.3: Ranking of the 7 most significant Factors (result of Factor Analysis). Taken from: Nielsen (1994a), Enhancing the Explanatory Power of Usability Heuristics.
Figure 2.1: Pareto Chart showing the 7 most significant Factors (resulting from Factor Analysis). Chart drawn using data from: Nielsen (1994a), Enhancing the Explanatory Power of Usability Heuristics.
Molich and Nielsen (1990) became well known for their Nine Heuristics for heuristic evaluation. Nielsen (1994b) added Factor 8 and Factor 9, which were “Aesthetic and minimalist design” and “Help users recognise, diagnose and recover from errors.” Hence, the revised Nine Heuristics were established.
With these Nine Heuristics, a single evaluator could identify about 34.4% of all usability problems.
Figure 2.2: Pareto Chart showing the 10 most significant Factors (resulting from Factor Analysis). Chart drawn using data from: Nielsen (1994a), Enhancing the Explanatory Power of Usability Heuristics.
Another usability heuristic that was deemed important, “Help and documentation”, was subsequently added as the 10th heuristic. (Nielsen, 1994b; Nielsen, 2005a) With these Ten Heuristics, a single evaluator could identify about 36.4% of all usability problems (see Figure 2.2).
Because this potential yield is rather low (only 36.4%), it is important to use multiple evaluators, each performing the heuristic evaluation individually.
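The benefit of adding evaluators can be sketched with the simple model from Nielsen and Landauer (1993a), in which the proportion of problems found by n independent evaluators is 1 − (1 − p)^n, where p is the proportion a single evaluator finds. The sketch below plugs in the 36.4% single-evaluator figure quoted above; the independence of evaluators is an assumption of the model, not a guarantee.

```python
# Nielsen & Landauer (1993a) model: proportion of usability problems
# found by n independent evaluators, each finding proportion p alone.
# p = 0.364 is the single-evaluator figure for the Ten Heuristics
# quoted in the text above; real evaluators overlap imperfectly, so
# treat the output as an estimate, not a measurement.

def proportion_found(n: int, p: float = 0.364) -> float:
    """Expected proportion of all usability problems found by n evaluators."""
    return 1 - (1 - p) ** n

for n in (1, 3, 5):
    print(f"{n} evaluator(s): {proportion_found(n):.1%}")
```

With these assumptions, three to five independent evaluators already push the expected coverage well past what any single evaluator achieves, which is the rationale for Nielsen's recommendation to use several evaluators working separately.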
Nielsen's Heuristic Evaluation Methodology: Ten Heuristic Principles
The Ten Usability Heuristics for heuristic evaluation are:
“1 Visibility of system status
The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.
2 Match between system and the real world
The system should speak the users' language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. Follow real-world conventions, making information appear in a natural and logical order.
3 User control and freedom
Users often choose system functions by mistake and will need a clearly marked "emergency exit" to leave the unwanted state without having to go through an extended dialogue. Support undo and redo.
4 Consistency and standards
Users should not have to wonder whether different words, situations, or actions mean the same thing. Follow platform conventions.
5 Error prevention
Even better than good error messages is a careful design which prevents a problem from occurring in the first place. Either eliminate error-prone conditions or check for them and present users with a confirmation option before they commit to the action.
6 Recognition rather than recall
Minimize the user's memory load by making objects, actions, and options visible. The user should not have to remember information from one part of the dialogue to another. Instructions for use of the system should be visible or easily retrievable whenever appropriate.
7 Flexibility and efficiency of use
Accelerators -- unseen by the novice user -- may often speed up the interaction for the expert user such that the system can cater to both inexperienced and experienced users. Allow users to tailor frequent actions.
8 Aesthetic and minimalist design
Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.
9 Help users recognize, diagnose, and recover from errors
Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.
10 Help and documentation
Even though it is better if the system can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.” (Nielsen, 1994b; Nielsen, 2005a)
source:
Usability Inspection Criteria for e-Learning Portals.
TeckChong Yeap.
MDP7515 PROJECT
Dissertation submitted in partial fulfilment of the requirement for the degree of
Master of Multimedia (E-Learning Technologies)
MULTIMEDIA UNIVERSITY
MALAYSIA
October 2008.
References:
Molich, R., and Nielsen, J. (1990). Improving a Human-Computer Dialogue: What designers know about traditional interface design. Communications of the ACM. (March 1990).
Nielsen, J., and Molich, R. (1989). Teaching User Interface Design based on Usability Engineering. ACM SIGCHI Bulletin 21, 1 (July 1989), 45-48.
Nielsen, J., and Molich, R. (1990a). Heuristic Evaluation of User Interfaces. Proceedings ACM CHI'90 Conf. (Seattle, WA, 1 -5 April), 249-256.
Nielsen, J. (1992a). Finding Usability Problems Through Heuristic Evaluation. Proceedings ACM CHI'92 Conference (Monterey, CA, May 3 -7), 373-380.
Nielsen, J. (1992b). The Usability Engineering Life Cycle. IEEE Computer 25, 3 (March 1992), 12-22. Retrieved Jan 1, 2006 from http://web.njit.edu/~jerry/CIS-677/Articles/Nielsen-IEEE-1992.pdf
Nielsen, J. and Landauer, T.K. (1993a). A Mathematical Model of the Finding of Usability Problems. Proceeding ACM INTERCHI’93 (April 24-29, 1993), 206-213.
Nielsen, J. and Phillips, V.L. (1993b). Estimating the Relative Usability of Two Interfaces: Heuristic, Formal, and Empirical Methods Compared. Proceeding ACM INTERCHI’93 (April 24-29, 1993), 214-221.
Nielsen, J. (1994a). Enhancing the Explanatory Power of Usability Heuristics. Proceeding ACM CHI'94 Conf. (Boston, MA, April 24-28), 152-158.
Nielsen, J. (1994b). Heuristic Evaluation. In Nielsen, J. and Mack, R.L. (eds), Usability Inspection Methods. John Wiley & Sons, New York, NY.
Nielsen, J. and Mack, R.L. (eds)(1994). Usability Inspection Methods. John Wiley & Sons, New York, NY.
Nielsen, J. (1995a). Technology Transfer of Heuristic Evaluation and Usability Inspection. IFIP INTERACT'95 International Conference on Human-Computer Interaction (Lillehammer, Norway, June 27, 1995). Retrieved Dec 23, 2005 from http://www.useit.com/papers/heuristic/learning_inspection.html
Nielsen, J. (1995b). Usability Inspection Methods. Proceeding ACM CHI ’95. Retrieved Dec 23, 2005 from http://sigchi.org/chi95/Electronic/documnts/tutors/jn_bdy.htm
Nielsen, J. (2000a). Why You Need To Test With 5 Users. Jakob Nielsen’s Alertbox, March 19, 2000. Retrieved Dec 23, 2005 from http://www.useit.com/alertbox/20000319.html
Nielsen, J. (2003). Usability 101: Introduction to Usability. Jakob Nielsen’s Alertbox, Aug 25, 2003. Retrieved Jan 1, 2006 from http://www.useit.com/alertbox/20030825.html
Nielsen, J. (2005a). Ten Usability Heuristics. Retrieved Dec 23, 2005 from http://www.useit.com/papers/heuristic/heuristic_list.html
Nielsen, J. (2005b). How to Conduct a Heuristic Evaluation. Retrieved Dec 23, 2005 from http://www.useit.com/papers/heuristic/heuristic_evaluation.html
Nielsen, J. (2005c). Characteristics of Usability Problems Found by Heuristic Evaluation. Retrieved Jan 1, 2006 from http://www.useit.com/papers/heuristic/usability_problems.html
Nielsen, J. (2005d). Severity Ratings for Usability Problems. Retrieved Dec 23, 2005 from http://www.useit.com/papers/heuristic/severityrating.html
Nielsen, J. (2005e). Summary of Usability Inspection Methods. Retrieved Dec 23, 2005 from http://www.useit.com/papers/heuristic/inspection_summary.html
Smith, S.L., and Mosier, J.N. (1986). Guidelines for Designing User Interface Software. Report ESD-TR-86-278, The MITRE Corporation, Bedford, MA.
Sunday, September 6, 2009