Purdue Usability Testing Questionnaire (PUTQ)
Please rate the usability of the system.
Try to respond to all the items.
For items that are not applicable, use: NA
Likert scale: 1 = bad, 2, 3, 4, 5, 6, 7 = good, NA = not applicable
1. COMPATIBILITY
1. Is the cursor control compatible with cursor movement?
2. Are the results of control entry compatible with user expectations?
3. Is the control matched to user skill?
4. Is the coding compatible with familiar conventions?
5. Is the wording familiar?
2. CONSISTENCY
6. Is the assignment of colour codes conventional?
7. Is the coding consistent across displays and menu options?
8. Is the cursor placement consistent?
9. Is the display format consistent?
10. Is the feedback consistent?
11. Is the format within data fields consistent?
12. Is the label format consistent?
13. Is the label location consistent?
14. Is the labelling itself consistent?
15. Is the display orientation consistent (e.g., panning vs. scrolling)?
16. Are the user actions required consistent?
17. Is the wording consistent across displays?
18. Is the data display consistent with entry requirements?
19. Is the data display consistent with user conventions?
20. Are symbols for graphic data standard?
21. Is the option wording consistent with command language?
22. Is the wording consistent with user guidance?
3. FLEXIBILITY
23. Does it allow bypassing menu selection with command entry?
24. Does it have direct manipulation capability?
25. Is the design for data entry flexible?
26. Can the display be flexibly controlled by the user?
27. Does it provide flexible sequence control?
28. Does it provide flexible user guidance?
29. Are the menu options dependent on context?
30. Can users name displays and elements according to their needs?
31. Does it provide good training for different users?
32. Are users allowed to customize windows?
33. Can users assign command names?
34. Does it provide user selection of data for display?
35. Does it handle user-specified windows?
36. Does it provide zooming for display expansion?
4. LEARNABILITY
37. Does it provide clarity of wording?
38. Is the data grouping reasonable for easy learning?
39. Is the command language layered?
40. Is the grouping of menu options logical?
41. Is the ordering of menu options logical?
42. Are the command names meaningful?
43. Does it provide no-penalty learning?
5. MINIMAL ACTION
44. Does it provide combined entry of related data?
45. Will the required data be entered only once?
46. Does it provide default values?
47. Is the shifting among windows easy?
48. Does it provide function keys for frequent control entries?
49. Does it provide global search and replace capability?
50. Is menu selection by pointing the primary means of sequence control?
51. Is menu selection by keyed entry a secondary means of control entry?
52. Does it require minimal cursor positioning?
53. Does it require minimal steps in sequential menu selection?
54. Does it require minimal user control actions?
55. Does returning to higher-level menus require only one simple key action?
56. Does returning to the general menu require only one simple key action?
6. MINIMAL MEMORY LOAD
57. How are abbreviations and acronyms used?
58. Does it provide aids for entering hierarchic data?
59. Is the guidance information always available?
60. Does it provide hierarchic menus for sequential selection?
61. Are selected data highlighted?
62. Does it provide an index of commands?
63. Does it provide an index of data?
64. Does it indicate current position in menu structure?
65. Are data items kept short?
66. Are the letter codes for menu selection designed carefully?
67. Are long data items partitioned?
68. Are prior answers recapitulated?
69. Are upper and lower case equivalent?
70. Does it use short codes rather than long ones?
71. Does it provide supplementary verbal labels for icons?
7. PERCEPTUAL LIMITATION
72. Does it provide coding by data category?
73. Is the abbreviation distinctive?
74. Is the cursor distinctive?
75. Are display elements distinctive?
76. Is the format for user guidance distinctive?
77. Do the commands have distinctive meanings?
78. Is the spelling distinctive for commands?
79. Does it provide easily distinguished colours?
80. Is the active window indicated?
81. Are items paired for direct comparison?
82. Is the number of spoken messages limited?
83. Does it provide lists for related items?
84. Are menus distinct from other displayed information?
85. Is the colour coding redundant?
86. Does it provide visually distinctive data fields?
87. Are groups of information demarcated?
88. Is the screen density reasonable?
8. USER GUIDANCE
89. System feedback: how helpful are the error messages?
90. Does it provide a CANCEL option?
91. Are erroneous entries displayed?
92. Does it provide explicit entry of corrections?
93. Does it provide feedback for control entries?
94. Is HELP provided?
95. Is completion of processing indicated?
96. Are repeated errors indicated?
97. Are error messages non-disruptive/informative?
98. Does it provide a RESTART option?
99. Does it provide UNDO to reverse control actions?
100. Is the sequence control user initiated?
List the most negative aspect(s):
1 ......................................................
2 ......................................................
3 ......................................................
List the most positive aspect(s):
1 ......................................................
2 ......................................................
3 ......................................................
Purdue Usability Testing Questionnaire
Based on: Lin, H.X., Choong, Y.-Y., and Salvendy, G. (1997). A Proposed Index of Usability: A Method for Comparing the Relative Usability of Different Software Systems. Behaviour & Information Technology, 16(4/5), 267-278.
(The ratings of importance are not included in this online version, but could be incorporated into the comments.)
Source: http://hcibib.org/perlman/question.cgi?form=PUTQ
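Note: the form above gives the 1-7/NA rating scheme but no aggregation procedure, and the weighted index of usability from Lin et al. (1997) is not reproduced in this online version. Purely as a minimal sketch, assuming an unweighted mean of the non-NA ratings within each of the eight categories (the category ranges follow the item numbering above; the dictionary and function names are illustrative, not part of the PUTQ), responses could be tabulated like this:

# Hypothetical tabulation sketch: simple, unweighted category means.
# Assumes responses are recorded as {item_number: rating}, where a rating is
# an int from 1 to 7 or the string "NA". This is NOT the weighted index
# proposed by Lin et al. (1997).

CATEGORIES = {
    "Compatibility": range(1, 6),
    "Consistency": range(6, 23),
    "Flexibility": range(23, 37),
    "Learnability": range(37, 44),
    "Minimal action": range(44, 57),
    "Minimal memory load": range(57, 72),
    "Perceptual limitation": range(72, 89),
    "User guidance": range(89, 101),
}

def category_means(responses):
    """Return the mean 1-7 rating per category, ignoring NA items."""
    means = {}
    for name, items in CATEGORIES.items():
        ratings = [responses[i] for i in items
                   if isinstance(responses.get(i), int)]
        means[name] = sum(ratings) / len(ratings) if ratings else None
    return means

# Example: rate every item 5, except item 3, which is marked NA.
example = {i: 5 for i in range(1, 101)}
example[3] = "NA"
print(category_means(example))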