Wednesday, October 13, 2010

20101013 - Ballard, Web site usability...educational web sites

Web site usability: A case study of student perceptions of educational web sites

by Ballard, Joyce Kimberly, Ph.D., University of Minnesota, 2010, 408 pages; AAT 3408366



My Interest:

1) Nielsen's Heuristics.

2) Xerox Heuristic Evaluation Checklist.

3) Computer User Self-Efficacy Scale (CUSE).

4) Quality of Web-based Instruction.

5) Usability Testing – think-aloud protocol.

6) Usability Testing – eye tracking.

7) Usability Testing – time-error log.

8) Methodology of data analysis.

9) Theoretical framework.

10) Validity & reliability.


Action:

To read the Dissertation in future.



Research Goal


The purpose of this research study was to understand the construct of usability from the perspective of 74 students enrolled in six online courses offered by one online and distance learning program at a large, public university in the Midwest.


Methodology


Six courses, designed and developed by two different groups, professional and nonprofessional developers, were selected. The study used both quantitative and qualitative measures to record the experiences of students enrolled in the six online courses.


First, the courses were evaluated using Nielsen's (1994, 2000, 2002) heuristics as operationalized by the Xerox Heuristic Evaluation Checklist (1995) as a standard measure of usability, then rank-ordered by heuristic evaluation score.


Comments: I would love to see this Xerox Heuristic Evaluation Checklist. I suppose it is a UET (usability evaluation tool) for the UEM (usability evaluation method) of heuristic evaluation. Would love to know WHAT usability criteria (heuristics) are used in this Checklist.
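For my own reference, the rank-ordering step described above can be sketched in a few lines of Python. The course names and scores below are invented for illustration; the abstract does not publish the actual checklist data.

```python
# Hypothetical heuristic evaluation scores: fraction of checklist
# items passed per course (invented numbers, for illustration only).
heuristic_scores = {
    "Course A": 0.82,
    "Course B": 0.64,
    "Course C": 0.91,
}

# Rank-order the courses from highest to lowest evaluation score.
ranked = sorted(heuristic_scores, key=heuristic_scores.get, reverse=True)
print(ranked)  # → ['Course C', 'Course A', 'Course B']
```

The same ranking could then be compared against the usability ranking implied by students' error rates, which is exactly the comparison the study makes later.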


Eachus and Cassidy's (2006) Computer User Self-Efficacy Scale (CUSE) was used as a pre-course survey to measure students' computer self-efficacy before they began their online course.


Comments: Would like to see the details of the Computer User Self-Efficacy Scale (CUSE).


Stewart, Hong, and Strudler's (2004) Quality of Web-based Instruction was used as a post-course survey to measure student satisfaction with their online course experience.


A subset of 29 students participated in usability testing sessions in the usability lab. A think-aloud protocol provided qualitative data in the form of verbal reports; eye-tracking recordings provided data that confirmed the think-aloud data; and a time-error log recorded time-to-complete-task and error-rate data as students completed seven typical tasks required to participate successfully in an online course. A summary debriefing interview with each student recorded any additional comments and any recommendations for improving the courses.
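The time-error log boils down to two simple metrics per session. A minimal sketch, on an invented log (the task names and numbers below are hypothetical; the study's actual seven tasks are not listed in the abstract):

```python
# Hypothetical time-error log: (task, seconds to complete, error count).
log = [
    ("log in",           35, 0),
    ("open syllabus",    48, 1),
    ("find assignment", 120, 3),
    ("submit file",      95, 2),
]

total_time = sum(seconds for _, seconds, _ in log)   # time to complete tasks
total_errors = sum(errors for _, _, errors in log)
error_rate = total_errors / len(log)                 # errors per task

print(total_time, total_errors, error_rate)  # → 298 6 1.5
```

Averaging these per-student values by course gives the error-rate scores the study uses to rank course usability.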


Qualitative data were examined for themes and a coding scheme was created. This coding scheme, which captured issues specific to educational web sites, was compared against Nielsen's (1994, 2000, 2002) heuristics to evaluate whether those heuristics, widely accepted as the standard for the design and development of business and commercial web sites, also apply to educational web sites.


Comments: I want to review her guidelines for educational web sites. These can be correlated with learning-related usability criteria.


Design and development guidelines for educational web sites were written by the researcher based on the study findings. These guidelines were mapped to Nielsen's heuristics as operationalized by the Xerox Heuristic Evaluation Checklist (1995).


Results Discussion


The results of the quantitative and qualitative measures used were analyzed by course and course development type. The most significant results of this study came from the analysis of the variables according to course development type.


The findings show that course development type, professional or nonprofessional, was related to usability as measured by students' error rates, Nielsen's heuristic evaluation scores, and student satisfaction scores. The professionally developed courses were found to be significantly higher in usability than the non-professionally developed courses on all three measures: task error rate, Nielsen's heuristic evaluation score, and student satisfaction score.


The analysis of students' verbal reports yielded three times as many positive comments for the professionally developed courses as for the non-professionally developed courses.


The results of the quantitative and qualitative measures were also analyzed by course. When the course was used as the unit of analysis, the findings differed.


The rank-order of courses was mixed between course types when compared by error rates. Nielsen's heuristic evaluation scores, as measures of usability for educational web sites, were not consistent with students' judgments of course usability as measured by error rates: there was no relationship between the usability ranking of courses by Nielsen's heuristics and usability as judged by students' error rates.


However, an analysis of students' verbal reports identified 52 common themes and confirmed the importance of Nielsen's heuristics in educational course design.


Comments: She has found that Nielsen's heuristics are important and relevant for educational course design and educational websites. This is good support for my literature review.


The correlation between the self-efficacy score and mean error rate was small, very close to zero, and nonsignificant. There was a small positive correlation between student satisfaction and usability as measured by error rates.


Conclusion


Based on the analysis of the study variables according to course development type, this study found that Nielsen's usability heuristics, a respected evaluation tool used primarily to measure the usability of commercial web sites, can also be used to evaluate instructional web sites and to differentiate between levels of usability in the same way that students judge usability.



Dissertation Table of Contents (excerpt)

Chapter Two 14

Review of the Literature 14

Theoretical Framework 15

Usability 17

Defining Usability 18

Usability and Web-based Instruction 19

Web-based Instruction and Instructional Design 21

Instructional Design 23

Defining Instructional Design 24

The Need to Evaluate Design and Quality 26

Instructional Design Differences 29

Usability and Aesthetics 32

Usability Guidelines for Interface Design 34

Summary 35


Chapter Three 37

Research Method 37

Introduction 37

Research Questions and Data Collection 38

Rationale of the Methodology 41

Xerox Heuristics Evaluation: A System Checklist 42

Rationale 42

Validity 44

Reliability 48

Method 48

Survey One Computer User Self-efficacy Scale (CUSE) 50

Rationale 50

Validity 52

Reliability 52

Survey Two Student Evaluation of the Quality of Web-Based Instruction 52

Rationale 52

Validity 54

Reliability 54

Institutional Context 55

Institution 55

College 55

Online Distance Education Program 56

Course Management Systems 56

Course Selection 57

Student Selection/Recruitment 60

IRB Approval 61

Recruitment Procedures 61

Data Collection Procedures 62

General Collection Procedures 63

Volunteers for Usability Testing 64

Follow-up Attempts 65

Quantitative Data Collection 65

Timing of Instrument Use 65

Qualitative Data Collection 66

Timing of Usability Testing 66

Qualitative Data from the Usability Testing Sessions 67

Usability Testing 67

Usability Testing Site 67

Pre-determined Tasks 69

Think-aloud Protocols 69

Time Log, Video, Audio, & Eye-tracking Data Collection 72

Time Log Data 72

Video, Audio, and Eye-tracking Data 73

Error Log 75

Summary Interview 77

Post-course Student Evaluation of the Quality of Web-Based Instruction 77

Limitations of the Study 77

Summary 78


Chapter Four 81

Research Findings 81

Research Questions 83

Summary 114

