University of Texas at Austin
School of Information
IT Lab Tutorial Web site
Final Usability Report
Kristin Davis
INF385G, School of Information
The University of Texas at Austin
December 8, 2005
iSchool Tutorial Web site Usability Report
Table of Contents

1. How this report is structured
2. Executive Summary
3. Introduction
   3.1. Study Purpose
   3.2. Study Methods and Context
   3.3. Study Summary
   3.4. User Profile
   3.5. What this study is NOT
4. Methodology
   4.1. End-user Test Methodology
   4.2. Online Survey Methodology
   4.3. Evaluation Measures
5. Results
   5.1. End-user Testing: Performance Data
   5.2. End-user Testing: Satisfaction Data
   5.3. End-user Testing: Usability Findings
   5.4. End-user Testing: User Comments
   5.5. Online Questionnaire: Result Highlights
6. Contact Information
7. Appendix A: End-user Testing – Consent Form
8. Appendix B: End-user Testing – Task Descriptions
9. Appendix C: End-user Testing – Posttest Questionnaire
10. Appendix D: Online Survey Consent Form and Questionnaire
11. Appendix E: End-user Testing – Performance Data
12. Appendix F: End-user Testing – Posttest Questionnaire Data
13. Appendix G: Online Survey Questionnaire Data
14. Appendix H: Tutorial Web site Home Page
15. Appendix I: Alternate Home Page Designs
1. How this report is structured 
The usability findings within this report are offered at three levels of detail. For the reader interested only in overall impressions, a one-page Executive Summary is offered on the next page. For the reader interested in summaries of all the findings, including the performance data (success rate, time on task), the satisfaction data (the questionnaire results), and, most importantly, the specific usability findings in Section 5.3, there is the body of the paper. Finally, for the reader interested in the finest detail, the full methods and the raw data are provided in the Appendices.
2. Executive Summary 
The University of Texas at Austin School of Information IT Lab agreed to work with usability consultant Kristin Davis to conduct a usability evaluation of its existing tutorial Web site. Working with IT Lab staff, the usability consultant designed an end-user test for November 2005. The test was carried out at the Information eXperience Lab of the University of Texas at Austin during the week of November 21, 2005. Six representative users were tested, with a median age range of 30-39. All six participants were female. An online survey was also conducted during the month of November 2005. The survey generated 50 responses from the School of Information student listserv.

Overall, the results are promising, but they also indicate that some redesign work is needed. The performance data suggest that users can carry out all tested tasks in a reasonable amount of time. The satisfaction data were positive. Participants generally thought the interface was functional but that some of the navigation terms were confusing. The most important data here are the specific usability problems unearthed. Only one Critical error was found. Section 5.3 details it, along with one Minor problem and recommended solutions. The findings that earned criticality ratings include:
• Could not find SnagIt tutorial
• Difficulty finding BevoWare tutorial
3. Introduction

3.1. Study Purpose
The purpose of this study was to test the usability of the Web site for the University of Texas at Austin, School of Information, IT Lab tutorials (not the lab or the tutorials themselves), using representative users. The goal of this study was to determine the usability of the current Web site and explore the primary question of whether graphics, text, or some combination of graphics and text is more usable. Secondary questions examined the effectiveness of current labeling, categorization, and color choices.
3.2. Study Methods and Context
In this study we employed an "end-user test" method, wherein individual representative users were tested, one at a time. Test participants came to a usability lab, were given some representative tasks to perform, and were observed and measured as they carried out these tasks. This was a "find-and-fix" test, aimed at identifying potential usability problems. Additionally, an online survey was administered, for which approximately 300 representative users were invited to participate. The participants were asked to explore the Web site, locate several different tutorials, and complete an online questionnaire.
3.3. Study Summary

3.3.1. End-user Testing
Six representative users were tested one at a time on the IT Lab tutorial Web site. We wished to collect performance data (time on task, error rates), satisfaction data (via a questionnaire), and, most importantly, particular areas of potentially poor usability. Through meetings with IT Lab staff, we agreed on a series of tasks to get at the following questions:
1. Participants begin on the Home page: Please locate the following tutorials: Photoshop, Dreamweaver, BevoWare, and SnagIt. Designers are curious to know how users locate tutorials within the current Web site and how the current organizational structure of the page impacts that process.
2. Participants view two alternative designs for the tutorial Web site: Please compare and comment on the different design alternatives. Designers wish to know which particular elements of the alternative Web site designs appeal to the users.
3.3.2. Online Questionnaire
An online survey was conducted to supplement the end-user usability testing. Satisfaction data were collected via a questionnaire, and comments were solicited regarding design to ascertain particular areas of potentially poor usability. Tasks and questions were modeled after the end-user test.
3.4. User Profile
The types of people who might visit the iSchool tutorial Web site fall into three categories: iSchool students (primary users), University of Texas at Austin students (secondary users), and 'everyone' (tertiary users). For the purposes of this test, iSchool students were selected as test participants.

3.5. What this study is NOT
This study is NOT a baseline test, with crisp, quantitative data like time-on-task, against which to compare future performance. Rather, it is a find-and-fix usability study, with the main goal of identifying possible usability problems.

4. Methodology

4.1. End-user Test Methodology

4.1.1. Participants
As can be seen in Figure 1, six participants were tested. All six of the participants were female. The median age range was 30-39 years old.
Figure 1: End-user Test Participants

Participant #  Gender  Age Range  How long in the iSchool?  FT or PT student?  Degree being pursued
1              F       18-29      1-2 years                 FT                 PhD
2              F       30-39      1-2 years                 FT                 MS
3              F       30-39      2-3 years                 PT                 MS
4              F       30-39      1-2 years                 FT                 MS
5              F       18-29      1-2 years                 FT                 MS
6              F       30-39      Less than 1 year          FT                 MS
4.1.2. Procedure
Test participants were welcomed into the lab at the University of Texas at Austin. They were seated at a computer workstation in the test room of the lab and observed by one test moderator, who stayed in the test room, interacted with the participant, and took notes. Test participants were shown the observation room, read and signed an informed consent form, and then listened and read along as the moderator read the instructions (Appendix A). If there were no questions, the moderator read the first task description as the participant read along, and then the participant attempted to carry out the task. The participants were then asked to look at two alternative designs and offer comments. At the completion of the last task, the participant completed the posttest questionnaire (Appendix C), was thanked, and was escorted out of the lab. The entire session took between 20 and 30 minutes.

4.1.3. Task Scenarios
The usability consultants worked with IT Lab staff to derive a set of tasks (Appendix B) that would be representative of an iSchool student user. The tasks included:
• Locate the Photoshop tutorial
• Locate the Dreamweaver tutorial
• Locate the BevoWare tutorial
• Locate the SnagIt tutorial
• View alternate designs and comment on them
4.2. Online Survey Methodology

4.2.1. Participants
As can be seen in Figure 2, 29-32 participants answered questions regarding demographics. 65.6% of the participants were female, and 45.2% of the participants fell in the age range of 18-29 years old.
Figure 2: Online Questionnaire Participants

Gender (32 respondents): Female: 65.6%; Male: 34.4%
Age range (31 respondents): 18 to 29: 45.2%; 30 to 39: 29.0%; 40 to 49: 16.1%; Over 50: 9.7%
How long in the iSchool? (29 respondents): Less than 1 year: 31.0%; 1 to 2 years: 41.4%; 2 to 3 years: 17.2%; Over 3 years: 10.3%
FT or PT student? (29 respondents): Full-time: 89.7%; Part-time: 10.3%
Degree being pursued? (29 respondents): MS: 82.8%; PhD: 17.2%
4.2.2. Procedure
Approximately 300 representative users were invited to participate in the survey. Participants read and agreed to an informed consent form, then engaged in an exploration period and completed a posttest questionnaire, which entailed questions regarding Internet usage, online experience with the tutorial Web site, and demographics (see Appendix D). The participants were thanked; the entire session took between 10 and 15 minutes.
4.2.3. Task Scenarios
The usability consultants worked with IT Lab staff to derive a set of tasks (Appendix B) that would be representative of an iSchool student user. The tasks included:
• Locate the Visio tutorial
• Locate the Screenshots with SnagIt 6 tutorial
• Locate the Mozilla Composer tutorial
4.3. Evaluation Measures
In the next section, "Results," we will offer summaries of performance data and summaries of the questionnaire responses. Also in the Results will be a prioritized list of identified usability problems and recommended fixes. The problems noted by the evaluator will be given a criticality rating per the following table. The higher the criticality rating, the more significant the problem is to the user's experience or ability to accomplish the task.
Figure 3: Criticality Ratings

Rating: 4 (Critical)
The identified issue is so severe that:
• Critical data may be lost
• The user may not be able to complete the task
• The user may not want to continue using the application

Rating: 3 (Major)
• Users can accomplish the task, but only with considerable frustration and/or performance of unnecessary steps by participants
• Non-critical data may be lost
• The user will have great difficulty in circumventing the problem
• Users can overcome the issue only after they have been shown how to perform the task
• Five or more instances of navigational error

Rating: 2 (Moderate)
• The user will be able to complete the task in most cases, but will undertake some moderate effort in getting around the problem
• The user may need to investigate several links or pathways through the system to determine which option will allow them to accomplish the intended task
• Users will most likely remember how to perform the task on subsequent encounters with the system
• Three or four instances of navigational error

Rating: 1 (Minor)
• An irritant
• A cosmetic problem
• A typographical error
• One or two instances of navigational error
5. Results
This section summarizes the data collected during the study sessions. It is important in usability studies to address both performance data and preference data, as we are interested not only in whether users CAN complete their tasks, but also in whether they like the Web site. Thus, this section consists of three parts:
• Performance of the participants while attempting the tasks (task completion percentages)
• Satisfaction of the participants as noted in their posttest questionnaire responses
• Particular usability findings, compiled from both participant comments and moderator observations and notes

Under the rubric of "constructive criticism," high-level recommendations are offered for all the identified usability problems. However, the IT Lab will always best know its own services (and certainly its possible resource allocations).

5.1. End-user Testing: Performance Data
The participants' success and time on task were recorded for each task. The number of errors was noted, as well as their level of criticality. Time on task was recorded to the nearest 15 seconds. Time on task is a valuable data point, particularly taken with the dialog between the test participant and the test moderator, which reveals the logic behind the participants' choices.
Figure 4: End-user Testing Performance Data

Task    Successful completion  Unsuccessful completion  Median Time on Task  Participants with Errors  Median Error Criticality
Task 1  6                      0                        15 seconds           0                         0
Task 2  6                      0                        15 seconds           0                         0
Task 3  6                      0                        15 seconds           2                         1
Task 4  5                      1                        30 seconds           1                         4
5.2. End-user Testing: Satisfaction Data
Participants answered a posttest questionnaire (Appendix C) after completion of the last task. The detailed answers are in Appendix F. Figure 5 summarizes the participants' responses to the posttest questionnaire.
Figure 5: End-user Testing Satisfaction Data

Have you ever used the tutorial Web site before today? Yes: 17%; No: 83%
Did the categories of items on the tutorial page make sense to you? Yes: 100%; No: 0%
How hard did you think it was to complete the tasks? Easy: 100%; Average: 0%; Difficult: 0%
How much time did it take to complete the tasks? A lot less than I expected: 0%; Less than I expected: 17%; About what I expected: 83%
Were you able to find all the information you expected to find? Yes: 100%; No: 0%
Based on your experience, how would you rate the organization of the tutorial Web site? Above average: 67%; Average: 33%; Below average: 0%
Would you consider visiting the tutorial Web site for your personal needs? Yes: 83%; No: 17%
Would you recommend the tutorial Web site to others? Yes: 100%; No: 0%
5.3. End-user Testing: Usability Findings
As this study was a "find-and-fix" usability study, these specific findings are the most important findings in the study. For usability studies such as this, it is important to key on the particular problems identified and to figure out how to address each. The following findings include a criticality rating as described in Section 4.3. The findings below are divided into Critical and Minor problems. Each has an associated recommendation.

5.3.1. Good things
The Web site contains a great deal of information and received positive satisfaction ratings from participants. One hundred percent of participants rated the test tasks as easy. Sixty-seven percent of the participants rated the organization of the tutorial Web site as above average.

5.3.2. Critical & Major Findings
There was only one problem that rated a criticality score of "critical" or "major." We believe that the Web site is functional as it is today. However, there are several moderate and minor findings that, if not addressed, will lead to user frustration.

5.3.3. Critical Findings
Finding Critical 1: Could not find SnagIt tutorial
Criticality: 4
Description: Participant could not find the SnagIt tutorial, even though the tutorial was on a page which was browsed during the search.
Recommendation: Rename tutorials so that the primary search term appears at the beginning of the tutorial title.

5.3.4. Minor Findings
Finding Minor 1: Difficulty finding BevoWare tutorial
Criticality: 1
Description: Participants expressed difficulty determining which category the tutorial would be classified under.
Recommendation: Rework category labels; consider card-sorting exercises.
5.4. End-user Testing: User Comments
• Confusion regarding order within categories: is it alphabetical, newest, most popular, random?
• Unclear category headings
• Brief descriptions under category headings might help