Position Paper for CHI 96 Basic Research Symposium (April 13-14, 1996, Vancouver, BC)

The Overnight Staff: An Approach to Teaching Short Courses on User Interface Design and Evaluation

Joseph A. Konstan, Susan Herbst, Doug Perrin and Zbigniew Wieckowski
Department of Computer Science
University of Minnesota
Minneapolis, MN 55455
{konstan,herbst,dperrin,wieckows}@cs.umn.edu
+1 612 625-4002

ABSTRACT

User interface design is an iterative, hands-on practice. While it is possible to model that practice in a quarter- or semester-long university course, it is much more difficult to do so in an industrial short course limited to three days. We introduce the "overnight staff" approach as a way to introduce the benefits of hands-on iteration into a time-limited course structure. The key feature of this approach is a staff of implementers who take student designs each evening and return with task-focused prototypes the next morning. These prototypes are used for interface evaluation and give students a strong sense of interface design iteration.

KEYWORDS: Education, User Interface Design and Evaluation, Short Courses, Industrial Education

BACKGROUND

During the summer of 1995, at the request of local industry, the first author designed and taught a three-day hands-on short course on user interface design and evaluation. One of the greatest challenges in designing this course was crafting active learning experiences [1,3] that would replace the term-long project used in quarter-length classes. In particular, the goals of this course included:

Once course content was decided, the remaining significant issue was how to structure the design project. The students were not all programmers, and most had only passing familiarity with application development tools, and therefore having students actually implement prototype interfaces was ruled out (it would have been a waste of time anyway, as most available tools require too much training). Instead, we decided to use paper prototypes. In particular, students would learn about usability in general and task analysis in particular during the first day, and then produce a prototype at the end of the day. The second day, they would learn about and conduct three usability evaluations and revise their prototypes.

Our earlier experiences using paper prototypes for interface evaluation in introductory courses were uneven. While students found many usability problems, they often attributed those problems to fidelity limitations of the paper interface, and did not fully appreciate the usability feedback. To address this problem, we created the "overnight staff."

THE OVERNIGHT STAFF

The overnight staff, which consisted of the second through fourth authors, was responsible for turning paper prototypes into task-focused executable prototypes. Each group of 3-4 students was assigned a member of the staff. Specifically, the staff's responsibilities included:

The overnight staff was under strict instructions to avoid assisting with any aspect of the interface design. They assumed the role of programmers who could (and did) implement whatever was designed. In addition to the interface itself, the staff would mock up just enough supporting functionality to carry out the required tasks. For example, one group's interface required a function that would map from a phone number to a time zone. The staff implemented that function only for the specific phone numbers needed for the identified tasks.
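This task-focused stubbing can be sketched in a few lines. The following is a hypothetical illustration (the staff actually worked in Tcl/Tk, and the phone numbers and zone names here are invented, not taken from the course): the "lookup" covers only the numbers that appear in the scripted usability tasks, and deliberately fails for anything else.

```python
# Hypothetical sketch of task-focused stubbing: rather than a real
# phone-number-to-time-zone lookup, the stub handles only the numbers
# that the scripted evaluation tasks exercise.
TASK_NUMBERS = {
    "+1 612 555 0100": "US Central",     # invented example data
    "+44 171 555 0100": "UK",            # invented example data
}

def time_zone_for(phone_number):
    """Map a phone number to a time zone -- but only for the handful
    of numbers needed by the identified tasks."""
    try:
        return TASK_NUMBERS[phone_number]
    except KeyError:
        # Any number outside the task script is simply unsupported.
        raise NotImplementedError(
            "prototype stub: number not used in the scripted tasks")
```

The design choice is the point: a stub like this costs minutes to write, yet it lets user testing proceed exactly as if the feature were real, as long as the test tasks stay on script.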

PUTTING IT ALL TOGETHER

With the overnight staff in place, the structure of the class was complete. The first day would lead up to the design of a world clock (students took turns performing task analysis and serving as its users). Students were given approximately one hour after task analysis was completed for the design and a meeting with their staff member. The only tools provided were "white board sheets" and markers.

The next morning, students were given a few minutes to explore their creations, and then they moved on to interface evaluation. They conducted small versions of cognitive walkthrough [5] and heuristic analyses (using Nielsen and Molich's heuristics [4]) and then conducted user tests using both new subjects (in this case, a different member of the overnight staff) and the original users from the task analysis exercise. At each step, the groups made lists of usability problems and suggestions for improvements. Finally, they took the three lists and spent another hour refining the interface and meeting with their staff member.

The final morning, students spent some time seeing the two versions of each prototype side-by-side. This activity led to a discussion about the types of problems detected by each usability evaluation technique. The rest of the third day focused on other topics, but the projects were used as examples whenever possible.

RESULTS

What Worked Well?

The use of the overnight staff worked extremely well overall. Every single student filling out the course evaluation form indicated that the overnight implementation was one of the most important features of the course. In particular, comments from students indicated that they very much valued seeing the two prototypes side-by-side and that they now understood the value of iterating through design (if only modeled as a single iteration). Furthermore, many students commented that this model reflected their actual work environment (though their own programmers were neither as efficient nor as flexible).

A welcome, if unintended, benefit from this method is the quality of training that it provided to the overnight staff itself. The three graduate students recruited as staff gained an unforgettable lesson in the dynamics of rapid prototyping, demanding designers, and programming with little and less sleep.

What Didn't?

The biggest problems we faced came from the complete design freedom that we gave to the students in the class. The students had different experiences with graphical applications (ranging from Macintosh to Windows to X and beyond) and they drew from a collection of interface widgets far vaster than any individual toolkit or framework could provide. The staff, which was well-trained in Tcl/Tk, mocked up many of these interfaces using available tools, but the creation of new widgets took nearly half of their programming time. In future offerings, we will provide a list of available interface objects (perhaps with cut-out pictures) to constrain the design. This should allow us to serve twice as many students and should also increase the realism by modeling design that is constrained to standard toolkits.

The other problem was the amount of time it took for the overnight staff to implement the prototypes. In retrospect, much time would have been saved had the overnight staff used modern prototyping tools rather than a GUI toolkit. GUI toolkits were used because our graduate students learn them in their UI courses and are not taught other prototyping tools. This experience points to the need to introduce our students to more sophisticated prototyping tools in our regular courses.

WHERE TO GO NEXT

In the months since this initial use of the overnight staff concept, we have thought about several extensions and refinements:

However we proceed, we expect to continue to use active, hands-on learning techniques supported by rapid prototyping staff.

ACKNOWLEDGEMENTS

We are indebted to Larry Rowe at U.C. Berkeley for the world clock design exercise.

REFERENCES

1. Johnson, D.W., Johnson, R.T., and Smith, K.A. Active Learning: Cooperation in the College Classroom. Interaction Book: Edina, MN, 1991.

2. Lewis, C. and Rieman, J. Task-Centered User Interface Design (1994). ftp://ftp.cs.colorado.edu/pub/cs/distribs/clewis/HCI-Design-Book/

3. Meyers, C. and Jones, T.B. Promoting Active Learning: Strategies for the College Classroom. Jossey-Bass: San Francisco, 1993.

4. Nielsen, J. and Molich, R. "Heuristic Evaluation of User Interfaces." Proc. CHI '90. ACM: New York, 1990.

5. Wharton, C., et al. "The Cognitive Walkthrough: A Practitioner's Guide." in Nielsen, J. and Mack, R.L. (eds), Usability Inspection Methods, John Wiley & Sons: New York, 1994.