Position Paper for CHI 96 Basic Research Symposium (April 13-14, 1996, Vancouver, BC)

Toward Understanding Human-Computer Interaction


James H. Hicinbothom, Lead Scientist, CHI Systems, Inc.

One of the most poorly understood, and least used, approaches to understanding human-computer interaction (HCI) is detailed analysis of the substance of HCI in the context of the common ground shared by the human and the computer. That substance consists of the human's computer-readable actions (e.g., keystrokes or other manipulations of interaction devices) and the computer's human-sensible actions (e.g., changes in display contents perceptible to the human visual system, audible signals, or other indications of computer activity). The context in which these actions take place is, of course, the common ground of shared (or at least somewhat 'in-synch') models of each other and of the wider world in which the interaction occurs. Yet the details of this complex interaction are rarely studied in complex real-world situations: keystroke-level analysis is seen as simply too difficult and unwieldy by many researchers and practitioners within the CHI community. The principal focus of my twelve-year research career in HCI has been on better understanding HCI through analysis and modeling of the gory details of human-computer interactions in real-time, multi-tasking domains involving complex real-world problems. Ongoing research on several projects focuses specifically on near-real-time automated analysis of HCI, via keystroke analysis and user modeling, to enable semi-automated diagnosis in embedded training situations. In addition, research on user-model-based Dynamic Interface Agents monitors the lowest-level details of HCI to develop interpretations of what the human operator is trying to do, and then provides appropriate, context-dependent aid to that operator.

To overcome the problems associated with collecting and analyzing keystroke-level data, I led an effort to design and develop a prototype toolset and environment for automated capture of keystroke-level data from a wide variety of applications. The Instrumented Interface Construction (IICON) Evaluator enables collection, management, replay, analysis, and annotation of keystroke-level data recorded from an instrumented interface via embedded IICON Data Taps. Currently, IICON Evaluator exists as a working prototype on Sun workstations under SunOS 4.1.x, using the X Window System (X11R5) and the OSF/Motif widget set and window manager (mwm). Developed for the US Army Research Laboratory's Human Research and Engineering Directorate, IICON Evaluator is intended to serve both as a research environment and as a practical industrial tool for usability evaluations and user interface design studies. Figure 1 depicts an overview of the functional architecture of IICON Evaluator.
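To make the data-capture idea concrete, the following is a minimal, hypothetical sketch of what a keystroke-level event record and a replayable transcript log might look like. All class, field, and method names here are illustrative assumptions for exposition, not the actual IICON Data Tap format.

```python
from dataclasses import dataclass
import time

# Hypothetical record of one keystroke-level event, covering both the
# human's computer-readable actions and the computer's human-sensible
# actions, as an instrumented interface might emit them.
@dataclass
class InteractionEvent:
    timestamp: float  # seconds since session start
    actor: str        # "human" or "computer"
    device: str       # e.g., "keyboard", "mouse", "display"
    action: str       # e.g., "keypress", "click", "redraw"
    detail: str       # e.g., the key pressed or the widget updated

class EventLog:
    """Accumulates interaction events into a replayable transcript."""

    def __init__(self):
        self.events = []

    def record(self, actor, device, action, detail, t=None):
        # Stamp the event with the supplied time, or the current time.
        stamp = t if t is not None else time.time()
        self.events.append(InteractionEvent(stamp, actor, device, action, detail))

    def replay(self):
        # Return events in chronological order for analysis or annotation.
        return sorted(self.events, key=lambda e: e.timestamp)
```

The key design point such a log illustrates is that capture is uniform and application-independent: any instrumented interface writes the same flat event stream, and all higher-level analysis is performed afterward over the transcript.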

Understanding HCI, of course, requires far more than just large quantities of keystroke-level data! However, I contend that cognitive task analyses and the like that are not based on, and traceable back to, keystroke-level transcripts of recorded human-computer interaction are suspect. Each level of data and analysis is vital for understanding a different aspect of HCI, and each supports and enhances the understanding achieved at the other levels. Building up from keystroke-level data (i.e., details of how actions are performed) to function-level data (i.e., actions, or steps) is extremely important for both pragmatic and fundamental reasons, including understanding the hows of observed HCI. Similarly, building on function-level data to create method-level data (i.e., procedures) is essential to understanding much of HCI, including the whats and strategies of observed HCI. Furthermore, the next step up, to iterative task-level data addressing users' goals and cognitive tasks, is essential to understanding the whys of HCI. Only by integrating all levels of analysis, with the data readily available to trace back through, can we hope to truly understand observed human-computer interaction.
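The bottom-up aggregation described above can be sketched in miniature. This is an assumed, simplified example of one such step, from keystroke-level data to function-level data, where a "function" is taken to be a run of keypresses terminated by Enter; real segmentation rules would of course be application-specific and far richer.

```python
def functions_from_keystrokes(keystrokes):
    """Group keystroke-level events into function-level actions.

    Illustrative sketch only: each function-level action is the string
    typed before an "Enter" keypress. Traceability is preserved because
    every function-level action is built directly from, and can be
    mapped back onto, the underlying keystrokes.
    """
    actions, buffer = [], []
    for key in keystrokes:
        if key == "Enter":
            actions.append("".join(buffer))
            buffer = []
        else:
            buffer.append(key)
    if buffer:  # trailing input never terminated by Enter
        actions.append("".join(buffer))
    return actions

# Example: two typed commands become two function-level actions.
# functions_from_keystrokes(["l", "s", "Enter", "c", "d", "Enter"])
# → ["ls", "cd"]
```

Analogous grouping steps would then combine function-level actions into method-level procedures, and those into task-level interpretations of the user's goals, with each level remaining traceable to the transcript below it.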

