
Extracting usability information from user interface events

Published: 01 December 2000

Abstract

Modern window-based user interface systems generate user interface events as natural products of their normal operation. Because such events can be automatically captured and because they indicate user behavior with respect to an application's user interface, they have long been regarded as a potentially fruitful source of information regarding application usage and usability. However, because user interface events are typically voluminous and rich in detail, automated support is generally required to extract information at a level of abstraction that is useful to investigators interested in analyzing application usage or evaluating usability. This survey examines computer-aided techniques used by HCI practitioners and researchers to extract usability-related information from user interface events. A framework is presented to help HCI practitioners and researchers categorize and compare the approaches that have been, or might fruitfully be, applied to this problem. Because many of the techniques in the research literature have not been evaluated in practice, this survey provides a conceptual evaluation to help identify some of the relative merits and drawbacks of the various classes of approaches. Ideas for future research in this area are also presented. This survey addresses the following questions: How might user interface events be used in evaluating usability? How are user interface events related to other forms of usability data? What are the key challenges faced by investigators wishing to exploit this data? What approaches have been brought to bear on this problem and how do they compare to one another? What are some of the important open research questions in this area?
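To make the abstraction problem the survey describes concrete, here is a minimal, hypothetical sketch of transforming voluminous low-level UI events into higher-level actions. The event names, widget identifiers, and mapping rules are all invented for illustration; they do not come from the survey itself.

```python
# Hypothetical sketch: collapsing low-level UI events into higher-level
# abstract actions. Event types, widget names, and rules are invented.

from typing import List, Tuple

# A raw event is (event_type, widget_id), e.g. ("key_press", "name_field").
RawEvent = Tuple[str, str]

def abstract_events(events: List[RawEvent]) -> List[str]:
    """Collapse consecutive low-level events on the same widget into one
    abstract action such as 'edit name_field' or 'click ok_button'."""
    actions: List[str] = []
    i = 0
    while i < len(events):
        etype, widget = events[i]
        # Consume the whole run of consecutive events on this widget.
        j = i
        while j < len(events) and events[j][1] == widget:
            j += 1
        if etype == "key_press":
            actions.append(f"edit {widget}")
        elif etype == "mouse_click":
            actions.append(f"click {widget}")
        else:
            actions.append(f"use {widget}")
        i = j
    return actions

if __name__ == "__main__":
    log = [
        ("key_press", "name_field"),
        ("key_press", "name_field"),
        ("key_press", "name_field"),
        ("mouse_click", "ok_button"),
    ]
    print(abstract_events(log))  # ['edit name_field', 'click ok_button']
```

Real systems surveyed in the article face a far harder version of this mapping, since the correspondence between keystroke-level events and meaningful tasks is rarely this direct.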



Reviews

Dara Lee Howard

This survey and its resulting framework provide a useful compilation of information to aid designers and students of software interfaces. The framework categorizes, compares, and evaluates the relative strengths and limitations of approaches to capturing and analyzing the event data that window-based user interface systems generate as natural products of their normal operation. Synchronization and searching, transformation, counts and summary statistics, sequence detection, sequence comparison, sequence characterization, visualization, and integrated support constitute the elements of the comparison framework. Each element is briefly described, examples are given, related work is identified where appropriate, and strengths and limitations are pointed out. The authors note that "Synch and search techniques are among the most mature technologies for exploiting UI event data in usability evaluations" (p. 414). They found that very few approaches support transformation, a subprocess critical to the meaningful use of extracted usability information. In addition, they report that the more compelling techniques (e.g., exploratory sequential data analysis and Markov- and grammar-based sequence characterization) require the most human intervention, whereas the most automated techniques tend to be less compelling and rest on more unrealistic assumptions. Future work is recommended on capturing and evaluating event data in software testing and debugging, on mapping between lower- and higher-level events of interest, on automated discovery and validation of patterns in large collections of event data, and on applications such as identifying, diagnosing, and repairing breakdowns in complex system operations.
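Two of the framework's analysis classes named above lend themselves to a brief sketch: counts/summary statistics and sequence detection. The following is a hypothetical illustration only; the log format and the pattern searched for (a repeated edit-then-undo cycle, which might suggest a usability problem) are invented, not drawn from the survey.

```python
# Hypothetical sketch of two analysis classes from the framework:
# counts/summary statistics and simple sequence detection over an
# event log. Log contents and the target pattern are invented.

from collections import Counter
from typing import List

def event_counts(log: List[str]) -> Counter:
    """Counts and summary statistics: how often does each event occur?"""
    return Counter(log)

def detect_sequence(log: List[str], pattern: List[str]) -> List[int]:
    """Sequence detection: start indices where the target subsequence
    occurs contiguously in the log."""
    n = len(pattern)
    return [i for i in range(len(log) - n + 1) if log[i:i + n] == pattern]

if __name__ == "__main__":
    log = ["open", "edit", "undo", "edit", "undo", "edit", "save"]
    print(event_counts(log).most_common(1))       # [('edit', 3)]
    print(detect_sequence(log, ["edit", "undo"]))  # [1, 3]
```

The sequence characterization techniques the review singles out as more compelling (ESDA, Markov- and grammar-based models) go well beyond this kind of fixed-pattern matching, which is part of why they demand more human intervention.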

