Abstract
This paper presents the results of a study comparing three think-aloud methods: concurrent think-aloud, retrospective think-aloud, and a hybrid method. The methods were compared through an evaluation of a library website on four points of comparison: task performance, participants' experiences, usability problems discovered, and the cost of employing each method. The results revealed that the concurrent method outperformed both the retrospective and the hybrid methods in facilitating successful usability testing. It detected more usability problems than the retrospective method and produced output comparable to that of the hybrid method. The method received average to positive ratings from its users, and no reactivity was observed. Lastly, the concurrent method required much less time on the evaluator's part than the other two methods, which involved double the testing and analysis time.
Original language | English |
---|---|
Title of host publication | CHI '18 Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems |
Publisher | Association for Computing Machinery (ACM) |
Pages | 1-12 |
ISBN (Print) | 978-1-4503-5620-6 |
DOIs | |
Publication status | Published - 1 Apr 2018 |
Event | 2018 ACM Conference on Human Factors in Computing Systems (CHI'18): Engage with HCI - Palais des Congrès, Montreal, Canada. Duration: 21 Apr 2018 → 26 Apr 2018. https://chi2018.acm.org/ |
Conference
Conference | 2018 ACM Conference on Human Factors in Computing Systems (CHI'18) |
---|---|
Abbreviated title | CHI 2018 |
Country/Territory | Canada |
City | Montreal |
Period | 21/04/18 → 26/04/18 |
Internet address | https://chi2018.acm.org/ |
Keywords
- usability testing
- user studies
- user experiences
- think-aloud protocols
- human-computer interaction