Effects of damping head movement and facial expression in dyadic conversation using real-time facial expression tracking and synthesized avatars

Steven M. Boker, Jeffrey F. Cohn, Barry-John Theobald, Iain Matthews, Timothy R. Brick, Jeffrey R. Spies

Research output: Contribution to journal › Article › peer-review

58 Citations (Scopus)


When people speak with one another, they tend to adapt their head movements and facial expressions in response to each other's head movements and facial expressions. We present an experiment in which confederates' head movements and facial expressions were motion tracked during videoconference conversations, an avatar face was reconstructed in real time, and naive participants spoke with the avatar face. No naive participant guessed that the computer-generated face was not video. Confederates' facial expressions, vocal inflections and head movements were attenuated at 1 min intervals in a fully crossed experimental design. Attenuated head movements led to increased head nods and lateral head turns, and attenuated facial expressions led to increased head nodding in both naive participants and confederates. Together, these results are consistent with a hypothesis that the dynamics of head movements in dyadic conversation include a shared equilibrium. Although both conversational partners were blind to the manipulation, when apparent head movement of one conversant was attenuated …
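The attenuation manipulation described above can be illustrated with a minimal sketch: tracked motion parameters (e.g. a head-pose vector) are damped by scaling their deviation from a neutral configuration before the avatar is rendered. The function name `attenuate`, the `gain` parameter, and the pitch/yaw/roll example are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def attenuate(params, neutral, gain):
    """Scale deviations from a neutral configuration.

    gain = 1.0 reproduces the tracked motion unchanged;
    gain < 1.0 damps movement toward the neutral pose.
    (Illustrative sketch, not the paper's actual pipeline.)
    """
    params = np.asarray(params, dtype=float)
    neutral = np.asarray(neutral, dtype=float)
    return neutral + gain * (params - neutral)

# Example: damp a hypothetical head-pose vector (pitch, yaw, roll
# in degrees) to half its tracked amplitude.
pose = [10.0, -6.0, 2.0]
damped = attenuate(pose, [0.0, 0.0, 0.0], 0.5)
print(damped)  # each component halved: [5., -3., 1.]
```

With `gain` switched between conditions at fixed intervals, the same rendering path serves both the attenuated and unattenuated minutes of conversation, keeping the manipulation invisible to both partners.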
Original language: English
Pages (from-to): 3485-3495
Number of pages: 11
Journal: Philosophical Transactions of the Royal Society B
Publication status: Published - 2009