Synthetic animation of deaf signing gestures

Research output: Chapter in Book/Report/Conference proceeding › Chapter

29 Citations (Scopus)

Abstract

We describe a method for automatically synthesizing deaf signing animations from a high-level description of signs in terms of the HamNoSys transcription system. Lifelike movement is achieved by combining a simple control model of hand movement with inverse kinematic calculations for placement of the arms. The realism can be further enhanced by mixing the synthesized animation with motion capture data for the spine and neck, to add natural "ambient motion".
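The abstract mentions inverse kinematic calculations for placing the arms. As a rough illustration of that idea (not the paper's actual implementation, which animates a full 3-D avatar), the sketch below solves the classic planar two-link IK problem: given a wrist target, it recovers shoulder and elbow angles via the law of cosines. All names and the 2-D simplification are assumptions for exposition.

```python
import math

def two_link_ik(x, y, l1, l2):
    """Planar two-link inverse kinematics (illustrative sketch only).

    Given a wrist target (x, y) and limb lengths l1 (upper arm) and
    l2 (forearm), return (shoulder, elbow) joint angles in radians.
    """
    d2 = x * x + y * y
    d = math.sqrt(d2)
    if d > l1 + l2 or d < abs(l1 - l2):
        raise ValueError("target out of reach")
    # Elbow angle from the law of cosines on the shoulder-elbow-wrist triangle.
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder angle: direction to the target, corrected for elbow bend.
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

A forward-kinematics check confirms a solution: with unit limb lengths and target (1, 1), the solver yields a right-angle elbow, and recomputing the wrist position from the returned angles lands back on the target.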
Original language: English
Title of host publication: Gesture and Sign Language in Human-Computer Interaction
Editors: Ipke Wachsmuth, Timo Sowa
Publisher: Springer Berlin / Heidelberg
Pages: 146-157
Number of pages: 12
Volume: 2298
ISBN (Print): 978-3-540-43678-2
DOIs
Publication status: Published - 2001
Event: International Gesture Workshop - London, United Kingdom
Duration: 18 Apr 2001 - 20 Apr 2001

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer Berlin / Heidelberg

Workshop

Workshop: International Gesture Workshop
Country/Territory: United Kingdom
City: London
Period: 18/04/01 - 20/04/01
