Published in

2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)

DOI: 10.1109/icassp.2011.5947686

Associating children's non-verbal and verbal behaviour: Body movements, emotions, and laughter in a human-robot interaction

Proceedings article published in 2011 by Anton Batliner, Stefan Steidl, and Elmar Nöth
This paper is available in a repository.


Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving forbidden
Data provided by SHERPA/RoMEO

Abstract

In this article, we associate different types of vocal behaviour denoting emotional user states and laughter with different types of body movements such as gestures, forward bends, or liveliness. Our subjects are German children giving commands to Sony's Aibo robot; the data are fully realistic. The analysis reveals characteristic and significant co-occurrences of body movements and vocal events.