Taylor and Francis Group, Clinical Linguistics & Phonetics, 29(8-10), pp. 748-763, 2015
DOI: 10.3109/02699206.2015.1036462
A significant body of evidence from cross-linguistic and developmental studies converges to suggest that co-speech iconic gesture mirrors language. This paper aims to identify whether gesture reflects impaired spoken language in a similar way. Twenty-nine people with aphasia (PWA) and 29 neurologically healthy control participants (NHPs) produced a narrative discourse, retelling the story of a cartoon video. Gesture and language were analysed in terms of semantic content and structure for two key motion events. The aphasic data showed an influence of lexical choices on gesture but no corresponding clausal influence. Both groups produced gestures that matched the semantics of the spoken language and gestures that did not, although one particular gesture-language mismatch (semantically "light" verbs paired with semantically richer gesture) typified the PWA narratives. These results indicate that gesture both closely reflects spoken language impairment and can compensate for it.