Published in

Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 55(1), pp. 1165-1169

DOI: 10.1177/1071181311551243

Perceptions of Temporal Synchrony in Multimodal Displays

Journal article published in 2011 by Wayne Giang, Catherine Marie Burns, and Ehsan Masnavi
This paper is available in a repository.

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving forbidden
Data provided by SHERPA/RoMEO

Abstract

Current multimodal interfaces make use of several intra-modal perceptual judgements that help users “directly perceive” information. These judgements help users organize and group information with little cognitive effort. Cross-modal perceptual relationships are much less commonly used in multimodal interfaces, but could also provide processing advantages for grouping and understanding data across different modalities. In this paper we examine whether individuals are able to directly perceive cross-modal auditory and tactile temporal rate synchrony events. If direct perception is possible, then we would expect that individuals would be able to correctly make these judgements with very little cognitive effort. Our results indicate that individuals have difficulty identifying when the temporal rates of auditory and tactile stimuli in a monitoring task are synchronous. Changes in workload, manipulated using a secondary visual task, resulted in changes in performance in the temporal synchrony task. We concluded that temporal rate synchrony is not a perceptual relationship that allows for direct perception, but further investigation of cross-modal perceptual relationships is required.