Published in

Frontiers in Psychology, vol. 13, Frontiers Media, 2022

DOI: 10.3389/fpsyg.2022.973164

Musical expertise shapes visual-melodic memory integration

Journal article published in 2022 by Martina Hoffmann, Alexander Schmidt, and Christoph J. Ploner
This paper is made freely available by the publisher.

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving allowed
Data provided by SHERPA/RoMEO

Abstract

Music can act as a mnemonic device, eliciting multiple memories. How musical and non-musical information is integrated into complex cross-modal memory representations has, however, rarely been investigated. Here, we studied the ability of human subjects to associate visual objects with melodies. Musical laypersons and professional musicians performed an associative inference task that tested the ability to form and memorize paired associations between objects and melodies (“direct trials”) and to integrate these pairs into more complex representations in which melodies are linked with two objects across trials (“indirect trials”). We further investigated whether and how musical expertise modulates these two processes. We analyzed accuracy and reaction times (RTs) of direct and indirect trials in both groups. We reasoned that the musical and cross-modal memory demands of musicianship might modulate performance in the task and might thus reveal mechanisms that underlie the association and integration of visual information with musical information. Although musicians showed higher overall memory accuracy, non-musicians’ accuracy was well above chance level in both trial types, indicating a significant ability to associate and integrate musical with visual information even in musically untrained subjects. However, non-musicians showed shorter RTs in indirect compared with direct trials, whereas the reverse pattern was found in musicians. Moreover, accuracy on direct and indirect trials correlated significantly in musicians but not in non-musicians. Consistent with previous accounts of visual associative memory, we interpret these findings as suggestive of at least two complementary mechanisms that contribute to visual-melodic memory integration. (I) A default mechanism that mainly operates at encoding of complex visual-melodic associations and that works with surprising efficacy even in musically untrained subjects.
(II) A retrieval-based mechanism that critically depends on an expert ability to maintain and discriminate visual-melodic associations across extended memory delays. Future studies may investigate how these mechanisms contribute to the everyday experience of music-evoked memories.