Published in

Public Library of Science, PLoS ONE, 18(12), p. e0295033, 2023

DOI: 10.1371/journal.pone.0295033

The Jena Eyewitness Research Stimuli (JERS): A database of mock theft videos involving two perpetrators, presented in 2D and VR formats with corresponding 2D and 3D lineup images

Journal article published in 2023 by Ulrike Kruse and Stefan R. Schweinberger.
This paper is made freely available by the publisher.

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving allowed
Data provided by SHERPA/RoMEO

Abstract

Empirical investigations into eyewitness identification accuracy typically necessitate the creation of novel stimulus materials, which can be a challenging and time-consuming task. To facilitate this process and promote further research in this domain, we introduce the new Jena Eyewitness Research Stimuli (JERS). They comprise six video sequences depicting a mock theft committed by two different perpetrators, available in both two-dimensional (2D) and 360° formats, together with the corresponding lineup images presented in 2D or three-dimensional (3D) format. Images of one suspect and eight fillers are available for each lineup. We evaluated lineup fairness using the mock eyewitness paradigm and obtained a Tredoux's E of 4.687 for Perpetrator 1 and 5.406 for Perpetrator 2. Moreover, no bias towards the perpetrators was observed in the lineups. We incorporated 360° videos and 3D lineup images to encourage the adoption of innovative data formats in experimental investigations of eyewitness accuracy. In particular, compatibility with Virtual Reality (VR) makes JERS a promising tool for advancing eyewitness research by enabling researchers to construct controlled environments that offer observers an immersive experience. JERS is freely accessible for academic purposes via the Open Science Framework (OSF).
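For readers unfamiliar with the fairness statistic reported above: Tredoux's E is the "effective size" of a lineup, commonly computed as E = 1 / Σ p_i², where p_i is the proportion of mock witnesses who chose lineup member i. The sketch below illustrates that calculation; the choice counts are hypothetical and are not the JERS mock-witness data.

```python
def tredoux_e(counts):
    """Effective lineup size (Tredoux's E): 1 / sum of squared choice proportions.

    `counts` holds the number of mock witnesses who selected each lineup
    member. A perfectly fair k-member lineup yields E = k; heavy clustering
    on one member pushes E toward 1.
    """
    total = sum(counts)
    proportions = [c / total for c in counts]
    return 1.0 / sum(p * p for p in proportions)

# Hypothetical 9-member lineup (1 suspect + 8 fillers, as in JERS):
counts = [12, 8, 7, 6, 5, 4, 3, 3, 2]
print(round(tredoux_e(counts), 3))

# Sanity check: uniform choices across 9 members give E = 9.
print(round(tredoux_e([5] * 9), 3))
```

Values such as the reported 4.687 and 5.406 can thus be read against the nominal lineup size of nine: the closer E is to nine, the more evenly the fillers draw choices from mock witnesses.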