Published in

Journal of Medical Internet Research (JMIR Publications), vol. 25, e45764, 2023

DOI: 10.2196/45764

Evaluating the Effects of Rewards and Schedule Length on Response Rates to Ecological Momentary Assessment Surveys: Randomized Controlled Trials

This paper is made freely available by the publisher.

Self-archiving policy (data provided by SHERPA/RoMEO)

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving allowed

Abstract

Background: Ecological momentary assessments (EMAs) are short, repeated surveys designed to collect information on experiences in real-time, real-life contexts. Embedding periodic bursts of EMAs within cohort studies enables the study of experiences on multiple timescales and could greatly enhance the accuracy of self-reported information. However, the burden on participants may be high and should be minimized to optimize EMA response rates.

Objective: We aimed to evaluate the effects of study design features on EMA response rates.

Methods: Embedded within an ongoing cohort study (Health@NUS), 3 bursts of EMAs were implemented over a 7-month period (April to October 2021). The response rate (percentage of completed EMA surveys out of all sent EMA surveys; 30-42 individual EMA surveys sent per burst) was examined for each burst. Following a low response rate in burst 1, the implementation strategy was changed for subsequent bursts (SMS text message announcements instead of emails). In addition, 2 consecutive randomized controlled trials were conducted to evaluate the efficacy of 4 different reward structures (with fixed and bonus components) and 2 different schedule lengths (7 or 14 days) on changes to the EMA response rate. Analyses were conducted from 2021 to 2022 using ANOVA and analysis of covariance to examine group differences, and mixed models to assess changes across all 3 bursts.

Results: Participants (N=384) were university students (n=232, 60.4% female; mean age 23, SD 1.3 years) in Singapore. Changing the reward structure did not significantly change the response rate (F3,380=1.75; P=.16). Changing the schedule length did significantly change the response rate (F1,382=6.23; P=.01): the response rate was higher for the longer schedule (14 days; mean 48.34%, SD 33.17%) than for the shorter schedule (7 days; mean 38.52%, SD 33.44%). The average response rate was higher in bursts 2 and 3 (mean 50.56%, SD 33.61% and mean 48.34%, SD 33.17%, respectively) than in burst 1 (mean 25.78%, SD 30.12%), and the difference was statistically significant (F2,766=93.83; P<.001).

Conclusions: Small changes to the implementation strategy (SMS text messages instead of emails) may have contributed to the increase in the response rate over time. Changing the available rewards did not lead to a significant difference in the response rate, whereas changing the schedule length did. Our study provides novel insights into how to implement EMA surveys in ongoing cohort studies. This knowledge is essential for conducting high-quality studies using EMA surveys.

Trial Registration: ClinicalTrials.gov NCT05154227; https://clinicaltrials.gov/ct2/show/NCT05154227
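The group comparison described in the methods (a one-way ANOVA on per-participant response rates across the 4 reward-structure arms) can be sketched as below. This is a minimal illustration only: the response-rate values are simulated placeholders with roughly the reported group sizes and variability, not the study's data, and the actual analysis also included covariate adjustment and mixed models not shown here.

```python
# Hypothetical sketch of a one-way ANOVA comparing mean EMA response
# rates (%) across 4 reward-structure arms, as described in the methods.
# All data below are simulated placeholders, not data from the study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulate per-participant response rates (%) for 4 arms of 96
# participants each (384 total, matching the trial's sample size).
arms = [rng.normal(loc=45, scale=33, size=96).clip(0, 100) for _ in range(4)]

# One-way ANOVA: does the mean response rate differ across arms?
# With 4 groups and 384 participants, df are (3, 380), as in the paper.
f_stat, p_value = stats.f_oneway(*arms)
print(f"F(3,380) = {f_stat:.2f}, p = {p_value:.3f}")
```

With simulated data drawn from a common distribution, the test will typically (and correctly) find no group difference, mirroring the null result reported for the reward-structure comparison.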