BioMed Central, Implementation Science, 9:113, 2014
DOI: 10.1186/s13012-014-0113-0
Background: Process evaluation is vital for understanding how interventions function in different settings, including if and why they have different effects or do not work at all. This is particularly important in trials of complex interventions in ‘real world’ organisational settings, where causality is difficult to determine. Complexity presents challenges for process evaluation, and process evaluations that tackle complexity are rarely reported. This paper presents the detailed protocol for a process evaluation embedded in a randomised trial of a complex intervention known as SPIRIT (Supporting Policy In health with Research: an Intervention Trial). SPIRIT aims to build capacity for using research in health policy and program agencies.

Methods: We describe the flexible and pragmatic methods used for capturing, managing and analysing data across three domains: (a) the intervention as it was implemented; (b) how people participated in and responded to the intervention; and (c) the contextual characteristics that mediated this relationship and may influence outcomes. Qualitative and quantitative data collection methods include purposively sampled semi-structured interviews at two time points, direct observation and coding of intervention activities, and participant feedback forms. We provide examples of the data collection and data management tools developed.

Discussion: This protocol provides a worked example of how to embed process evaluation in the design and evaluation of a complex intervention trial. It tackles complexity in the intervention and its implementation settings. To our knowledge, it is the only detailed example of the methods for a process evaluation of an intervention conducted as part of a randomised trial in policy organisations. We identify strengths and weaknesses, and discuss how the methods are functioning during early implementation. Using ‘insider’ consultation to develop methods is enabling us to optimise data collection while minimising discomfort and burden for participants. Embedding the process evaluation within the trial design is facilitating access to data, but may impair participants’ willingness to talk openly in interviews. While it is challenging to evaluate the process of conducting a randomised trial of a complex intervention, our experience so far suggests that it is feasible and can add considerably to the knowledge generated.

Keywords: Process evaluation, Complex intervention, Implementation, Knowledge exchange, Health policy, Organisational change, Capacity building, Qualitative methods, Developmental evaluation, Framework analysis

Funding: SPIRIT is funded as part of the Centre for Informing Policy in Health with Evidence from Research (CIPHER), an Australian National Health and Medical Research Council Centre for Research Excellence (APP1001436) and administered by the University of Western Sydney. The Sax Institute receives a grant from the NSW Ministry of Health. The Australasian Cochrane Centre is funded by the Australian Government through the National Health and Medical Research Council (NHMRC). SC is supported by an NHMRC Career Development Fellowship (1032963). DO holds an NHMRC Public Health Fellowship (606726). DO is an Associate Editor of Implementation Science; all editorial decisions regarding this manuscript were made by another editor.