Published in

Cambridge University Press, International Journal of Technology Assessment in Health Care, 27(2), pp. 159-168, 2011

DOI: 10.1017/s0266462311000018

Coverage with evidence development: The Ontario experience

This paper was not found in any repository, but could be made available legally by the author.

Full text: Unavailable

Preprint: archiving allowed
Postprint: archiving forbidden
Published version: archiving forbidden
Data provided by SHERPA/RoMEO

Abstract

Background: For non-drug technologies, there is often residual uncertainty following systematic review, mainly due to inadequate evidence of efficacy. Unwillingness to make decisions in the presence of uncertainty may lead to passive diffusion and intuitive decision making, with or without public pressure, and may affect health system sustainability. There is increasing interest in post-market evaluation through processes that include coverage with evidence development (CED) to address residual uncertainty regarding effectiveness and cost-effectiveness. Global experience with CED has been slow to develop despite its potential contribution to decision making.

Methods: Ontario's field evaluation program, designed to better inform decision making, represents a collaboration among physicians, policy decision makers, and academic centers. We report results of the first ten CEDs from this program to assess whether they achieved their objective of influencing policy by addressing residual uncertainty following systematic review.

Results: Since 2003, nineteen field evaluation studies to resolve residual uncertainty following systematic review have been completed, ten of which met the criteria for CED and are the focus of this report. Three of the CEDs involved more than one patient subgroup or intervention, providing the basis for evaluating thirteen outcomes. In each case, the CED addressed the uncertainty and led to a decision based on the systematic review and the CED result. The CEDs led to adoption of the technology in six instances, modified adoption in three instances, and withdrawal in four instances.

Conclusions: CED makes an important contribution to translating evidence into decision making. Methodologies are needed to increase the scope and reduce the timelines of CEDs, such as the use of linked, comprehensive, and robust data sets and collaborative studies with other jurisdictions. Health systems should increasingly fund CED before making long-term funding decisions, especially where there is uncertainty about effectiveness, safety, or cost-effectiveness.