Published in

Wiley Open Access, Ecology and Evolution, 5(19), pp. 4451-4454, 2015

DOI: 10.1002/ece3.1722

Poor methodological detail precludes experimental repeatability and hampers synthesis in ecology

Journal article published in 2015 by Neal R. Haddaway ORCID, Jos T. A. Verhoeven
This paper is made freely available by the publisher.

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving allowed
Data provided by SHERPA/RoMEO

Abstract

Despite the scientific method's central tenets of reproducibility (the ability to obtain similar results when repeated) and repeatability (the ability to replicate an experiment based on methods described), published ecological research continues to fail to provide sufficient methodological detail to allow either repeatability or verification. Recent systematic reviews highlight the problem, with one example demonstrating that an average of 13% of studies per year (±8.0 [SD]) failed to report sample sizes. The problem affects the ability to verify the accuracy of any analysis, to repeat the methods used, and to assimilate the study findings into powerful and useful meta-analyses. The problem is common across the ecological topics examined to date, and despite previous calls for improved reporting and metadata archiving, which could indirectly alleviate the problem, there is no indication of an improvement in reporting standards over time. Here, we call on authors, editors, and peer reviewers to treat repeatability as a top priority when evaluating research manuscripts, bearing in mind that legacy and integration into the evidence base can drastically improve the impact of individual research reports.